CSAM scanning references quietly removed from apple.com

Previously, the company’s child safety microsite described its plans to scan iPhones for child sexual abuse material (CSAM), alongside the communication safety features in Messages and the warnings shown when someone searches for CSAM. The section on CSAM scanning has since been removed. Some speculate this means Apple has abandoned the plan and simply wants everyone to forget about it. However, while I acknowledged that possibility back in September, I also explained why I don’t think it’s likely.

I can see two plausible ways forward for Apple. The first is to continue postponing the launch indefinitely. That way it avoids reigniting the opposition of civil liberties groups, while also not upsetting child protection groups with a U-turn. Whenever it is asked, it can simply say it is still working on additional safeguards, and hope that the questions eventually die down.

I think that may work for a while, but not forever. At some point, child safety groups are going to stand up and demand to know when the feature will launch. Apple could hardly reach iOS 16, for example, without either shipping the feature or abandoning the idea, so it is unlikely to get away with stalling in the long run.

The second, and in my view better, option is to announce a Facebook-style independent oversight board. The board’s mission would be to approve the contents of every CSAM database Apple uses around the world. The smart move would be for Apple to invite critics of the CSAM plan onto the panel, such as cryptography researcher Matthew Green.

I’m not sure Apple will take my advice, but I also don’t think removing the idea from its website means CSAM scanning is dead. More likely, the company simply wants more time to work out its options.

Apple has quietly removed references from its website to its controversial plan to scan users’ iPhones for child sexual abuse material (CSAM).

The removal of the references was first reported by MacRumors. The program was put on hold after the technology giant received a wave of criticism from security experts and privacy advocates over the features, which included scanning users’ photos and messages for child sexual abuse material.

Essentially, Apple said that a user’s phone would scan photos synced to iCloud Photos for known images of child sexual abuse. In addition, the company said it would check the Messages of users under the age of 18 for sexually explicit content, and would warn parents if such content was viewed on the devices of users under 13.
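To make the “known images” part concrete: at a very high level, detecting known material means comparing a fingerprint (hash) of each photo against a database of fingerprints of already-identified images. The Swift sketch below is purely illustrative and hypothetical; it uses an exact SHA-256 digest and a plain in-memory set, whereas the system Apple described relied on a perceptual “NeuralHash” and a cryptographic on-device matching protocol, none of which is modelled here. The function and database names are invented for the example.

    import Foundation
    import CryptoKit

    // Illustrative sketch only: exact-hash matching against a known database.
    // Apple's described system used perceptual hashing (NeuralHash) and a
    // cryptographic matching protocol; this simplified example does not model that.

    /// Hypothetical loader for a vetted database of known-image digests
    /// (hex-encoded SHA-256 strings). In practice such a database would be
    /// supplied and vetted externally, not bundled like this.
    func loadKnownImageDigests() -> Set<String> {
        return []  // empty placeholder
    }

    /// Returns true if the photo's digest appears in the known-image database.
    func isKnownImage(_ photoData: Data, knownDigests: Set<String>) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownDigests.contains(hex)
    }

    // Example usage: check a batch of photos before they are synced.
    let knownDigests = loadKnownImageDigests()
    let photosToSync: [Data] = []  // placeholder for image data awaiting upload
    let flagged = photosToSync.filter { isKnownImage($0, knownDigests: knownDigests) }
    print("Flagged \(flagged.count) of \(photosToSync.count) photos")

The point of the simplification is only to show the shape of the idea: the matching happens against fingerprints of known images, not by analysing the content of every photo from scratch.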

Concerns were immediately raised about how the technology could be misused in the future and how it could affect children who do not have a good relationship with their parents. A coalition of more than 90 organizations told Apple that the technology could be used to “threaten the safety and security of people around the world, and have catastrophic consequences for many children.”
