Apple Removes CSAM Detection Mention from Child Safety Page, but the Code Is Still in iOS

Apple has removed all references to its controversial child sexual abuse material (CSAM) detection feature, first announced in August, from its child safety webpage. The change, spotted by MacRumors, appears to have occurred between December 10th and December 13th. Despite the removal, the company says its plans for the feature have not changed.

Two of the three safety features, which debuted with iOS 15.2 earlier this week, are still present on the page, titled “Expanded Protections for Children.” However, references to the more controversial CSAM detection, whose launch was delayed following backlash from privacy advocates, have been removed.

When reached for comment, Apple spokesperson Shane Bauer said the company’s position hasn’t changed since September, when it first announced it would be delaying the launch of CSAM detection. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company’s September statement read.

Notably, Apple’s statement does not say the feature has been canceled entirely, and documents explaining how it works are still available on Apple’s website. The CSAM detection feature was controversial when it was announced because it involves taking hashes of iCloud Photos and comparing them against a database of hashes of known child sexual abuse imagery. Apple says this approach lets it report users to the authorities if they are known to be uploading child abuse imagery, without compromising the privacy of its customers more generally. It also says that encryption of user data is unaffected and that the analysis runs on-device.
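
To make the mechanism concrete, here is a minimal Swift sketch of what on-device hash matching can look like. It is an illustration under heavy simplifying assumptions, not Apple’s implementation: the real system uses a perceptual NeuralHash and private set intersection rather than a plain SHA-256 digest checked against a local plaintext list, and every name below is hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Apple's actual design uses a perceptual
// "NeuralHash" plus private set intersection, so neither the device nor
// the server learns anything about non-matching photos. A plain SHA-256
// digest and a local set are used here purely to show the matching idea.

// Hypothetical stand-in for the database of hashes of known abuse imagery.
let knownImageHashes: Set<String> = [
    // hex digests would be distributed with the OS (illustrative only)
]

/// Hex-encode the SHA-256 digest of a photo's raw bytes.
func digest(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// On-device check run before upload. Apple's published design only
/// flags an account after a threshold number of matches is crossed,
/// to limit the impact of false positives.
func photoMatchesKnownHash(_ photoData: Data) -> Bool {
    knownImageHashes.contains(digest(of: photoData))
}
```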

But critics argue that Apple’s system risks undermining the company’s end-to-end encryption. Some referred to it as a “backdoor” that governments around the world could strong-arm Apple into expanding to cover content beyond CSAM. For its part, Apple has said it will “not accede to any government’s request to expand” the feature beyond CSAM.

While the CSAM detection feature has yet to receive a new launch date, Apple has gone ahead and released the other two child safety features it announced in August. One is designed to warn children when they receive images containing nudity in Messages, while the other surfaces additional resources when users search for terms related to child exploitation through Siri, Spotlight, or Safari Search. Both rolled out with iOS 15.2, released earlier this week, which appears to be what prompted Apple to update its webpage.
