A new warning has been issued for Apple’s more than a billion iPhone and iPad users after privacy fears arose over the company’s child abuse detection program.
A new report says that basic flaws have been found in Apple’s child sexual abuse material (CSAM) detection system, which the company plans to roll out to all iPhones and iPads running iOS 15.
The detection system works by comparing hashes of images on users’ devices against databases of known abuse imagery provided by child safety organizations.
If a match is found, the authorities are notified.
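To illustrate the general idea, here is a minimal toy sketch of hash-based image matching. It uses a simplified “average hash” in place of Apple’s actual NeuralHash (which is computed by a neural network), models images as 8×8 grayscale grids, and invents a hypothetical one-entry database; none of these details come from Apple’s real system.

```python
# Toy sketch of perceptual-hash matching (NOT Apple's NeuralHash).
# Images are modeled as 8x8 grids of grayscale values (0-255).

def average_hash(pixels):
    """64-bit perceptual hash: each bit is 1 where the pixel exceeds the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known images (illustrative values only).
KNOWN_HASHES = {average_hash([[(i * 8 + j) % 256 for j in range(8)]
                              for i in range(8)])}

def flagged(pixels, threshold=4):
    """Report a match if the image hash is within `threshold` bits of any
    database hash, allowing for small re-encoding differences."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in KNOWN_HASHES)
```

A perceptual hash is used instead of a cryptographic one so that minor changes (re-compression, resizing) still land within the matching threshold; as the researchers showed, that tolerance is also what an evasion attack exploits.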
However, a team of researchers at Imperial College London has found that the entire CSAM detection system can be evaded by applying a filter to an image.
The filter changes the image’s hash, so the detection system no longer recognizes it. The researchers found that this evasion succeeded 99.9% of the time.
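The evasion idea can be sketched under the same toy average-hash model (again, an assumption standing in for the real NeuralHash, and not the researchers’ actual filter): a mild perturbation nudges each pixel slightly toward the opposite side of the image mean, flipping enough hash bits to push the image outside the matching threshold while barely changing how it looks.

```python
# Self-contained sketch: a small "filter" that changes a toy perceptual hash.

def average_hash(pixels):
    """64-bit toy hash: each bit is 1 where the pixel exceeds the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def evading_filter(pixels, strength=3):
    """Nudge each pixel by `strength` toward the opposite side of the mean.
    Pixels that sit close to the mean cross it, flipping their hash bits,
    while the visual change (a few grayscale levels) is barely perceptible."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [[p - strength if p > mean else p + strength for p in row]
            for row in pixels]
```

Because the matcher tolerates only a few bit flips, flipping even a handful of bits near the mean is enough to break the match, which is consistent with the researchers’ finding that small filters defeated detection almost every time.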
According to the researchers, Apple could increase the hash size to counter this, but that would raise the risk of false positives and encode more user data into the hashes, raising privacy concerns.
Given these flaws, it is unclear exactly when Apple will launch its CSAM detection program.
The company has postponed the launch of the program until at least 2022.
This is not the first time the program’s design has been called into question, according to Forbes.
Edward Snowden came out against the CSAM plans over the summer, arguing that users’ devices would be constantly checking what belongs to them and what belongs to the authorities.
“I can’t think of any other company that has so proudly, and so publicly, distributed spyware to its own devices,” he said.
“There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well.”