Apple’s upcoming iOS 15.2 update may be its most significant iPhone update in years, and not because of new features or remarkable functions.
Despite countless warnings, it includes a surprising change of direction for Apple and the company’s billion-plus iPhone users.
You’ll recall the controversy surrounding Apple’s proposed client-side monitoring, which blew up ahead of the launch of the iPhone 13 and iOS 15. Apple backed down and the controversy subsided, although the company promised to consider the extensive feedback it had received and to come back, reaffirming its determination to press ahead regardless.
And so, here we are.
Apple’s child-safety proposals comprised two separate updates. The first would scan users’ photo libraries on their iPhones before they sync to iCloud, using AI to match user photos against government-sanctioned databases of known child sexual abuse imagery. The second would let parents enable Apple’s AI on their children’s iPhones to warn when nude or sexually explicit images are sent or received in iMessage.
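That first proposal is essentially fingerprint matching: a perceptual hash of each photo is compared against hashes of known illegal images, so near-duplicates still match even after small edits. Apple’s actual system (NeuralHash combined with cryptographic private set intersection) is far more sophisticated, but the general idea can be sketched with a toy average-hash; all of the images and thresholds below are invented for illustration:

```python
# Toy sketch of perceptual-hash matching against a blocklist of known images.
# NOT Apple's algorithm: real systems use a neural hash and cryptographic
# matching so the device never sees the plaintext blocklist.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the image mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_blocklist(photo_pixels, blocklist_hashes, max_distance=2):
    """Flag the photo if its hash is within max_distance of any known hash."""
    h = average_hash(photo_pixels)
    return any(hamming(h, known) <= max_distance for known in blocklist_hashes)

# Hypothetical 8x8 grayscale "images" flattened to 64 brightness values.
known_image = [10] * 32 + [200] * 32
blocklist = [average_hash(known_image)]

slightly_edited = [12] * 32 + [198] * 32  # near-duplicate of the known image
unrelated = [0, 255] * 32                 # very different image

print(matches_blocklist(slightly_edited, blocklist))  # True
print(matches_blocklist(unrelated, blocklist))        # False
```

The point of hashing near-duplicates to the same fingerprint, rather than comparing files byte-for-byte, is that re-encoding, resizing, or lightly editing a known image should not defeat the match.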
Both proposals raised serious warnings about privacy and security implications. On-device (client-side) scanning of your photo library breaks an implicit, if unwritten, covenant that your phone itself is not monitored in the way cloud storage can be. There is no real controversy over scanning cloud photo storage for illegal content; that has become the norm. The iMessage change is a much more serious matter, in that it undermines the secure end-to-end encryption on which iMessage is built.