iOS 15.2’s photo-scanning child safety feature isn’t scanning your photos (yet)
Back in August, Apple announced a pair of new features meant to protect children from predators: one aims to curb the use of Apple's technology and services to spread Child Sexual Abuse Material (CSAM), and the other to shield minors from unwanted sexual content.
The two features announced were CSAM scanning of iCloud Photos and a parental control feature that monitors sexual content in Messages. The CSAM scanning feature sparked so much controversy among privacy advocates that Apple eventually delayed the whole thing until later in the year while it worked on improving the system to better protect users' privacy.
The latest iOS 15.2 beta includes the second feature, but not the controversial first one. The CSAM photo-scanning feature would use on-device scanning of images in your Photos library that are uploaded to iCloud, checking them for matches against a database of known child sexual abuse imagery. The results of the scan are kept private, even from Apple, until a certain threshold is passed, at which point the National Center for Missing and Exploited Children (NCMEC) would be contacted.
Apple argued that on-device matching, with only secure hashes passing up to the iCloud server, is actually more private and secure than the in-the-cloud processing most of its competitors do for this sort of material. But privacy advocates warned that any system that scans photos on your device lays the groundwork for abuse by malicious actors or state agents.
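For a rough sense of how threshold-based matching works in principle, here is a minimal Swift sketch. Everything in it is a stand-in: the exact-match SHA-256 hash, the plain set of known hashes, and the simple counter are illustrative only, while Apple's actual design relies on perceptual hashing and cryptographic safety vouchers that this toy example does not attempt to reproduce.

```swift
import Foundation
import CryptoKit

/// Much-simplified, hypothetical sketch of threshold-based hash matching.
/// The hash function, database, and threshold below are placeholders.
struct PhotoHashMatcher {
    let knownHashes: Set<String>   // hypothetical database of known-image hashes
    let reportThreshold: Int       // matches required before anything becomes reportable
    var matchCount = 0

    /// Hash the image bytes and record whether they match the known database.
    mutating func scan(imageData: Data) {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        if knownHashes.contains(hex) {
            matchCount += 1
        }
    }

    /// Nothing is reportable until the number of matches crosses the threshold.
    var exceedsThreshold: Bool { matchCount >= reportThreshold }
}
```

The shape of the logic is the point the article describes: matches are tallied on the device itself, and nothing becomes visible to anyone else until the count passes the threshold.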
The new feature in iOS 15.2 beta 2, on the other hand, is the less controversial Communication Safety feature for Messages. It, too, relies on on-device scanning of images, but it doesn't match images against a known database, and it isn't active unless a parent account turns it on.
Once a parent account enables it for a child account, any images sent or received in Messages are scanned for nudity and sexual content. If such an image is received, it is blurred out, and the child is warned with a popup that presents resources for what to do next. The child can then choose whether or not to view the photo.
As originally conceived, the parents of a child under 13 would automatically be notified if such an image was viewed. Critics pointed out that this could put some children at risk, so Apple removed the automatic notification; instead, children of any age can message a trusted adult if they want to, separate from the decision to view the image.
This feature doesn't scan any images unless it is explicitly enabled by the parent account, and no information of any kind is sent to Apple or any other third party.
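To illustrate that opt-in flow, here is a hypothetical Swift sketch. The type names, settings, and classifier flag are invented for the example; the real on-device model and its Messages integration are not public API.

```swift
import Foundation

/// Hypothetical settings object; the real toggle lives in Screen Time.
struct ChildAccountSettings {
    var communicationSafetyEnabled: Bool   // turned on (or off) by the parent account
}

enum IncomingImageAction {
    case showNormally
    case blurAndWarn   // child can still view it, or message a trusted adult instead
}

/// Decide what to do with an incoming image, given an (assumed) on-device
/// classifier result. Nothing is sent to Apple or a third party either way.
func handleIncomingImage(flaggedAsSensitive: Bool,
                         settings: ChildAccountSettings) -> IncomingImageAction {
    // The scan result is only acted on if the parent has opted the child in.
    guard settings.communicationSafetyEnabled, flaggedAsSensitive else {
        return .showNormally
    }
    return .blurAndWarn
}
```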
But CSAM scanning is on the way. If Apple sticks to its schedule, it will reintroduce the photo-scanning feature in an upcoming beta (likely 15.3) and the controversy will start all over again. For now, though, the Communication Safety features in iOS 15.2 shouldn't scare parents and might help kids make better decisions—all without Apple actually scanning their photos.
I have written professionally about technology for my entire adult professional life – over 20 years. I like to figure out how complicated technology works and explain it in a way anyone can understand.