What you need to know
- Apple yesterday unveiled controversial new safety measures that include scanning photos uploaded to iCloud Photos.
- Privacy and security experts including Edward Snowden have blasted the plans.
- Snowden said Apple was "rolling out mass surveillance to the entire world".
Security experts and privacy advocates including NSA whistleblower Edward Snowden have slammed Apple's controversial new plans to scan iCloud photos for images of child abuse.
Apple announced a trio of new child safety features for its platforms yesterday. New communication tools will use on-device machine learning to scan Messages for sensitive content sent between children, while new Siri and Search updates will help children and parents "if they encounter unsafe situations" and will intervene when users try to search for topics related to Child Sexual Abuse Material (CSAM). Perhaps the biggest change, and the one proving the most divisive, is the plan to use new cryptography applications to detect CSAM images in iCloud Photos. From Apple:
To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.
Apple explains that the matching is done on-device so that no one, including Apple, can see the images themselves. The in-depth technical detail is very complex, but the overall premise is proving extremely controversial.
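At a very high level, the approach amounts to comparing fingerprints of a user's photos against a database of fingerprints of known CSAM images, and only flagging an account once enough matches accumulate. The sketch below is purely illustrative and is not Apple's implementation: the real system uses a perceptual hash (NeuralHash) plus cryptographic techniques such as private set intersection and threshold secret sharing, whereas this example uses a plain SHA-256 hash, invented database entries, and a made-up threshold value.

```python
import hashlib

# Hypothetical database of fingerprints of known images.
# (Illustrative values only; the real database holds NeuralHash-style
# perceptual hashes supplied by NCMEC, not SHA-256 digests.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Apple describes a match threshold before any account is flagged;
# the actual threshold value is an assumption here.
MATCH_THRESHOLD = 2

def scan_library(images: list[bytes]) -> bool:
    """Return True only if enough images match the known-hash database."""
    matches = sum(
        1 for img in images
        if hashlib.sha256(img).hexdigest() in KNOWN_HASHES
    )
    return matches >= MATCH_THRESHOLD
```

A single match does nothing in this sketch; only crossing the threshold triggers a flag, which mirrors Apple's stated design goal of keeping the false-positive rate for an account extremely low.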
Aside from his own thoughts above, Snowden shared a plethora of objections to this new policy from other experts and privacy advocates.
Ross Anderson, professor of security engineering at the University of Cambridge, told the Financial Times: "It is an absolutely appalling idea because it is going to lead to distributed bulk surveillance of . . . our phones and laptops." The same report cites another security researcher and privacy campaigner who said the move was a "huge and regressive step for individual privacy."
Cryptography professor Matthew Green, who initially leaked Apple's plans prior to the announcement, stated: "Why would Apple spend so much time and effort designing a system that is specifically designed to scan images that exist (in plaintext) only on your phone — if they didn't eventually plan to use it for data that you don't share in plaintext with Apple?"
There are some notable caveats, however. As expected, if a user disables iCloud Photos, Apple cannot scan images that aren't stored there, meaning anyone can theoretically "opt out" by choosing not to use the service. Apple also fervently states that the system is designed to protect user privacy and has less than a one in a trillion chance per year of incorrectly flagging a given account. As other commentators have noted, the fact this measure applies only to iCloud Photos makes it no different from other services already available: