Apple backed out of a controversial child protection feature and now we know why

Apple introduced new child safety protections to help detect known child sexual abuse material (CSAM) in August 2021. Little more than a year later, it backed down on its CSAM-scanning plans, and now we know why.

Part of Apple's child protection initiative was to identify known CSAM before it was uploaded to iCloud Photos, but that proved controversial among privacy advocates, who worried that it would set a precedent and that the technology could be misused. It appears that Apple ultimately agreed.

Apple is again being taken to task over its lack of CSAM protections in iCloud, which has prompted the company to explain its thinking: "scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit."

Privacy first

A child safety group known as Heat Initiative recently sent a letter to Apple CEO Tim Cook demanding that the company do more to protect children from abuse, according to a Wired report. Cook didn't respond, but Erik Neuenschwander, Apple's director of user privacy and child safety, did.

The response was also shared with Wired, giving us a better understanding of Apple's thinking behind its decision to ditch CSAM scanning, even if many misunderstood how the feature would actually have worked.

Alongside creating new threat vectors, Neuenschwander argues that scanning iCloud data "would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."

Apple ultimately chose privacy over trying to prevent known CSAM material from being stored on its servers, it seems. “We decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago,” Neuenschwander reportedly said in the response to Heat Initiative. “We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users.”

Heat Initiative chief Sarah Gardner called Apple's decision "disappointing." "Apple is one of the most successful companies in the world with an army of world-class engineers," Gardner wrote in a statement to Wired. "It is their responsibility to design a safe, privacy-forward environment that allows for the detection of known child sexual abuse images and videos. For as long as people can still share and store a known image of a child being raped in iCloud, we will demand that they do better."

Oliver Haslam
Contributor

Oliver Haslam has written about Apple and the wider technology business for more than a decade, with bylines on How-To Geek, PC Mag, iDownloadBlog, and many more. He has also been published in print for Macworld, including cover stories. At iMore, Oliver is involved in daily news coverage and, not being short of opinions, has been known to 'explain' those thoughts in more detail, too. Having grown up using PCs and spending far too much money on graphics cards and flashy RAM, Oliver switched to the Mac with a G5 iMac and hasn't looked back. Since then he's seen the growth of the smartphone world, backed by iPhone, and new product categories come and go. Current expertise includes iOS, macOS, streaming services, and pretty much anything that has a battery or plugs into a wall. Oliver also covers mobile gaming for iMore, with Apple Arcade a particular focus. He's been gaming since the Atari 2600 days and still struggles to comprehend the fact that he can play console-quality titles on his pocket computer.