More than 90 groups want Apple to drop plans to 'build surveillance capabilities' into devices

Craig Federighi at WWDC 2020 (Image credit: Apple)

What you need to know

  • Apple has received an open letter from more than 90 policy groups demanding that it drop plans to "build surveillance capabilities" into its devices.
  • The letter is in response to Apple's CSAM efforts, which involve matching on-device hashes of iCloud Photos images against a database of known CSAM hashes.

A group of more than 90 international policy groups has banded together to deliver an open letter to Apple CEO Tim Cook demanding that Apple ditch its plans to check iPhones and iPads for known CSAM content in iCloud Photos and to flag inappropriate photos sent to and from kids.

In an announcement headed "International Coalition Calls on Apple to Abandon Plan to Build Surveillance Capabilities into iPhones, iPads, and other Products," the groups appear to have two main issues with Apple's new child safety plans.

In particular:

  • The scan and alert feature in Messages could result in alerts that threaten the safety and wellbeing of some young people, and LGBTQ+ youths with unsympathetic parents are particularly at risk.
  • Once the CSAM hash scanning for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable.

First reported by Reuters, the letter can now be found in full online, but it's important to note that Apple has already addressed both of the issues raised here.

On the first issue, alerts will only be triggered by images sent and received via the Messages app; no text messages will trigger any kind of alert whatsoever. Apple also confirmed during press briefings that kids will be warned before any sort of parental notification is triggered. They'll need to expressly click through that warning to see the photo in question, having been told that their parents will be notified. Parents won't be notified of anything without the child's knowledge.
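To picture that click-through logic, here's a minimal Swift sketch of the flow as Apple described it. The type and function names are hypothetical illustrations, not Apple's actual Messages implementation:

```swift
import Foundation

// Hypothetical sketch of the warning flow described above — the enum,
// function, and closure names are illustrative, not Apple API.

enum ChildDecision {
    case declinedToView        // child backs out after seeing the warning
    case viewedAfterWarning    // child clicks through, knowing parents will be told
}

func handleFlaggedImage(decision: ChildDecision, notifyParents: () -> Void) {
    switch decision {
    case .declinedToView:
        // The image stays hidden and no notification is sent.
        break
    case .viewedAfterWarning:
        // Notification fires only after the child's explicit choice.
        notifyParents()
    }
}

// Example: the child clicks through the warning.
handleFlaggedImage(decision: .viewedAfterWarning) {
    print("Parent notified — only after the child chose to view the image")
}
```

The point of the design, per Apple's briefings, is that the notification path simply cannot run before the warning step completes.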

On the second issue, Apple has repeatedly said that it will not be swayed by governments and law enforcement if and when demands are made to use the CSAM detection system to detect other types of material. Apple also points to the fact that the hashes against which iCloud Photos are matched come only from known child protection agencies. What's more, all of this is auditable, says Apple.
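To make the hash-matching step concrete, here's a minimal Swift sketch of the general idea: count matches against a known-hash set and only act once a threshold is crossed. To be clear, this is a toy illustration, not Apple's system. Apple's actual design uses its NeuralHash perceptual hash and a cryptographic private set intersection protocol, neither of which is reproduced here, and every name and value below (including the threshold) is hypothetical:

```swift
import Foundation
import CryptoKit

// Toy sketch of threshold-based hash matching — NOT Apple's NeuralHash
// or its private set intersection protocol. All names and values are
// hypothetical.

/// Stand-in for a perceptual hash. A real perceptual hash maps visually
/// similar images to the same digest; SHA-256 of the raw bytes (used
/// here for brevity) does not.
func imageHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Counts how many photos match the known-hash database. In the design
/// Apple described, nothing is reported until a match threshold is
/// crossed; the threshold below is a placeholder.
func flaggedCount(photos: [Data], knownHashes: Set<String>) -> Int {
    photos.filter { knownHashes.contains(imageHash(of: $0)) }.count
}

// Usage with placeholder data:
let knownHashes: Set<String> = []   // hashes supplied by child protection agencies
let library: [Data] = []            // the user's iCloud Photos uploads
let threshold = 30                  // hypothetical reporting threshold

if flaggedCount(photos: library, knownHashes: knownHashes) >= threshold {
    print("Threshold crossed — in Apple's design, flagged for human review")
}
```

The threshold is what Apple leans on in its answer to the coalition: a single stray match does nothing, and the hash list itself is meant to be auditable.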

Despite this, the coalition believes that Apple will be "installing surveillance software" on iPhones — something Apple will no doubt strongly dispute.

Oliver Haslam
Contributor

Oliver Haslam has written about Apple and the wider technology business for more than a decade with bylines on How-To Geek, PC Mag, iDownloadBlog, and many more. He has also been published in print for Macworld, including cover stories. At iMore, Oliver is involved in daily news coverage and, not being short of opinions, has been known to 'explain' those thoughts in more detail, too. Having grown up using PCs and spending far too much money on graphics cards and flashy RAM, Oliver switched to the Mac with a G5 iMac and hasn't looked back. Since then he's seen the growth of the smartphone world, backed by iPhone, and new product categories come and go. Current expertise includes iOS, macOS, streaming services, and pretty much anything that has a battery or plugs into a wall. Oliver also covers mobile gaming for iMore, with Apple Arcade a particular focus. He's been gaming since the Atari 2600 days and still struggles to comprehend the fact that he can play console-quality titles on his pocket computer.