What you need to know
- Epic Games CEO Tim Sweeney has joined the chorus of criticism in the wake of Apple's Child Safety announcement.
- Sweeney criticized iCloud for collecting user data by default and burying its opt-out settings.
- He also claimed, incorrectly, that Apple's new policy would soon see Apple scanning emails and reporting recipients of illegal content to the police.
Epic Games CEO Tim Sweeney has joined the chorus of criticism over Apple's new Child Safety measures, but also shared misleading information about Apple's plans.
In the wake of Apple's announcement, Sweeney, whose company is locked in a legal battle with Apple over its App Store and the iOS ecosystem, took to Twitter to criticize Apple's iCloud platform:
It's atrocious how Apple vacuums up everybody's data into iCloud by default, hides the 15+ separate options to turn parts of it off in Settings underneath your name, and forces you to have an unwanted email account. Apple would NEVER allow a third party to ship an app like this.
Sweeney said he had confirmed that he was unable to delete his iCloud email address without deleting his entire Apple ID, expressing his frustration at the restriction.
Sweeney's comments follow Apple's recently announced Child Safety measures, which include scanning photos uploaded to iCloud against a hash database of known CSAM provided by the NCMEC and other child safety organizations.
However, in noting one possible solution to his iCloud issue, Sweeney claimed Apple's new policy would somehow also extend to scanning emails for illegal content and reporting the recipient to the police. When told about the toggle to turn off Mail in iCloud settings, Sweeney replied:
Yes, that prevents your phone from syncing your forced iCloud email account, but it's still there, accumulating spam and god knows what else - if an adversary emails you something awful, then soon Apple's new policy will be to scan it and report you to the police.
When asked in a reply "so if someone emails me something illegal they will report ME to the police?" Sweeney replied "Yes", linking to an article from the Electronic Frontier Foundation regarding Apple's plans.
The linked article doesn't mention email anywhere, and neither does Apple's CSAM scanning policy, which pertains only to images uploaded by users to iCloud Photos. Apple has confirmed to numerous outlets that switching off the service means Apple can't scan for CSAM hashes. From John Gruber's Daring Fireball:
The CSAM detection for iCloud Photo Library is more complicated, delicate, and controversial. But it only applies to images being sent to iCloud Photo Library. If you don't use iCloud Photo Library, no images on your devices are fingerprinted. But, of course, most of us do use iCloud Photo Library.
As The Verge notes, Apple and other email providers have used hashes to scan for CSAM images in emails for years, so it is unclear exactly what Sweeney is referring to here.
According to the new policy, any illegal content sent by an "adversary" would have to be opened, saved by the user, and uploaded to iCloud Photos before it could be flagged by Apple's system. Even then, a flag wouldn't lead directly to Apple reporting someone to the police: the CSAM system has a threshold (the exact number isn't public) of matches that must be reached before the flags are reviewed by Apple to ensure reporting accuracy.
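The match-and-threshold flow described above can be sketched in a few lines of Python. This is purely illustrative: the hash set, the threshold value, and the use of SHA-256 as a stand-in for Apple's perceptual NeuralHash are all assumptions, not Apple's actual implementation.

```python
import hashlib

# Stand-in for the NCMEC-provided database of known CSAM hashes (illustrative).
KNOWN_HASHES = {"deadbeef", "cafebabe"}

# Apple has not published the real threshold; this number is made up.
REVIEW_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash (here: plain SHA-256)."""
    return hashlib.sha256(image_bytes).hexdigest()

def process_upload(account_flags: list, image_bytes: bytes) -> bool:
    """Match one uploaded image against the known-hash set.

    Returns True only when the account's accumulated matches cross the
    threshold -- i.e. nothing is surfaced for human review on a single hit.
    """
    h = image_hash(image_bytes)
    if h in KNOWN_HASHES:
        account_flags.append(h)
    return len(account_flags) >= REVIEW_THRESHOLD
```

The point of the threshold in this sketch is the same one Apple makes: a single match (or a single malicious email attachment, even if saved and uploaded) does not by itself trigger review or reporting.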
Tim Sweeney says he plans to share "some very detailed thoughts" on Apple's new Child Safety Policy later.
On Thursday, Apple also announced a new Messages feature that uses on-device machine learning to analyze images sent over Messages to children in a shared family iCloud account, as well as new guidance for Siri and Search that intervenes when a user tries to search for CSAM-related material.