Apple confirms CSAM checks will be carried out on photos already in iCloud
What you need to know
- Apple has confirmed that photos already uploaded to iCloud will also be checked against CSAM hashes when the feature goes live this fall.
- Existing photos will be checked over time, although it isn't clear how long it will take.
Apple has confirmed that its controversial new CSAM detection system will check for known child abuse imagery in any image that is already in iCloud — not just new images as they are uploaded.
During media briefings aimed at clarifying how the new CSAM system works, Apple confirmed that any photo already in iCloud will be checked against known CSAM hashes over time, though the company hasn't said how long that process will take. The confirmation comes after some wondered whether the checks would only be carried out against new images as they were uploaded to iCloud Photos.
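To make the mechanism concrete, here is a minimal sketch of what checking a photo against a set of known hashes looks like. It is an illustration only: it substitutes a plain SHA-256 digest for Apple's NeuralHash perceptual hash and an in-memory set for the encrypted on-device database, and the names `loadKnownHashDatabase` and `matchesKnownCSAM` are hypothetical. Apple's actual protocol layers private set intersection and a match threshold on top, so no single match is ever visible or reported.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the known-hash database. In Apple's design the
// encrypted database ships inside the OS image; here it is just an empty set.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownHashes: Set<String> = loadKnownHashDatabase()

// Returns true if the photo's digest appears in the known-hash set.
// A cryptographic digest is used here purely for illustration; Apple's
// NeuralHash is a perceptual hash that tolerates resizing and recompression.
func matchesKnownCSAM(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Usage: the same check applies to new uploads and, per Apple's confirmation,
// to photos already stored in iCloud, scanned over time.
let photos: [Data] = []  // stand-in for the user's photo library
let flagged = photos.filter { matchesKnownCSAM($0) }
print("Matches against known hashes: \(flagged.count)")
```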
Apple recently published a new CSAM FAQ in an attempt to allay fears that the child abuse checking system could be used by governments and other agencies for nefarious purposes. Apple maintains that isn't possible, saying it will refuse any demands to tweak the CSAM hash checking so it could also look for other things, including propaganda, anti-government memes, and more.
iMore has also published a CSAM FAQ to help clear matters up, and it's worth a read for anyone who is still confused or has concerns about the system and its intentions.
The CSAM functionality will go live in the United States only when iOS 15 and iPadOS 15 launch this fall, likely alongside the new iPhone 13 lineup.
I guess Android phones with SD card slots will get a little more popular. Store only on device and back up to your own storage. I don't have a problem with this, generally, but can false positives bring massive headaches to affected users, or cause their privacy to be violated by "follow-up checks" to verify them? Sorry, but I have an issue with them being able to check every private memory captured against a database. This is really starting to get to unreasonable levels. I'll likely drop my iCloud storage and shut off iCloud Photos, because I really can't be bothered to risk any inconvenience due to an algorithm.

How transparent is this system even going to be? Will it flag immediately and allow you to just not upload those items? Will it allow appeals? Or will it flag on their end and give them the right to rummage through all of your **** looking for "evidence"? These algorithms can be like people making false accusations, resulting in blanket search warrants on all your ****. Your own baby photos can result in your privacy being totally violated.