What you need to know
- Apple has confirmed that photos already uploaded to iCloud will also be checked against CSAM hashes when the feature goes live this fall.
- Existing photos will be checked over time, although it isn't clear how long it will take.
Apple has confirmed that its controversial new CSAM detection system will check for known child abuse imagery in any image that is already in iCloud — not just new images as they are uploaded.
During media briefings aimed at clarifying how the new CSAM system works, Apple confirmed that any photo already in iCloud will be checked against known CSAM hashes over time, though the company hasn't said how long that process will take. The confirmation comes as some wondered whether the checks would only be carried out against new images as they were uploaded to iCloud Photos.
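At its core, this kind of check is a membership test: each photo is reduced to a fingerprint, which is then compared against a database of known hashes. The sketch below is purely illustrative — Apple's actual system uses its NeuralHash perceptual hash and on-device private set intersection, not a plain cryptographic hash lookup like this — but it shows the general shape of scanning an existing library against a known-hash set.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Toy stand-in for a perceptual hash. Apple's real system uses
    # NeuralHash, which is designed to survive resizing and re-encoding;
    # a cryptographic hash like SHA-256 would not.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known fingerprints (placeholder content).
known_hashes = {
    fingerprint(b"known-image-1"),
    fingerprint(b"known-image-2"),
}

def scan_library(images: list[bytes]) -> int:
    # Walk the whole library and count matches against the known-hash
    # set -- the rough equivalent of checking already-uploaded photos
    # over time, rather than only new uploads.
    return sum(1 for img in images if fingerprint(img) in known_hashes)

library = [b"known-image-1", b"holiday-photo", b"pet-photo"]
print(scan_library(library))  # one photo in this toy library matches
```

Because the comparison is a set lookup, scanning an existing library is the same operation as checking a new upload, just applied to more photos — which is why extending the checks to already-uploaded images is straightforward from an engineering standpoint.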
Apple recently published a new CSAM FAQ in an attempt to allay fears that the child abuse checking system could be used by governments and other agencies in nefarious ways. Apple maintains that isn't possible, saying that it will refuse any demands to tweak CSAM hash checking so it can also check for other things — including propaganda, anti-government memes, and more.
> Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.
iMore has also published a CSAM FAQ to help clear matters up, and it's worth a read for anyone who is still confused or has concerns about the system and its intentions.