What you need to know
- Apple has released a new document further detailing the security implications of its CSAM detection plans.
- Apple will publish details of the encrypted CSAM hash database that will be on all iPhones and iPads.
- Security researchers will be able to inspect the database to confirm its validity.
Apple today released a new document that it hopes will go some way to allaying fears surrounding the security of its new CSAM detection system. The document carries the lofty title of "Security Threat Model Review of Apple's Child Safety Features" and is available on Apple's website.
In the document, Apple explains that it will publish a further Knowledge Base article containing the root hash of the encrypted CSAM hash database that will itself be included in all versions of iOS and iPadOS. The idea is that security researchers will be able to compare the database on their devices with the one on Apple's servers, ensuring that it hasn't been tampered with in any way. That's just one of the methods Apple will now employ to ensure the database of CSAM being checked is legitimate.
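To give a sense of what that comparison looks like in practice, here is a minimal sketch of verifying a local database file against a published root hash. Apple has not specified the exact hashing scheme, so SHA-256 and the function names below are assumptions for illustration only, not Apple's actual implementation.

```python
import hashlib


def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def matches_published_hash(db_path: str, published_hex: str) -> bool:
    """Compare the on-device database file against a published root hash.

    A mismatch would indicate the local database differs from the one
    Apple has publicly committed to.
    """
    return file_sha256(db_path) == published_hex.lower()
```

A researcher could run such a check against the hash value Apple publishes in its Knowledge Base article and the database file shipped on their device.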
Apple also says that the approach will allow third-party technical audits of its system, including the encrypted CSAM database.
In a media briefing, Apple further confirmed that the CSAM threshold at which photos will be flagged for manual review is 30. That means an iCloud account will need to have 30 pieces of CSAM content discovered before a manual review takes place. Apple says that the number was never intended to be kept private because security researchers would have been able to discover it regardless. And while some may find 30 high, it is thought to be far lower than the number of images typically found in an offender's library.
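The gating logic behind that threshold can be sketched as a simple per-account counter. Note this is only an illustration: in Apple's actual design the threshold is enforced cryptographically, so the server cannot learn anything about matches below the threshold, and the constant and names below are hypothetical.

```python
from dataclasses import dataclass

# Per Apple's briefing: 30 matches are required before manual review.
REVIEW_THRESHOLD = 30


@dataclass
class Account:
    match_count: int = 0


def record_match(account: Account) -> bool:
    """Record one CSAM hash match for an account.

    Returns True once the account has reached the review threshold,
    i.e. the point at which manual review would be triggered.
    """
    account.match_count += 1
    return account.match_count >= REVIEW_THRESHOLD
```

In this simplified model, the first 29 matches return False and only the 30th flips the flag for review.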
The full document can be read on Apple's website and may go some way to making people more comfortable with the new CSAM detection system. Apple was also keen to point out that the current dialogue surrounding the system is something it anticipated and built into the feature's rollout timeline.
Oliver Haslam has written about Apple and the wider technology business for more than a decade with bylines on How-To Geek, PC Mag, iDownloadBlog, and many more. He has also been published in print for Macworld, including cover stories. At iMore, Oliver is involved in daily news coverage and, not being short of opinions, has been known to 'explain' those thoughts in more detail, too.
Having grown up using PCs and spending far too much money on graphics cards and flashy RAM, Oliver switched to the Mac with a G5 iMac and hasn't looked back. Since then he's seen the growth of the smartphone world, backed by iPhone, and new product categories come and go. Current expertise includes iOS, macOS, streaming services, and pretty much anything that has a battery or plugs into a wall. Oliver also covers mobile gaming for iMore, with Apple Arcade a particular focus. He's been gaming since the Atari 2600 days and still struggles to comprehend the fact that he can play console-quality titles on his pocket computer.