What you need to know
- Security researchers have found the source code for Apple's CSAM detection.
- Initial reports suggest that there may be flaws in the technology.
Reports indicate that Apple's CSAM detection technology may be flawed, after code for the system was allegedly found in iOS 14.3.
The Verge reports:
A Reddit user posted reverse-engineered code allegedly from the new CSAM system, stating: "Believe it or not, this algorithm already exists as early as iOS 14.3, hidden under obfuscated class names. After some digging and reverse engineering on the hidden APIs, I managed to export its model (which is MobileNetV3) to ONNX and rebuild the whole NeuralHash algorithm in Python. You can now try NeuralHash even on Linux!"
According to Asuhariet Ygvar, testing indicates the CSAM technology "can tolerate image resizing and compression, but not cropping or rotations". This is at odds with the technical assessments Apple has provided.
Another concern raised about the technology is collisions, where two different images generate the same hash. In theory, this could be used to fool the system into detecting images that don't actually contain CSAM. However, as The Verge explains, this would require "extraordinary efforts to exploit" and wouldn't get past Apple's manual review process.
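Apple's actual NeuralHash model can't be reproduced here, but the behavior Ygvar describes, and the collision concern, are general properties of perceptual hashes. The sketch below uses a much simpler "average hash" (not Apple's algorithm) on a synthetic NumPy gradient image to show how such a hash can survive resizing, how a trivially different image can collide with the original, and how rotation breaks the match. All names and the example image are illustrative.

```python
import numpy as np

def average_hash(img, hash_size=8):
    """Toy perceptual hash: block-average the image down to
    hash_size x hash_size, then threshold each block against the mean."""
    h, w = img.shape
    img = img[:h - h % hash_size, :w - w % hash_size]  # trim to a multiple
    blocks = img.reshape(hash_size, img.shape[0] // hash_size,
                         hash_size, img.shape[1] // hash_size).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

# A synthetic 64x64 diagonal-gradient "image".
img = np.add.outer(np.arange(64.0), np.arange(64.0))
h_orig = average_hash(img)

# Resizing (2x nearest-neighbour upscale) leaves the hash unchanged.
h_resized = average_hash(np.kron(img, np.ones((2, 2))))

# A uniform brightness shift is a different image with the same hash:
# a "collision" for this toy hash.
h_brighter = average_hash(img + 10)

# Rotation, by contrast, changes the hash.
h_rotated = average_hash(np.rot90(img))

print(np.array_equal(h_orig, h_resized))   # True
print(np.array_equal(h_orig, h_brighter))  # True: a collision
print(np.array_equal(h_orig, h_rotated))   # False
```

Real perceptual hashes like NeuralHash are far more sophisticated, but the same trade-off holds: robustness to benign transformations is exactly what makes deliberately engineered collisions possible, which is why Apple layers a threshold, a second server-side hash, and human review on top.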
Ygvar said they hoped that the source code would help researchers "understand NeuralHash algorithm better and know its potential issues before it's enabled on all iOS devices."
In response to these revelations, Apple told iMore that the representation of reverse-engineering in this instance isn't accurate, and that the company has designed its NeuralHash algorithm to be publicly available so that security researchers can investigate it. It also states that the version analyzed in the story is a generic version of its NeuralHash technology, not the final version coming to detect CSAM in iCloud Photos. Apple says that perceptual hashes, by definition, can be fooled into thinking two different images are the same, and that the security of its CSAM scanning takes this into account. Collisions are expected and don't undermine the security of the system. For starters, the on-device CSAM hash database is encrypted, so an attacker like the one described above could not generate collisions against known CSAM. Apple further notes that when the CSAM threshold is crossed, a second, independent perceptual hash algorithm analyzes photos matched against known CSAM. This second algorithm runs server-side and wouldn't be available to attackers. From Apple:
This safeguard is key in ensuring that your account can't be flagged because of images that don't contain CSAM, but might trigger an alert because the hashes match.
Finally, Apple again emphasized that its CSAM detection is subject to human review, so that even if enough collisions trigger an alert, the review process could identify them should your account be falsely flagged because you had been sent images whose hashes matched the CSAM database but were not in fact CSAM material.
Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.
Before becoming a writer Stephen studied Ancient History at University and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple.
MobileNetV3 is from Google, which open-sourced it. So Apple's CSAM system used that specific model, and Apple wrote some background code to perform detection on the customer's iPhone images when that customer decides to upload any images to their iCloud storage.