What you need to know
- WhatsApp's CEO has said the platform will not adopt a system like Apple's new child-safety measures, which scan for CSAM in iCloud Photos.
- Will Cathcart said the move was the wrong approach and a setback for people's privacy all over the world.
- Apple says Cathcart is wrong about some of the assertions made in his comments.
WhatsApp CEO Will Cathcart has said he is "concerned" by Apple's recently announced plans to scan photos uploaded to iCloud Photos for Child Sexual Abuse Material (CSAM) by matching the hashes of photos against a known database of CSAM images provided by the NCMEC and other organizations that protect children.
Cathcart called Apple's plan "the wrong approach and a setback for people's privacy all over the world." He added that when people asked whether WhatsApp would adopt this system, his answer was no.
Cathcart pointed to some of WhatsApp's own work in the sector and agreed that Apple "has long needed to do more to fight CSAM," but he criticized the approach Apple announced earlier this week.
Cathcart described the plan as "an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control." He also argued that "countries where iPhones are sold will have different definitions on what is acceptable," raising concerns about how the system could be used in China and what might be considered illegal there.
Apple has confirmed to iMore that Cathcart's comments miss some of the key facts regarding its new system. Cathcart states that Apple "has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone." Apple denies this, saying that its system can only detect CSAM images in iCloud Photos and that if a user turns off iCloud Photos, it simply won't work. Nor can it detect any images that aren't known CSAM images, as noted by one of the project's technical validators:
The system is only designed to detect *known* CSAM images — it isn’t designed to detect images beyond those known CSAM images that are provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations.— Benny Pinkas (@bennypinkas) August 5, 2021
Cathcart further argued that "this is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control." Per the details of the announcement, Apple says that the CSAM image hashes come from the NCMEC and other child safety organizations, not Apple, and that nowhere in the process can Apple add anything to the set of hashes it is given. Because the unreadable hash list ships as part of the operating system, every device has the same set of hashes. Apple also pushed back on Cathcart's concern about other countries where iPhones are sold, stating that there is no way to modify or change the system for any specific region or device.
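For readers curious about the basic mechanics being debated, matching an image's hash against a fixed, device-wide list can be sketched as follows. This is a toy illustration only: Apple's actual system uses a perceptual hash (NeuralHash) and on-device cryptographic matching rather than plain SHA-256, and every name and value below is invented for illustration.

```python
import hashlib

# Hypothetical stand-in for the unreadable hash list that, per Apple,
# ships as part of the operating system, so every device holds the
# same set. The entry below is a made-up placeholder, not a real hash.
KNOWN_HASHES = {
    "0" * 64,
}

def image_hash(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash function like NeuralHash."""
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes: bytes, icloud_photos_enabled: bool) -> bool:
    """Return True only if the image matches a known hash.

    Per Apple's description, matching applies only to photos being
    uploaded to iCloud Photos; with iCloud Photos off, nothing is checked.
    """
    if not icloud_photos_enabled:
        return False
    return image_hash(image_bytes) in KNOWN_HASHES
```

The key properties Apple emphasizes map onto this sketch: the hash set is fixed and identical on every device, only matches against that set are detected, and disabling iCloud Photos bypasses the check entirely.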
Apple's recently announced measures have proven controversial, with some security experts and privacy advocates opposing the move. You can read the announcement from earlier this week here.
Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.
Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9