What you need to know
- The EU appears set to unveil controversial legislation to try to stop the spread of child sexual abuse material (CSAM).
- A leaked proposal appears to mandate the scanning of messages for CSAM.
- The provision appears to cover encrypted messages and includes measures for detecting grooming that could extend beyond images to text.
- WhatsApp chief Will Cathcart said the move "would force companies to scan every person's messages and put EU citizens' privacy and security at serious risk."
Update 5/11, 6:38 am ET: The EU has now published its proposals to fight child sexual abuse, matching the leaked document described below.
A newly leaked document appears to reveal that the EU plans to mandate that providers of messaging services like WhatsApp and iMessage scan messages to detect child sexual abuse material (CSAM) and the grooming of children.
The legislation includes "uniform obligations, applicable to all providers of hosting or interpersonal communication service offering such services in the EU's digital single market, to perform an assessment of risks of misuse of their services for the dissemination of known or new child sexual abuse material or for the solicitation of children (together defined as 'online child sexual abuse')" as well as more targeted obligations for certain providers "to detect such abuse, to report it via the EU Centre, to remove or disable access to, or to block online child sexual abuse material when so ordered."
The legislation also appears to include statements addressing providers who use end-to-end encryption technology (like WhatsApp). Security researcher Alec Muffett characterized the relevant paragraph as: "We want a backdoor, but we don't want just anyone to be able to use it. Only us good guys."
Matthew Green, professor of cryptography at Johns Hopkins, stated: "Speaking of actual free speech issues, the EU is proposing a regulation that could mandate scanning of encrypted messages for CSAM material. This is Apple all over again." This is a reference to Apple's own CSAM-scanning debacle last year, which would have seen Apple check iCloud images against known CSAM hashes, a measure possibly taken in anticipation of legislation like the one the EU appears to be working on.
Green said the document "is the most terrifying thing I've ever seen" and that the EU was "proposing a new mass surveillance system that will read private text messages." This is a reference to provisions beyond CSAM detection that will try to detect the "grooming" of children in messages, the implications of which seem to be clear:
"Let me be clear what that means: to detect 'grooming' is not simply searching for known CSAM. It isn't using AI to detect new CSAM, which is also on the table. It's running algorithms reading your actual text messages to figure out what you're saying, at scale." — Matthew Green (@matthew_d_green) May 10, 2022
Green said the measure was "the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR."
Will Cathcart, head of WhatsApp, said the plans were "incredibly disappointing" because they fail to protect end-to-end encryption. Cathcart said the measures "would force companies to scan every person's messages and put EU citizens' privacy and security at serious risk," and that mandating the proposed system "built for one purpose in the EU" could be used to undermine human rights "in many different ways globally."
"We look forward to working with the EU to inform the legislative process on how we ensure the safety of children, both offline and online," a Meta spokesperson told iMore. "We lead the industry in developing new ways to keep children safe online by preventing, detecting, and responding to abuse. Our focus is on preventing harm from happening in the first place by restricting adults from messaging teens they're not connected with and using the information available to us to identify potentially harmful activity. It's important that any measures adopted do not undermine end-to-end encryption which protects the safety and privacy of billions of people, including children."
The document is reportedly set to be officially unveiled on May 11.
Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.
Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9