
EU plans to mandate scanning of encrypted messages to stop child sexual abuse

WhatsApp message illustration (Image credit: Luke Filipowicz / iMore)

What you need to know

  • The EU appears set to unveil controversial legislation aimed at stopping the spread of child sexual abuse material.
  • A newly leaked proposal appears to mandate the scanning of messages for CSAM.
  • The provisions appear to cover encrypted messages and include measures for detecting grooming that could extend beyond images to text.
  • WhatsApp chief Will Cathcart said the move "would force companies to scan every person's messages and put EU citizens' privacy and security at serious risk."

Update: 5/11 6:38 am ET: The EU has now published its proposals to fight child sexual abuse, matching the leaked document described below.

A newly leaked document appears to reveal that the EU plans to mandate that providers of messaging services like WhatsApp and iMessage scan messages to detect child sexual abuse material (CSAM) and the grooming of children.

The document, shared online by Alec Muffett, states that the voluntary actions of platforms alone "have proven insufficient" to combat child sexual abuse online, and states:

The proposed Regulation consists of two main building blocks: first, it imposes on providers obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges, and, second, it establishes the EU Centre on Child Sexual Abuse as a decentralised agency to enable the implementation of the new Regulation.

The legislation includes "uniform obligations, applicable to all providers of hosting or interpersonal communication service offering such services in the EU's digital single market, to perform an assessment of risks of misuse of their services for the dissemination of known or new child sexual abuse material or for the solicitation of children (together defined as 'online child sexual abuse')" as well as more targeted obligations for certain providers "to detect such abuse, to report it via the EU Centre, to remove or disable access to, or to block online child sexual abuse material when so ordered."

The legislation also appears to address providers who use end-to-end encryption technology (like WhatsApp):

Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.

Muffett summed up the paragraph as: "We want a backdoor, but we don't want just anyone to be able to use it. Only us good guys."

Matthew Green, professor of cryptography at Johns Hopkins, stated: "Speaking of actual free speech issues, the EU is proposing a regulation that could mandate scanning of encrypted messages for CSAM material. This is Apple all over again." He was referring to Apple's own CSAM scanning debacle last year, a plan that would have seen Apple check iCloud images against known CSAM hashes, possibly in anticipation of legislation like the one the EU appears to be working on.
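For context, hash-based detection of known images works by comparing a fingerprint of each image against a database of fingerprints of previously identified abuse material. Here is a minimal sketch of the idea in Python; the hash value and function names are hypothetical, and it uses an ordinary cryptographic hash purely for illustration, whereas real systems (such as Microsoft's PhotoDNA or Apple's NeuralHash) use perceptual hashes so that resized or re-encoded copies still match:

```python
import hashlib

# Hypothetical set of fingerprints of known abuse imagery.
# Real deployments match against perceptual hashes, not SHA-256,
# so near-duplicate images are also caught.
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex fingerprint of the image (SHA-256 here for illustration)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """Report whether the image's fingerprint appears in the known-hash set."""
    return fingerprint(image_bytes) in KNOWN_CSAM_HASHES
```

The controversy is less about the matching itself than about where it runs: a check like the one above only flags known images, and mandating that it happen inside end-to-end encrypted services is what critics like Muffett characterize as a backdoor.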

Green said the document "is the most terrifying thing I've ever seen" and that the EU was "proposing a new mass surveillance system that will read private text messages." This refers to provisions beyond CSAM detection that aim to detect the "grooming" of children in messages, extending scanning from images into the text of conversations.

Green said the measure was "the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR."

Will Cathcart, head of WhatsApp, said the plans were "incredibly disappointing" because they fail to protect end-to-end encryption. Cathcart said the measures "would force companies to scan every person's messages and put EU citizens' privacy and security at serious risk" and that mandating the proposed system "built for one purpose in the EU" could be used to undermine human rights "in many different ways globally."

"We look forward to working with the EU to inform the legislative process on how we ensure the safety of children, both offline and online," a Meta spokesperson told iMore. "We lead the industry in developing new ways to keep children safe online by preventing, detecting, and responding to abuse. Our focus is on preventing harm from happening in the first place by restricting adults from messaging teens they're not connected with and using the information available to us to identify potentially harmful activity. It's important that any measures adopted do not undermine end-to-end encryption which protects the safety and privacy of billions of people, including children."

The proposals were reportedly set to be officially unveiled on May 11 and, as noted in the update above, have now been published.

Stephen Warwick
News Editor


6 Comments
  • Hooray for Socialism, where everything is for “the good of the state”, but there are no individual rights. 🙄 I have a suggestion. Apple, Google, Samsung etc. should just stop doing business in the Socialist Union. The resulting riots would make these pinheads realize how stupid these ideas are. Or at least, all companies involved should “respectfully decline” to implement this nightmare spying. What is the Socialist Union going to do? Throw a Hissy Fit? Stop trying to regulate everything. Stop trying to change the way successful companies do business. You are foolish politicians. You have NO idea how to run a business.
  • I agree. Why are we letting the EU dictate everything over obviously freer countries like the US. This is just another case of small man's syndrome.
  • Before the EU decided to get involved with any CSAM content, which company wanted to put CSAM scanning software on every iOS device? Yep, you guessed it. Apple wanted to install CSAM detection software onto every Apple device last year. Apple did put the next best thing onto every iOS device, and that is its nudity detection software, which scans for nude images on any iOS device, and also checks to see if there is any nudity in any messages as well. Apple added this nudity detection software under the guise that it would only be used by parents, and be enabled only on their kids' iOS devices. Yet that nudity detection software is installed onto every iOS device running the latest version of iOS 15. So what's to stop Apple from now scanning for any specific child images, and enabling this scanning on any iOS customer that they want to scan? Nothing!
  • Don’t worry, it’s coming here soon.
  • delete this post
  • Well, considering Apple already has its nudity detection software on all iOS 15 devices, which can already scan for any and all nude images, including scanning for any nudity in any message apps as well. This nudity detection software is already installed on every iPhone running the latest version of iOS 15. So it should take very little effort on Apple's part to now scan for any CSAM content or images. Also don't forget that Apple wanted this CSAM scanning software on every iOS device before the EU got involved.