German politician calls on Apple to halt CSAM scanning plans


What you need to know

  • Apple's CSAM scanning plans continue to attract unwanted attention.
  • A German politician has written to Apple calling on the company to halt its plans.

Germany's Manuel Höferlin, the Free Democratic Party's (FDP) digital policy spokesman, has called on Apple to halt its plans to scan photos uploaded to iCloud for child sexual abuse material (CSAM).

As iFun reports:

Höferlin, who as an IT entrepreneur and chairman of the Bundestag's Digital Agenda Committee knows the technology well, sees a great danger in Apple's plans. According to Höferlin, CSAM scanning, i.e. the on-device comparison of private photos against the digital fingerprints of a database of known abuse images, must be described as "the biggest breach of the confidentiality of communication that we have experienced since the invention of the Internet".

The FDP says that it welcomes Apple's attempt to tackle the problem of child exploitation, but is worried about the implementation:

Every piece of scanned content erodes the trust users place in the fact that the content of their communication is not being monitored unnoticed. An Internet without trusted communication is no longer a civilizational advance, but the greatest surveillance instrument in history.

Höferlin invokes some of the arguments made against the practice by the Electronic Frontier Foundation, which has warned that Apple's CSAM scanning opens a backdoor into iOS that could be exploited by governments.

The letter calls on Tim Cook directly to halt the plans to scan for CSAM content, stating:

This will not only spare your own company many foreseeable problems, but also protect the Achilles' heel of the modern information society! Please stand with those who defend the civilizational achievement of a free network!


3 Comments
  • No mention at all of the other services that already do these kinds of checks when you upload content. It’s never been a problem until now, apparently.
  • There is a huge difference between photo-sharing sites (Instagram, Facebook, etc.) and someone scanning your private photos that are NOT being shared with the entire planet. Apple needs to drop this plan immediately. Yes, it sounds noble, but the idea that ALL of EVERYONE'S pictures are being scanned is just creepy. Particularly when Apple loudly advertises that it is all about privacy. Let's face it: this is an unbelievably bad idea, from a company that normally has unbelievably GREAT ideas. Not to mention, a company that we used to trust.
  • Maybe you need to read the process and technology document. Here you go:
    https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
    The bottom line is that nothing leaves your device unless a match is found. Comparisons are made on device. Scanning isn't really the right term in any case. A hash is made of your photos, and if it matches a hash for known kiddy **** in a database that is resident on your device, the fact that it matched is uploaded with your ****. Your photos are not 'scanned' on your device or in iCloud. Nobody 'sees' your stuff unless you upload bad stuff too often, and then only that stuff is accessible, for validation purposes. Everything is still as secure and private as it was, unless you upload your kiddy **** to iCloud. If you have a more reasonably thought-out way of keeping this stuff from being passed around, feel free to chime in.
    (I'm leaving the censored stuff as is. You can figure it out)
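The comment above describes the general shape of hash-based matching: compute a fingerprint of each photo on device, compare it against a database of known hashes, and only surface anything once the number of matches crosses a threshold. The sketch below illustrates that idea in Swift using a plain SHA-256 digest as a stand-in; Apple's actual system uses the NeuralHash perceptual hash and cryptographic threshold secret sharing, and all names here are hypothetical, not Apple's API.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device database of known-image fingerprints.
// Apple's real system ships blinded NeuralHash values, not plain digests.
struct KnownHashDatabase {
    let hashes: Set<String>
    func contains(_ digest: String) -> Bool { hashes.contains(digest) }
}

// Compute a fingerprint of a photo's data. SHA-256 is used here only for
// illustration; a perceptual hash would tolerate resizing and re-encoding.
func fingerprint(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Nothing about individual photos is revealed until the match count
// crosses a threshold, mirroring the thresholded design described above.
func shouldFlagAccount(photos: [Data],
                       database: KnownHashDatabase,
                       threshold: Int) -> Bool {
    let matchCount = photos
        .map { fingerprint(of: $0) }
        .filter { database.contains($0) }
        .count
    return matchCount >= threshold
}
```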