Digital rights organization warns of iPhone photo scanning issues

iPhone 12 Pro Review (Image credit: Daniel Bader / Android Central)

What you need to know

  • A digital rights organization has warned of issues with Apple's new child safety measures.
  • It says the company has created a backdoor into its data storage and messaging systems.

Digital rights organization the EFF has warned of potential issues surrounding Apple's recently announced plans to scan iCloud photos for Child Sexual Abuse Material (CSAM).

The organization stated:

Apple has announced impending changes to its operating systems that include new "protections for children" features in iCloud and iMessage. If you've spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.

Whilst the foundation says child exploitation is a "serious problem", it argues that Apple is bending its privacy stance on the matter and has created a backdoor to user privacy:

To say that we are disappointed by Apple's plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple's compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company's leadership in privacy and security.

The report misstates some of the issues at play, claiming falsely that Apple "is planning to install" the new features on every Apple device. It does, however, correctly note that Apple's new measures operate on the device itself, and says:

We've said it before, and we'll say it again now: it's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses.

The organization says that "people have the right to communicate privately without backdoors or censorship", including minors, and that Apple should not follow through with its plans. You can read the full report here.

Apple recently announced new measures that include scanning photos uploaded to iCloud and checking their hashes against an existing database of known child abuse imagery. The plans have been met with very mixed reactions and have been vocally criticized by some security experts and privacy advocates, including Edward Snowden.
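For readers curious what "checking hashes against a database" looks like in the abstract, here is a minimal Swift sketch. It is not Apple's implementation: Apple's announced system reportedly relies on a perceptual hash (NeuralHash) and a private set intersection protocol rather than a plain cryptographic digest, and the names `knownHashes` and `matchesKnownDatabase` below are hypothetical, used only to illustrate the general idea of matching an upload against a set of known image hashes.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. SHA-256 stands in for the perceptual hashing
// Apple describes; a real system would tolerate resizing/re-encoding of
// images, which an exact cryptographic hash does not.

// Hypothetical database of hex-encoded hashes of known abusive images.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Compute a hex digest of the raw image bytes.
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Return true if the image's hash appears in the known-hash database.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownHashes.contains(hexDigest(of: imageData))
}

// Usage: check a photo's bytes before it is (hypothetically) flagged on upload.
let photo = Data("example image bytes".utf8)
print(matchesKnownDatabase(photo)) // false for arbitrary data
```

The design point the EFF is objecting to is not the matching itself but where it runs: performing a check like this on the user's device, against a database the user cannot inspect, is what the organization characterizes as a client-side scanning backdoor.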

Stephen Warwick
News Editor

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.

Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9