Digital rights organization warns of iPhone photo scanning issues
What you need to know
- A digital rights organization has warned of issues with Apple's new child safety measures.
- It says the company has created a backdoor into its data storage and messaging system.
Digital rights organization the EFF has warned of potential issues surrounding Apple's recently announced plans to scan iCloud photos for Child Sexual Abuse Material (CSAM).
While the foundation says child exploitation is a "serious problem," it argues that Apple is bending its privacy stance on the matter and has created a backdoor to user privacy.
The EFF's article misunderstands some of the issues at play, falsely stating that Apple "is planning to install" new features on every Apple device. It does, however, note that Apple's new measures take place on the device itself.
The organization says that "people have the right to communicate privately without backdoors or censorship," including when they are minors, and that Apple should not follow through with its plans. You can read the full report here.
Apple recently announced new measures that include scanning photos uploaded to iCloud and checking their hashes against an existing database of known images depicting child abuse. The plans have been met with very mixed reactions and have been vocally criticized by some security experts and privacy advocates, including Edward Snowden.
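As a rough illustration of the matching step described above: Apple's actual system uses a proprietary perceptual hash ("NeuralHash") and cryptographic threshold techniques, none of which are shown here. This sketch only demonstrates the general idea of comparing an image's hash against a database of known hashes; the function names and placeholder data are hypothetical.

```python
import hashlib

# Placeholder standing in for a database of hashes of known images.
# In the real system these would be perceptual hashes supplied by
# child-safety organizations, not SHA-256 digests computed here.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_database(b"known-image-bytes"))  # True: hash is in the set
print(matches_known_database(b"ordinary-photo"))     # False: no match
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash is designed to also match visually similar (resized, recompressed) versions, which is why Apple's system uses one.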
Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.
Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest breaking Apple news, as well as featuring fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9
First of all, it was already announced last year that all customers' iCloud images would be scanned for child abuse material on Apple's iCloud servers. This only adds more scanning to that, as well as performing local, on-device scanning of a user's iPhone or iPad images without sending those images to Apple's iCloud servers. The image hashes that Apple will send to every iPhone, iPad, and other Apple device are derived from actual images of naked body parts, and more. But ask yourself this: if Apple and the authorities have access to the hashed-image database that can and will be sent to every Apple device, what's to stop them from adding in other types of images? Like guns, bombs, drugs, or even the faces of certain persons of interest? Absolutely nothing! Apple is truly the modern-day Orwell. "Privacy is a fundamental human right." Lol, yeah, right! What a joke. Apple sets its customers up with scare tactics and fear-mongering, and now implements this. I guess it's true then: Apple loves to subjugate its customers.
This is a colossal mistake. Authoritarian governments will leap at the chance to order Apple to scan their citizens' phones for whatever the flavor of the month happens to be.