What you need to know
- Apple unveiled controversial new Child Safety measures last week.
- The company has now posted an FAQ to tackle some questions and concerns about the policy.
- In particular, it seems keen to address the difference between CSAM scanning and a new communication safety feature in Messages.
Apple has published a new FAQ to tackle some of the questions and concerns raised regarding its new Child Safety measures announced last week.
The new FAQ accompanies a series of technical explanations of recently announced technology that can detect Child Sexual Abuse Material uploaded to iCloud Photos, as well as a new feature that uses machine learning to identify if a child sends or receives sexually explicit images. In the new FAQ, Apple appears particularly concerned with establishing the difference between these two new features.
Questions also cover concerns about Messages being shared with law enforcement, whether Apple is breaking end-to-end encryption, CSAM images, scanning photos, and more. The FAQ also addresses whether CSAM detection could be used to detect anything else (no), and whether Apple would add non-CSAM images to the technology at the behest of a government (Apple says it will refuse any such demands).
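For readers unfamiliar with the mechanism the FAQ is describing, here is a heavily simplified sketch of the match-before-upload idea. Apple's real system uses a perceptual "NeuralHash" combined with private set intersection and threshold secret sharing; none of that cryptography is reproduced below, and every name in this sketch is invented for illustration.

```swift
import Foundation

// Hypothetical sketch only. In Apple's design the on-device database is
// blinded so the device cannot read it, and matches are only revealed
// server-side past a threshold; this plain set skips all of that.

struct PhotoHash: Hashable {
    let bytes: [UInt8]   // output of a perceptual hash function (assumed)
}

final class CSAMMatcher {
    // Known-CSAM hash database shipped on-device (hypothetical plain set).
    private let knownHashes: Set<PhotoHash>

    init(knownHashes: Set<PhotoHash>) {
        self.knownHashes = knownHashes
    }

    // Checked only for photos about to be uploaded to iCloud Photos,
    // mirroring Apple's claim that photos never destined for iCloud
    // Photos are never scanned.
    func shouldFlag(uploadCandidate hash: PhotoHash) -> Bool {
        knownHashes.contains(hash)
    }
}

// Usage: a photo queued for iCloud upload is hashed, then tested.
let matcher = CSAMMatcher(knownHashes: [PhotoHash(bytes: [0x01, 0x02])])
print(matcher.shouldFlag(uploadCandidate: PhotoHash(bytes: [0x03, 0x04])))  // false
```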
Apple's plans have raised concerns in the security community and have generated public outcry from figures such as NSA whistleblower Edward Snowden.
The problem is they've already created the tool. It doesn't matter now what Apple says; this will be implemented by other governments and authorities regardless. If Pegasus has shown us anything, this will be exploited. Oh, and before all the holier-than-thou types come out with "Think about the children": grow up. Anyone doing that kind of stuff will have, and has now, moved off Apple products. They haven't solved anything in doing this. What they have done, though, is shown that they can and will mount a man-in-the-middle attack to "do the right thing." Sorry, that ain't gonna fly.
The part that really gets me is Apple saying that only images uploaded to Apple's iCloud will get checked for bad images. Why even check for images on-device, if only the images sent to the user's iCloud will be checked against those CSAM hashes on-device? Also, Apple calls it neuralMatch, so clearly it is using AI/ML to see if any of the 200,000 sex-abuse image hashes from NCMEC match any of the user's photos. That is a lot of images for any computer, especially a mobile device, to run through. Maybe that is why Apple is extending battery life on the newer iPhone 13s, so more background processing can be performed on-device.
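For context on the scale worry in the comment above: hash matching is not a pairwise comparison of every user photo against 200,000 images. Each photo is hashed once, and the result is tested for membership in a pre-built set, an O(1) lookup regardless of database size. A hedged sketch of that cost model follows (hypothetical names throughout; SHA-256 stands in for a perceptual hash, though a real perceptual hash like NeuralHash tolerates resizing and recompression while SHA-256 does not):

```swift
import Foundation
import CryptoKit

// Hypothetical cost-model illustration, not Apple's implementation.
func hashStandIn(_ pixels: Data) -> Data {
    Data(SHA256.hash(data: pixels))
}

// Build a set of 200,000 dummy "known" hashes once, up front.
let knownHashes: Set<Data> = Set((0..<200_000).map { i in
    hashStandIn(Data("dummy-image-\(i)".utf8))
})

// Checking one photo costs a single hash computation plus one O(1)
// set lookup; the 200,000 entries are never iterated per photo.
let userPhoto = Data("user-photo-bytes".utf8)
let flagged = knownHashes.contains(hashStandIn(userPhoto))
print("flagged:", flagged)  // false for this dummy photo
```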
There is no way this is gonna happen. This will be quietly dropped. While this sounds like a noble idea, the fact that all of your pics are going to be scanned (for whatever "good reason") is just creepy. The bad press that Apple is getting for this is not going to go away. You can't advertise "Privacy is #1" while looking at everyone's pictures. This is a totally self-inflicted PR nightmare for Apple. Let me add that I am an Apple fan. I have a Mac Mini, multiple iPads and multiple iPhones. This is by far the most bizarre - and COMPLETELY unnecessary - "PR suicide" I have ever seen by a major company.
Apple has a terrible track record when it comes to refusing the demands of the Chinese government. They will knuckle under just like they always do and explain it away by claiming that they are just complying with local laws.