
Apple publishes Child Safety FAQ to address CSAM scanning concerns and more

Apple Child Safety (Image credit: Apple)

What you need to know

  • Apple unveiled controversial new Child Safety measures last week.
  • The company has now posted an FAQ to tackle some questions and concerns about the policy.
  • In particular, it seems keen to address the difference between CSAM scanning and a new communication safety feature in Messages.

Apple has published a new FAQ to tackle some of the questions and concerns raised regarding its new Child Safety measures announced last week.

The new FAQ accompanies a series of technical explanations of the recently announced technology that can detect Child Sexual Abuse Material (CSAM) uploaded to iCloud Photos, as well as a new feature that uses machine learning to identify whether a child sends or receives sexually explicit images. In the new FAQ, Apple appears particularly concerned with establishing the difference between these two features. A snippet:

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
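Put schematically, the two features operate on different principles: CSAM detection compares a fingerprint of each photo queued for upload to iCloud Photos against a database of hashes of already-known images, while communication safety runs an on-device classifier over images in Messages. The Swift sketch below is purely illustrative of that distinction; the perceptualHash function, the local hash set, and the classifier threshold are assumptions standing in for Apple's NeuralHash, blinded database, and machine-learning model, which are not publicly available.

```swift
import Foundation

// Illustrative sketch only, not Apple's implementation. The real system uses
// NeuralHash, a blinded on-device database, and private set intersection,
// none of which are modelled here.

/// Hypothetical stand-in for a perceptual image hash.
func perceptualHash(_ imageData: Data) -> String {
    // A real perceptual hash is robust to resizing and recompression;
    // this placeholder just hashes the raw bytes.
    String(imageData.hashValue)
}

/// CSAM detection path: only photos being uploaded to iCloud Photos are
/// checked, and only against fingerprints of already-known images.
func matchesKnownCSAM(_ photoQueuedForUpload: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(perceptualHash(photoQueuedForUpload))
}

/// Communication safety path: an on-device classifier judges whether a
/// Messages image is sexually explicit. Nothing is compared to a database
/// and no verdict leaves the device; a positive result only blurs the image
/// and shows a warning.
func shouldBlurInMessages(_ imageData: Data, classifier: (Data) -> Double) -> Bool {
    classifier(imageData) > 0.9 // threshold chosen purely for illustration
}
```

The contrast is the point: the first path only ever asks "is this a known image?", while the second makes a judgment about new images and keeps the answer on the device.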

The questions also cover concerns about Messages being shared with law enforcement, whether Apple is breaking end-to-end encryption, CSAM images, photo scanning, and more. The FAQ also addresses whether CSAM detection could be used to detect anything else (no), and whether Apple would add non-CSAM images to the technology at the behest of a government:

Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
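The last two sentences of that answer describe a gate on reporting: matches alone do not produce a report, and photos flagged in error lead to no action at all. Below is a minimal sketch of that decision flow, with the outcome names and the reviewer step modelled as assumptions drawn only from the FAQ's wording:

```swift
/// Possible outcomes for a flagged iCloud Photos account (illustrative only;
/// the states and their names are assumptions based on the FAQ wording).
enum ReviewOutcome {
    case noAction        // flagged photos turned out not to match known CSAM
    case reportToNCMEC   // a human reviewer confirmed the matches
}

/// Sketch of the gating the FAQ describes: a report is filed only when a
/// human reviewer confirms that flagged photos really match known CSAM;
/// otherwise the account is not disabled and nothing is sent to NCMEC.
func resolveFlaggedAccount(reviewerConfirmedMatch: Bool) -> ReviewOutcome {
    reviewerConfirmedMatch ? .reportToNCMEC : .noAction
}
```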

Apple's plans have raised concerns in the security community and have drawn public criticism from figures such as NSA whistleblower Edward Snowden.

Stephen Warwick
News Editor

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.

Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple.

4 Comments
  • The problem is they've already created the tool. Doesn't matter now what Apple says. This will be implemented by other governments and authorities regardless of what Apple says. If Pegasus has shown us anything, this will be exploited. Oh and before all the holier-than-thou come out with "Think about the children": grow up, anyone doing that kind of stuff will have and has now moved off any Apple products. They haven't solved anything in doing this. What they have done, though, is shown that they can and will mount a man-in-the-middle attack to "do the right thing". Sorry, dat ain't gonna fly.
  • The part that really gets me is Apple saying that only images that get uploaded to Apple's iCloud will get checked for any bad images. Why even check for images on-device, if only the images sent to the user's iCloud will be checked against those CSAM hashed images on-device? Also Apple calls it neuralMatch, so clearly it is using AI/ML to see if any of the 200,000 sex abuse image hashes from NCMEC match any of the user's photos. That is a lot of images for any computer, especially a mobile device, to run through. Maybe that is why Apple is extending the battery life on the newer iPhone 13s. So more background processing can be performed on-device (iPhones).
  • There is no way this is gonna happen. This will be quietly dropped. While this sounds like a noble idea, the fact that all of your pics are going to be scanned (for whatever "good reason") is just creepy. The bad press that Apple is getting for this is not going to go away. You can't advertise "Privacy is #1" while looking at everyone's pictures. This is a totally self-inflicted PR nightmare for Apple. Let me add that I am an Apple fan. I have a Mac Mini, multiple iPads and multiple iPhones. This is by far the most bizarre - and COMPLETELY unnecessary - "PR suicide" I have ever seen by a major company.
  • Apple has a terrible track record when it comes to refusing the demands of the Chinese government. They will knuckle under just like they always do and explain it away by claiming that they are just complying with local laws.