Apple software chief admits child protection measures have been 'widely misunderstood'

Craig Federighi at WWDC 2020 (Image credit: Apple)

What you need to know

  • Apple announced a series of controversial child protection measures last week.
  • The company's head of software, Craig Federighi, has admitted the measures have been "widely misunderstood".
  • He told The Wall Street Journal Apple wished the measures had come out "a little more clearly."

Apple's head of software, Craig Federighi, has told The Wall Street Journal's Joanna Stern that the company's new Child Safety measures have been "widely misunderstood."

In an interview, Federighi said that Apple wished the measures had come out "a little more clearly," following a wave of controversy and adverse reaction. Federighi told Joanna Stern that "in hindsight", announcing its new CSAM detection system and a new Communication Safety feature for detecting sexually explicit photos at the same time was "a recipe for this kind of confusion."

Federighi says that "it's really clear a lot of messages got jumbled pretty badly" in the wake of the announcement.

On the idea that Apple was scanning people's phones for images, Federighi said "this is not what is happening." He said, "To be clear, we're not actually looking for child pornography on iPhones... what we're doing is finding illegal images of child pornography stored in iCloud." Noting how other cloud providers scan photos in the cloud to detect such images, Federighi said that Apple wanted to be able to detect this without looking at people's photos, doing it in a way that is much more private than anything that has been done before.
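To make that distinction concrete, here is a minimal sketch of what "finding known illegal images without looking at your photos" means in practice: the device compares an image fingerprint against a list of fingerprints of already-known material, rather than analyzing what the photo depicts. This is an illustration only; Apple's actual system uses a perceptual NeuralHash and private set intersection, so the `KnownImageDatabase` type, the SHA-256 stand-in, and the fingerprint set below are all assumptions made for the sketch.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: match an image's fingerprint against a set of
// fingerprints of known images, without classifying the image content itself.
// Apple's real pipeline uses a perceptual NeuralHash plus private set
// intersection; SHA-256 here is just a simple stand-in for the idea.
struct KnownImageDatabase {
    // Hypothetical fingerprints of known CSAM, supplied by child-safety bodies.
    let knownDigests: Set<Data>

    // Returns true only if this exact fingerprint is already in the database.
    // Nothing about unknown, personal photos is learned from this check.
    func matches(imageData: Data) -> Bool {
        let digest = Data(SHA256.hash(data: imageData))
        return knownDigests.contains(digest)
    }
}
```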

Federighi described "a multi-part algorithm" that performs a degree of analysis on-device so that a degree of analysis can be done in the cloud relating to detecting child pornography. He stated that the threshold of images is "something on the order of 30 known child pornographic images," and that only when this threshold is crossed does Apple know anything about your account, and even then only about those images, not any other images. He also reiterated that Apple isn't looking for photos of your child in the bath, or for pornography of any other sort.
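As a rough illustration of the threshold behavior Federighi describes, the gating logic might look like the sketch below. In Apple's actual design the threshold is enforced cryptographically (the server cannot decrypt "safety vouchers" until roughly 30 of them match), so the plain counter and the `MatchThresholdGate` name here are simplifying assumptions, not the real mechanism.

```swift
// Sketch of the ~30-image threshold: nothing about an account is surfaced
// for review until enough known-image matches accumulate, and even then only
// the matching images are in scope. In the real system this gate is enforced
// by threshold cryptography rather than a simple counter.
struct MatchThresholdGate {
    let threshold = 30                  // "something on the order of 30"
    private(set) var matchCount = 0

    mutating func recordMatch() {
        matchCount += 1
    }

    // Below the threshold, Apple learns nothing about the account.
    var accountFlaggedForReview: Bool {
        matchCount >= threshold
    }
}
```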

Pressed about the on-device nature of the feature, Federighi said it was a "profound misunderstanding," and that CSAM scanning is only applied as part of the process of storing something in the cloud, not as processing that runs over the images stored on your phone.

On why now, Federighi said that Apple had finally "figured it out" and had wanted to deploy a solution to the issue for some time, rather than caving to pressure from elsewhere, as some have suggested. Federighi also said CSAM scanning was in "no way a back door" and that he didn't understand that characterization. He noted that if a provider were scanning for images in the cloud, no one could know what it was looking for, whereas Apple's is an on-device database that ships to all devices in all countries regardless of location. And for those not confident that Apple would say no to a government, he assured people that there are multiple levels of auditability such that Apple couldn't get away with trying to scan for something else.

Federighi also explained Communication Safety in Messages, which uses machine learning to detect sexually explicit images in photos received by children. He clarified that the tech is "100% different" from CSAM scanning. He also said Apple was confident in the tech, but that it could make mistakes, although Apple had a hard time coming up with images that fool the system.
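For a sense of how an on-device check like Communication Safety can work without sending photos anywhere, here is a hedged sketch using Apple's Vision framework to run a local Core ML classifier over a received image. The "explicit" label, the confidence threshold, and the model itself are hypothetical; this is not Apple's Messages implementation, just the general shape of on-device classification.

```swift
import CoreGraphics
import CoreML
import Vision

// Sketch: run a local Core ML image classifier (hypothetical model) over a
// received photo and decide whether to flag it. All analysis stays on-device.
func isLikelyExplicit(cgImage: CGImage,
                      model: VNCoreMLModel,
                      confidenceThreshold: Float = 0.9) throws -> Bool {
    var flagged = false
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first
        else { return }
        // "explicit" is an assumed label name for this hypothetical model.
        flagged = (top.identifier == "explicit" && top.confidence >= confidenceThreshold)
    }
    // Vision processes the request synchronously here; nothing is uploaded.
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
    return flagged
}
```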

Apple's new measures have drawn the ire of some privacy advocates and security experts.

Stephen Warwick
News Editor

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.

Before becoming a writer, Stephen studied Ancient History at University and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9

5 Comments
  • It was pretty clear to me that certain people had an agenda and wanted to cloak that as misunderstanding what Apple was doing. Because Apple was pretty clear the whole time about what they were doing, I never had any confusion about it, and I'm no genius.
  • "Federighi said that Apple wanted to be able to detect this without looking at people's photos, doing it in a way that is much more private than anything that has been done before." Total BS, especially when Apple still has to scan on their existing iCloud servers for that child content. Especially since Apple currently supports iCloud web portal via browser. Existing Apple customers can send images to Apples iCloud storage, so Apple still has to scan from their own servers. So its a lie, and a front that Apple is looking to scan on-device to make it more private. If anyone believes that, then I have some really great beach front property at the North Pole for only a $1000..
  • Apple refused to implement FBI-requested backdoors because the government was trying to force them into altering their OS in a way that would make it less secure. This defense clearly worked in court. Now they are adding a feature that can be used to scan all local user files and compare them with any kind of hashes, and all it takes is a simple change in settings. So there's no way this defense could work any more. This new framework opens the door to all kinds of abuse of power and unwarranted mass surveillance. Of course, we are all idiots who “misunderstood” that Apple has pinky sworn to never let governments abuse their new feature.
  • Exactly. Before, they had a semi-decent excuse for the government that they can't access the user's data because *tech-blah-blah-architecture*. Now everyone knows it's not true, and governments CAN DEMAND cooperation (since it's possible). Tomorrow the Russian or Chinese (for example) government can ask to scan cloud photos for faces of opposition leaders and hand back the info of the users who have them. That's it. Apple can't ******** its way out of it. And Apple can't negotiate anything since they want to have business in the country demanding it. And they will demand it soon.
  • The problem isn’t what Apple is scanning for now. It’s what they will be scanning for in the future on behalf of governments, foreign and domestic. Heck, whatever Apple feels like calling “hate” in the future will be fair game. We’ve already seen it. They shut down Parler because the owners didn’t pull violent content. That same content is on Twitter all day, every day.