Apple Privacy Day privacy logo. Source: Apple

What you need to know

  • Erik Neuenschwander, Apple's head of privacy, sat down for a new interview.
  • The executive talked about the company's new features that scan for Child Sexual Abuse Material (CSAM).

In a new interview with TechCrunch, Erik Neuenschwander, Apple's head of privacy, sat down to talk about the company's new child protection features that scan for Child Sexual Abuse Material (CSAM).

Apple had announced the new features, which scan iMessage, Siri, and iCloud Photos for collections of CSAM images, to some concern about what they would mean for user privacy. In the new interview, Neuenschwander attempts to dispel misconceptions about what the technology does and does not do.

When asked why Apple was introducing its new CSAM-scanning features now, Neuenschwander said that new technology finally allows the company to balance child safety with user privacy.

"Why now comes down to the fact that we've now got the technology that can balance strong child safety and user privacy. This is an area we've been looking at for some time, including current state of the art techniques which mostly involves scanning through entire contents of users' libraries on cloud services that — as you point out — isn't something that we've ever done; to look through users' iCloud Photos. This system doesn't change that either, it neither looks through data on the device, nor does it look through all photos in iCloud Photos. Instead what it does is gives us a new ability to identify accounts which are starting collections of known CSAM."
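Apple hasn't detailed the implementation in this interview, but the approach Neuenschwander describes (flagging only accounts that accumulate matches against known CSAM, rather than inspecting all photo content) is conceptually similar to comparing image fingerprints against a list of known hashes and acting only once a threshold is crossed. A heavily simplified, hypothetical sketch follows; the hash function, threshold, and helper names are illustrative assumptions, not Apple's actual system, which uses perceptual hashing and cryptographic safeguards:

```python
# Hypothetical sketch: flag an account only after it accumulates
# multiple matches against fingerprints of known flagged images.
# NOT Apple's algorithm; a real system would use a perceptual hash
# (robust to resizing/cropping) and cryptographic protocols,
# not a plain SHA-256 over raw bytes.
import hashlib

# Fingerprints of known flagged images (illustrative placeholders)
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}
MATCH_THRESHOLD = 2  # act only once a "collection" is forming

def fingerprint(image_bytes: bytes) -> str:
    """Compute an image fingerprint (an exact hash here for simplicity)."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(library: list[bytes]) -> int:
    """Count matches against known fingerprints; nothing is learned
    or recorded about images that do not match."""
    return sum(1 for img in library if fingerprint(img) in KNOWN_FINGERPRINTS)

def should_flag(library: list[bytes]) -> bool:
    """Flag only when the match count crosses the threshold."""
    return count_matches(library) >= MATCH_THRESHOLD
```

The key property this sketch illustrates is the one Neuenschwander emphasizes: non-matching photos contribute nothing, and no action is taken until a threshold of known-image matches is reached.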

Apple Child Safety. Source: Apple

TechCrunch also asked if implementing such technology opened the door for outside agencies, like law enforcement, to ask Apple to scan for other things outside of CSAM. Neuenschwander pushed back on this, saying that the technology does not impede the security of the device or change Apple's stance on privacy when it comes to law enforcement or government interference.

"It doesn't change that one iota. The device is still encrypted, we still don't hold the key, and the system is designed to function on on-device data. What we've designed has a device-side component — and it has the device-side component by the way, for privacy improvements. The alternative of just processing by going through and trying to evaluate users data on a server is actually more amenable to changes [without user knowledge], and less protective of user privacy."

You can read the full interview at TechCrunch. If you want to learn more about Apple's new child safety protection features, check out our FAQ.