Former Facebook security head says 'no easy answers' on Apple Child Safety

Facebook (Image credit: iMore)

What you need to know

  • Facebook's former head of security has weighed in on Apple's new Child Safety measures.
  • Alex Stamos is now a cybersecurity professor at Stanford.
  • He says there are no easy answers in the debate over the policy and stressed that nuanced opinions are okay.

Alex Stamos, former head of security at Facebook and now a cybersecurity professor at Stanford, says there are no easy answers in the discussion around Apple's Child Safety measures, a series of new features announced last week that have generated some controversy.

Taking to Twitter, Stamos said:

In my opinion, there are no easy answers here. I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies. Nuanced opinions are ok on this.

Stamos, whilst happy "to see Apple finally take some responsibility for the impacts of their massive communication platform," said he was "frustrated" with Apple for "moving the ball forward technically while hurting the overall effort to find policy balance."

Stamos urged people not to minimize the impact of abuse on children, and not to dismiss child safety as a reason for enacting such a policy and implementing new technology. He also criticized language in a leaked internal memo from the NCMEC to Apple employees that described opponents of the scheme as "screeching voices of the minority."

Stamos explained that he would prefer Apple build a robust iMessage reporting system rather than rely on machine learning, and said that Apple's on-device CSAM scanning didn't make sense unless the company was preparing to encrypt iCloud backups. On CSAM scanning he added:

In any case, coming out of the gate with non-consensual scanning of local photos, and creating client-side ML that won't provide a lot of real harm prevention, means that Apple might have just poisoned the well against any use of client-side classifiers to protect users.

Stephen Warwick
News Editor

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.

Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9
