
What you need to know

  • Facebook's former head of security has weighed in on Apple's new Child Safety measures.
  • Alex Stamos is now a professor of cybersecurity at Stanford.
  • He says there are no easy answers in the debate over the policy and stressed that nuanced opinions are okay.

Alex Stamos, former head of security at Facebook and now a cybersecurity professor at Stanford, says there are no easy answers to the discussion around Apple's Child Safety features, a series of new measures announced last week that have generated some controversy.

Taking to Twitter, Stamos said:

In my opinion, there are no easy answers here. I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies.

Nuanced opinions are ok on this.

Stamos, while happy "to see Apple finally take some responsibility for the impacts of their massive communication platform," said he was "frustrated" with Apple for "moving the ball forward technically while hurting the overall effort to find policy balance."

Stamos urged people not to minimize the impact of abuse on children, and not to dismiss child safety as a reason for enacting such a policy and implementing new technology. He also criticized language in a leaked internal memo from the NCMEC to Apple employees that described opponents of the scheme as "screeching voices of the minority."

Stamos explained that he would prefer Apple build a robust iMessage reporting system rather than rely on machine learning, and said that Apple's on-device CSAM scanning didn't make sense unless the company was preparing to encrypt iCloud backups. On CSAM scanning he added:

In any case, coming out of the gate with non-consensual scanning of local photos, and creating client-side ML that won't provide a lot of real harm prevention, means that Apple might have just poisoned the well against any use of client-side classifiers to protect users.