What you need to know
- Facebook's former head of security has weighed in on Apple's new Child Safety measures.
- Alex Stamos is now a professor of cybersecurity at Stanford.
- He says there are no easy answers in the debate over the policy and urged people to accept that nuance is okay.
Alex Stamos, former head of Facebook security and now a cybersecurity professor at Stanford, says there are no easy answers in the discussion around Apple's Child Safety measures, a series of new features announced last week that have generated some controversy.
Taking to Twitter, Stamos said that whilst he was happy "to see Apple finally take some responsibility for the impacts of their massive communication platform," he was "frustrated" with Apple for "moving the ball forward technically while hurting the overall effort to find policy balance."
Stamos urged people not to minimize the impact of abuse on children, and not to dismiss child safety as a reason for enacting such a policy and implementing new technology. He also criticized language in a leaked internal memo from the NCMEC to Apple employees that described opponents of the scheme as "screeching voices of the minority."
Stamos explained that he would prefer Apple build a robust iMessage reporting system rather than rely on machine learning, and said that Apple's on-device CSAM scanning didn't make sense unless the company was preparing to encrypt iCloud backups.
Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.
Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9.