Source: iMore

What you need to know

  • New internal messages reveal Apple knew it had a problem with child pornography in its ecosystem last year.
  • Anti-fraud chief Eric Friedman told a colleague Apple was "the greatest platform for distributing child porn."
  • He said it was because of Apple's intense focus on privacy.

New internal messages sent by Apple's anti-fraud chief reveal the company knew it had a problem with child porn distribution on its platform early last year.

Messages sent by Eric Friedman, and reported by The Verge, state:

"The spotlight at Facebook etc is all on trust and safety (fake accounts, etc). In privacy they suck. Our priorities are the inverse. Which is why we are the greatest platform for distributing child porn, etc."

The conversation continues, with Friedman sharing a slide from an internal presentation that listed 'child predator grooming' as a known issue on the App Store and within iMessage.

The revelation is pertinent because of Apple's recent attempt to tackle the distribution of CSAM on its platform, and child grooming through messages, with a series of Child Safety measures. Apple plans to scan iCloud Photos using a hashing system to detect whether images uploaded to iCloud match a database of known child sexual abuse material, a move some have criticized as invasive and a breach of user privacy. Apple will also use machine learning within the Messages app to detect when a child receives a sexually explicit image, warning the child that they don't have to open the image if they don't want to, and triggering a notification for a parent in their iCloud Family if the feature is turned on.
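To illustrate the general idea of hash-database matching: Apple's actual system uses a perceptual hash (NeuralHash) with on-device threshold matching and cryptographic blinding, none of which is reproduced here. The sketch below only shows the basic concept of comparing an upload's digest against a set of known hashes, using an exact SHA-256 digest as a stand-in; all names and data are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a database of known-image hashes. A real
# system would store perceptual hashes, not raw-byte digests.
known_hashes = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Return True if the image's SHA-256 digest appears in the known set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

print(matches_database(b"known-image-1"))   # True
print(matches_database(b"holiday-photo"))   # False
```

Note the key simplification: an exact digest changes completely if a single pixel changes, whereas a perceptual hash is designed to match visually similar images even after resizing or recompression.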