What you need to know
- Jane Horvath, Apple's chief privacy officer, spoke on a panel at CES yesterday.
- Horvath said Apple uses image-matching technology to screen uploaded photos for child sexual abuse material.
- Matches are validated by individual review, flagged images are reported, and offending accounts are disabled.
Jane Horvath, Apple's Senior Director of Global Privacy, spoke during a CES panel yesterday and confirmed that Apple scans photos uploaded to iCloud to ensure that they don't contain anything illegal.
The Telegraph reports that Horvath specifically mentioned child abuse, saying that Apple is "utilising some technologies to help screen for child sexual abuse material." In a statement, Apple explained how that screening works:
> Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.
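The "electronic signatures" Apple describes are image fingerprints: a hash is computed from an uploaded photo and checked against a list of hashes for known abusive material, and only a match is escalated to review. Here is a minimal Swift sketch of that matching flow; the function names and the sample signature list are hypothetical, and a production system would use a perceptual hash that survives resizing and re-encoding rather than the plain cryptographic hash used here for brevity.

```swift
import Foundation
import CryptoKit

// Hypothetical list of signatures for known-bad images. In practice these
// would be perceptual hashes supplied by a clearinghouse, not SHA-256
// digests. The entry below is simply SHA-256("test").
let knownSignatures: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

/// Computes a hex-encoded signature for raw image bytes.
/// Stand-in for a perceptual hash; SHA-256 is used only for illustration.
func signature(for imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if the upload matches a known signature. A real pipeline
/// would queue the match for individual human review, as Apple describes,
/// rather than acting on the automated match alone.
func shouldFlagForReview(_ imageData: Data) -> Bool {
    knownSignatures.contains(signature(for: imageData))
}

let upload = Data("test".utf8)       // stand-in for uploaded photo bytes
print(shouldFlagForReview(upload))   // prints "true": matches the sample entry
```

Because a cryptographic hash changes completely if even one pixel changes, real-world systems rely on perceptual hashing instead, which is what lets technologies like PhotoDNA catch cropped or re-encoded copies of the same image.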
While there will surely be people who take issue with this, Apple isn't the first company to scan images this way. Many companies use PhotoDNA, Microsoft's image-hashing technology, which was designed specifically to help prevent child exploitation.
"By working collaboratively with industry and sharing PhotoDNA technology, we're continuing the fight to help protect children." – Courtney Gregoire, Assistant General Counsel, Microsoft Digital Crimes Unit
Now that's surely something we can all agree with.