
Priti Patel says Apple should see through CSAM photo scanning measures

iCloud: Everything you need to know! (Image credit: iMore)

What you need to know

  • Apple recently pumped the brakes on its Child Safety Measures.
  • UK Home Secretary Priti Patel has called on the company to see out the project.
  • It comes as the UK launches a new Safety Tech Challenge Fund.

UK Home Secretary Priti Patel has called on Apple to see through its Child Safety project, including a measure that would scan iCloud content for known Child Sexual Abuse Material.

It comes as the Home Office launched a new Safety Tech Challenge Fund. Patel stated:

It is utterly appalling to know that the sexual abuse of children is incited, organised, and celebrated online. Child abusers share photos and videos of their abhorrent crimes, as well as luring children they find online into sending indecent images of themselves. It is devastating for those it hurts and happens on a vast and growing scale. Last year, global technology companies identified and reported 21 million instances of child sexual abuse.

The Home Secretary says it is a misconception that abuse like this takes place in the "dark corners of the web", and said that end-to-end encryption presented a massive challenge for public safety:

End-to-end encrypted messaging presents a big challenge to public safety, and this is not just a matter for governments and law enforcement. Social media companies need to understand they share responsibility for keeping people safe. They cannot be passive or indifferent about what their products enable or how they might inadvertently blind themselves and law enforcement from protecting children with end-to-end encryption.

Patel also praised Apple's recent CSAM measures, which the company has put on hold following feedback from experts and privacy advocates. She called on the company "to see through that project", stating the tech's 1 in a trillion false positive rate meant "the privacy of legitimate users is protected whilst those building huge collections of extreme child sexual abuse material are caught out."

Apple's CSAM scanning measures have drawn criticism from some because the matching takes place on-device, which some people consider intrusive. As noted, Apple has now put the plans on hold.

Stephen Warwick

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.

Before becoming a writer Stephen studied Ancient History at University and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple.

5 Comments
  • I don't care if false positives are 1 in 10 quadrillion. Scanning through anyone's photos is a violation of privacy. Period. Apple can't market themselves on their privacy if they literally scan every one of your photos.
  • Comparing the hash of your photos with known inappropriate material, and only when you upload them, is significantly different than scanning/looking through all your photos. It's like red light cameras that look at everyone, but only take pictures when you run the red light. Probably don't like those either, though. Every cloud storage service does some sort of CSAM analysis BTW. Most are required to, or do it because of the liability of not doing so.
  • I think more of the issue is not the cloud scanning, but the on-device scanning. I feel there has to be a better way to keep the predators from saving and sharing CSAM, apple just seemed to bungle their roll out from the start.
  • Is scanning through and recording all the places someone has traveled a violation of privacy? Is scanning through the list of people you know in your contacts and recording that a violation of privacy? Is scanning through your phone log for who you speak to most frequently and recording that a violation of privacy? As iMore pointed out very correctly, what is recorded virtually via certain smartphones would not be ok in the physical world. And I bet some of those who were loudest about being against Apple's CSAM hash comparing on phone were literally typing that stance against Apple's policy on an Android phone, a device that records "on phone" what is listed above and quite a bit more.
  • Patel is too much a politician. She strongly feels it is needed? Then the time to make that position loudly known was when the battle was raging. Instead she speaks up after the battle is over (at least for now). There is a legitimate claim regarding privacy, and it has less to do with the specific process Apple was going to employ. That process was actually good for privacy compared to cloud scanning. Where the legitimate claim exists is the slippery slope. What next is going to be done on phone? Apple would definitely be pressured by governments and orgs to scan for other things far outside the scope of this specific plan. No one can really believe that wouldn't happen.
    Then Apple made the problem worse by rolling it out so poorly, not understanding that the average user doesn't know what a hash is. All they hear is "on phone scanning". They don't know that a hash is a fixed-length numeric value derived from any file, pic or not. Apple should have first, right out of the gate, shown a simple graphic of what a hash is. A super simple pic of a file generating a number. The number is compared with a number from anti-CSAM orgs. Apple is a far more privacy-focused company, yet they were tone deaf on the rollout.
    I believe I read on iMore the suggestion that Apple move it off the phone to an added server-side layer. I think this will have to be done. Little chance hash comparisons on phone will happen now.
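To illustrate the hash-comparison idea the commenters above describe, here is a minimal, simplified sketch in Python. Note the assumptions: this uses an ordinary cryptographic hash (SHA-256) and a made-up set of known digests for demonstration, whereas Apple's actual system used a perceptual hash (NeuralHash) and a privacy-preserving matching protocol, not plain digest lookups.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return a fixed-length hex digest for arbitrary file bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of known-bad hashes; in practice these would be
# supplied by child-safety organizations, never the images themselves.
known_hashes = {file_hash(b"example-known-file")}

def matches_known(data: bytes) -> bool:
    # Only the digest is compared against the database; the photo's
    # actual content is never inspected or viewed.
    return file_hash(data) in known_hashes

print(matches_known(b"example-known-file"))        # a known file matches
print(matches_known(b"an ordinary holiday photo"))  # anything else does not
```

The key point the commenters make is visible here: the system answers only "is this exact known item?" rather than "what is in this photo?", which is why a cryptographic-style match produces no information about non-matching images.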