iCloud users say they're downgrading because of CSAM scanning

iCloud: Everything you need to know! (Image credit: iMore)

What you need to know

  • Apple recently announced new measures that will scan for Child Sexual Abuse Material (CSAM) in iCloud Photos on-device.
  • Users have taken to Reddit to express their displeasure at the move.
  • Multiple users indicate they will downgrade their iCloud storage as a result.

Some users of Apple's iCloud platform say they are going to downgrade their plans and stop using iCloud Photos in response to recently announced Child Safety measures that include scanning iCloud photos for known CSAM images.

A Reddit discussion thread with nearly 500 upvotes and over 800 comments, started on Tuesday, asks: "Is anybody downgrading their iCloud account in light of the recent news regarding hashing people's photos?"

User JonathanJK said they had spent two hours "going through my settings, deleting emails and photos to create an offline back up work flow," and realized along the way that the settings were tedious and time-consuming, that lots of their data was going to iCloud unnecessarily, that they could get by on the free 5GB tier, and that the "cleansing itself is good for the soul."

Multiple users responded in kind:

"I have to do some planning first, but I will."

"Same here. Thinking about setting up NAS storage for photos."

"Yeah downgraded to free. If they're going to f**k with my data, then I'll store it myself."

As expected, the thread sparked a huge debate in the comments. Apple announced new Child Safety measures last week, including plans to scan for Child Sexual Abuse Material in photos uploaded to iCloud, using on-device hashing to match images against a database of known CSAM content provided by the National Center for Missing & Exploited Children (NCMEC) and other organizations. Some privacy advocates have raised eyebrows at the move, which continues to be a hotbed of discussion online. Apple's CSAM scanning does not apply to photos that aren't uploaded to iCloud, so users can effectively opt out by disabling iCloud Photos.
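For readers wondering what "on-device hashing against a database of known images" means in practice, here is a deliberately simplified Swift sketch of the general idea. It is not Apple's implementation: Apple's system is described as using a perceptual "NeuralHash" and cryptographic matching techniques rather than a plain cryptographic hash lookup, and the database entry below is purely illustrative.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for an on-device database of hashes of known images
// supplied by child-safety organizations. The single entry here is just the
// SHA-256 of empty data, used so the example can run end to end.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Returns true if a photo's hash appears in the known-hash database.
/// (A perceptual hash would also match near-duplicates; a plain SHA-256
/// only matches byte-identical files.)
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: empty data hashes to the well-known value above.
print(matchesKnownDatabase(Data()))  // prints "true"
```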

Stephen Warwick
News Editor

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers the latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.

Before becoming a writer, Stephen studied Ancient History at university and worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest breaking Apple news and features fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9.

1 Comment
  • You really have to ask yourself why Apple wants to scan on-device for this illegal content in the first place, especially since Apple users can still send images to their iCloud accounts via a web browser. So at the end of the day, Apple still has to scan for this illegal content on its servers no matter what it does on the client side. Privacy now means nothing; it's only a word that Apple likes to use.