
Apple delays controversial child safety features following feedback

iCloud: Everything you need to know! (Image credit: iMore)

What you need to know

  • Apple announced controversial new child safety measures earlier this year.
  • Following feedback, the company has decided to delay the move.
  • It says it will collect feedback and improve the measures over the coming months.

Apple has today announced that it is delaying its controversial Child Safety measures following feedback from various groups.

In a statement, the company said:

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Apple recently announced measures it hoped would make its platforms safer for children. However, its plans to scan iCloud Photos for child sexual abuse material (CSAM) and a machine-learning feature that could detect explicit images in messages sent to children were met with strong resistance.

The company was forced to publish a series of clarifications, updates, and FAQs, and software chief Craig Federighi even admitted in an interview that the company wished its messaging had been clearer.

Apple's plans were criticized by privacy advocates including Edward Snowden, and an open letter calling on Apple to halt the plans garnered thousands of signatures. The company had previously indicated that it had built time into its release schedule to answer questions and clarify the measures, and that it was not planning to delay the features. Today's statement confirms the U-turn.

Stephen Warwick

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.

Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple.

8 Comments
  • Good guy, Tim Apple.
  • Let's be honest here: if Apple were to continue with this on-device scanning, it would NEVER be able to use the word privacy again. That's all it would be, just a word, 'privacy', that means nothing when referring to anything Apple.
  • Hey there. We noticed you bought an Android phone. Be an awful shame if we decided you were a pedophile, wouldn’t it?
  • They already lost me as a customer. I traded in my iPhone 12 Pro Max for an S21 Ultra. I don't want to do business with a company that treats every last customer as a potential child sexual predator.
  • Yeah, I won't bank at institutions that have a security guard. How dare they assume everyone coming in is a crook. Won't drive in cities with red light cameras either.
  • FINALLY. Apple backs down from this incredibly stupid idea. Hopefully, whoever the fool was who came up with this idiotic plan is now seeking employment in the fast food and/or hospitality industries. LOL, “controversial”. Universal “WTH are they thinking?” is now “controversial”? Are you dizzy from spinning so fast? This was an astonishingly stupid idea. That Apple (1) announced this and then (2) complained about a “screeching minority” and then (3) took weeks to backtrack on this is just mind boggling. Seriously. Whose idea was this? Is this person STILL employed by Apple?
  • I was amazed how, out of all the issues/"gates" Apple has had, this was the one that literally united everyone for a cause. I get the "for the kids" aspect, but nothing about this seemed like a good idea, and I am glad Apple opened its eyes and listened to the many who voiced concerns (especially the privacy groups).
  • I don't really see the issue... beyond the potential. Every service, Google, MS, Facebook, etc., is already scanning for CSAM on its cloud services. They are required to. They use similar hash technology, AI, and human review of your stuff. This is a purely electronic method of detecting images from the database on the device, before anything gets 'shared' (a rough sketch of the hash-matching idea follows below). The fact that it runs on your device isn't even surfaced (though maybe it should be; that's a different legal question).
    The 'potential' seems to be that the CSAM DB could be replaced by some other DB at the request of some authority. You could do that already with cloud storage. It is just as incumbent on the providers to resist that as it would be to resist a broadened DB; this doesn't make it any more or less likely. The stuff you have on your phone is still exclusively your stuff, until you decide to do something with it that you shouldn't.
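
For readers wondering what this kind of on-device hash matching looks like, here is a minimal, illustrative sketch in Swift. Apple's actual system uses NeuralHash, a perceptual hash designed to survive resizing and re-encoding, combined with threshold secret sharing; the SHA-256 digest, the matchesKnownDatabase function, and the database entry below are hypothetical stand-ins used only to keep the example self-contained.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device hash database. The real system
// ships blinded NeuralHash values derived from known CSAM; this entry is
// a made-up SHA-256 digest purely for illustration.
let knownHashes: Set<String> = [
    "d7a8fbb307d7809469ca9abcb0082e4f8d5651e46d3cdb762d02d0bf37c9e592"
]

// Hashes an image's raw bytes and checks the result against the database.
// A cryptographic hash like SHA-256 only matches byte-identical files;
// a perceptual hash like NeuralHash also matches visually similar images.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: check a photo before it is uploaded to cloud storage.
if let data = FileManager.default.contents(atPath: "/tmp/photo.jpg"),
   matchesKnownDatabase(data) {
    // In Apple's design, a match would not be reported directly: an
    // encrypted "safety voucher" is attached to the upload, and matches
    // become decryptable server-side only past a threshold count.
    print("Hash matched known database")
}
```

Either way, the matching step itself is the same whether it runs in the cloud or on the device; the debate above is about which database is consulted and who controls it.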