Apple removes mention of controversial CSAM feature from webpage

CSAM Siri and Search (Image credit: Apple)

What you need to know

  • Apple has removed mention of its CSAM feature from its Child Safety webpage.
  • The company says that the feature has merely been delayed rather than canceled.

Apple's controversial CSAM feature has disappeared from the company's webpage.

As spotted by MacRumors, Apple has removed mention of its controversial CSAM feature from its Child Safety webpage.

Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.

As reported by The Verge, Apple spokesperson Shane Bauer said the company has not abandoned the feature at all but is taking more time to collect feedback and make improvements before releasing it. Bauer added that the company's position hasn't changed since September, when it first announced it would be delaying the launch of CSAM detection. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the company's September statement read. Bauer did not say when anyone should expect the feature to be released.

Apple originally announced the feature back in August and received significant backlash over security and privacy concerns. Despite the company's efforts to further clarify how the technology worked and how it would be used, the backlash persisted, leading the company to delay the feature indefinitely.

It seems that Apple is planning to take a second run at the feature in the future, but is wiping away its first attempt before doing so.

Joe Wituschek

Joe Wituschek is a Contributor at iMore. With over ten years in the technology industry, one of which was spent at Apple, Joe now covers the company for the website. In addition to covering breaking news, Joe also writes editorials and reviews for a range of products. He fell in love with Apple products when he got an iPod nano for Christmas almost twenty years ago. Despite being considered a "heavy" user, he has always preferred consumer-focused products like the MacBook Air, iPad mini, and iPhone 13 mini. He will fight to the death to keep a mini iPhone in the lineup. In his free time, Joe enjoys video games, movies, photography, running, and basically everything outdoors.

1 Comment
  • It's obvious that Apple doesn't have to mention CSAM scanning anymore, especially since iOS 15.2 and above now come with nudity detection software built in. I know Apple has said that the new nudity detection in 15.2 will only be used on kids' iPhones and iPads, that it will only check messages and photos sent to those kids in iMessage (or messages in general) for nudity, that parents have to enable it for their kids, and that those parents will then get a notification as well. Now think about it for a moment. This detection works on any type of nude image, whether from males, females, or even kids, so it can still do something similar to what the CSAM feature was doing. You still have to ask: what's to stop Apple from enabling this on customers who are not kids? Apple could technically enable this software for almost any customer it wants to check on. And since the nudity detection software will be on every device running iOS 15.2 and above, a simple tiny update from here on out could turn it on for ALL iPhones. Also remember who made and controls this software: Apple. People should be up in arms about this update. Apple got its wish of putting CSAM scanning locally on board every iPhone, only it's really CSAM detection in a different form. Just four months ago people were mad at Apple for adding CSAM scanning for every iPhone user, but because Apple has worded it differently and said it will only be used by parents for their kids, this nudity detection software will now be on every iPhone, and Apple could at a later date enable it for any customer it wants, with the notifications going directly to Apple instead of to parents. Privacy is just a word that Apple loves to throw around, but that word means nothing now.