CSAM, Siri and Search (Source: Apple)

What you need to know

  • Apple has removed mention of its CSAM feature from its Child Safety webpage.
  • The company says that the feature has merely been delayed rather than canceled.

Apple's controversial CSAM feature has disappeared from the company's webpage.

As spotted by MacRumors, Apple has quietly removed all mentions of its controversial CSAM feature from its Child Safety webpage, suggesting its plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.

As reported by The Verge, Apple spokesperson Shane Bauer said that the company has not abandoned the feature but is taking more time to collect feedback and make improvements before releasing it. Bauer did not say when anyone should expect the feature to ship.

Bauer added that the company's position hasn't changed since September, when it first announced it would be delaying the launch of CSAM detection. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the company's September statement read.

Apple originally announced the feature back in August and received significant backlash over security and privacy concerns. Despite the company's efforts to further clarify how the technology worked and how it would be used, the criticism did not subside, leading the company to delay the feature indefinitely.

It seems that Apple is planning to take another run at the feature in the future, but is wiping away its first attempt before doing so.