Apple employees express privacy concerns with the company's new child protection features

Apple Park photo of the side of the main building (Image credit: Apple)

What you need to know

  • Apple's new child safety features are sparking privacy debates within the company.
  • Many Apple employees are expressing concerns about the technology being misused in the future.

Apple's new child safety features are drawing criticism not only from outside the company but from within it as well.

As reported by Reuters, employees within the company are raising concerns about the new features, which claim to protect children while still providing the privacy that Apple is known for. According to the report, an internal Slack channel dedicated to the new feature has received over 800 messages, many of them expressing concern about how the new features could be abused and expanded.

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

Emma Llanso, a project director at the Center for Democracy and Technology who has pushed back against Apple's new features, says that moving forward with the technology would undo years of privacy promises from the company.

"What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in ... It seems so out of step from everything that they had previously been saying and doing."

Apple's new child safety features span Messages, iCloud Photos, and Siri to detect Child Sexual Abuse Material (CSAM). The company's head of privacy recently sat down for an interview to assure customers that the new technology does not impact the privacy promises the company has made.

First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

The new features are currently planned to launch as part of iOS 15 and iPadOS 15 when they release to the public this fall. If you'd like to learn more about the new features, you can read our FAQ.

Joe Wituschek
Contributor
