What you need to know
- Apple's new child safety features are sparking privacy debates within the company.
- Many Apple employees worry the technology could be misused in the future.
Apple's new child safety features are drawing criticism not only from outside the company but internally as well.
As reported by Reuters, employees within the company are raising concerns about the new features, which Apple says protect children while still providing the privacy the company is known for. According to the report, an internal Slack channel dedicated to the new features has received over 800 messages, many of them expressing concern about how the features could be abused and expanded.
> Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.
>
> Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.
Emma Llanso, a project director at the Center for Democracy and Technology who has pushed back against Apple's new features, says that moving forward with the technology would undo years of privacy promises from the company.
> "What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in ... It seems so out of step from everything that they had previously been saying and doing."
Apple's new child safety features span Messages, iCloud Photos, and Siri to detect Child Sexual Abuse Material (CSAM). The company's head of privacy recently sat down for an interview to assure customers that the new technology does not affect the privacy promises the company has made.
First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.
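At its core, the CSAM-detection step Apple describes amounts to comparing a fingerprint of a photo against a database of fingerprints of known abusive material. The toy sketch below illustrates only that general hash-matching idea using an ordinary cryptographic hash; it is not Apple's actual system, which uses a perceptual hash ("NeuralHash") and a private set intersection protocol so that matching happens without either side revealing its data, and the `KNOWN_HASHES` set here is purely hypothetical.

```python
import hashlib

# Hypothetical database of digests of known flagged content.
# In Apple's real system this would be perceptual hashes supplied
# by child-safety organizations, blinded so the device cannot read them.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-example").hexdigest(),
}

def matches_known_content(image_bytes: bytes) -> bool:
    """Return True if the content's digest appears in the known set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_content(b"known-bad-example"))  # True
print(matches_known_content(b"an ordinary photo"))  # False
```

Note that an exact hash like SHA-256, as used above, only matches byte-identical files; a perceptual hash is designed to also match re-encoded or lightly edited copies, which is precisely the property critics worry could be repurposed to scan for other kinds of content.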