Australia demands Apple report on measures to tackle child sexual exploitation material
Meta, Microsoft, and Snap are also in the firing line.
Australia's eSafety Commissioner has issued legal notices to Apple, Meta, Microsoft, and more demanding the companies report on the measures they are taking "to tackle the proliferation of child sexual exploitation material on their platforms and services."
In a press release Tuesday, the body stated that it had sent the notices "to some of the biggest tech companies in the world," requiring that they report on steps taken to stop online child sexual exploitation.
The notices were issued under Australia's Basic Online Safety Expectations, which reportedly set out "the minimum safety requirements expected of tech companies who wish to operate in Australia and the steps they should take to protect Australian users from harm."
Child protection
In a statement, Australia's eSafety Commissioner Julie Inman Grant said "the Basic Online Safety Expectations are a world-leading tool designed to encourage fundamental online safety practices and drive transparency and accountability from tech companies. They will help us ‘lift the hood’ on what companies are doing - and are not doing - to protect their users from harm."
She warned against the unchecked spread of "this horrific material" as companies adopt encrypted messaging and deploy live streaming.
Companies like Apple must respond within 28 days or face fines of up to $555,000 a day.
Apple has already tried to roll out some child protection measures on its platforms, with varying success.
Child protection measures announced last year caused controversy because of Apple's plan to scan iCloud Photos for hashes of known child sexual abuse material (CSAM). Apple also introduced a much less intrusive Communication Safety feature in Messages that can detect when children may have sent or received sexually explicit images, blurring the photo and warning the child about its possible content.
Apple delayed the rollout of CSAM detection after backlash later that year and has since remained silent on the issue, even removing mentions of the plan from its website.