What you need to know
- Bill Maher has blasted Apple's plans to scan iCloud Photos for Child Sexual Abuse Material.
- He said the move was a blatant constitutional breach.
- He also claimed Apple should admit that the problem with iPhones is that they turn people into assholes.
Bill Maher has blasted Apple's plans to scan iCloud Photos for Child Sexual Abuse Material, saying the company should instead admit that its phones turn people into assholes.
Maher's strong comments, reported by Deadline, came during the New Rules segment of Real Time:
He said that Apple "nosing through everybody's private photo stash" was casting an "awfully wide, intrusive net", labeling the measures a "blatant constitutional breach". He went on to say phones should be private like wallets or purses before asking "What about probable cause? What about the 4th Amendment?"
As the report notes, Maher's comments then took a bit of a turn:
Maher lamented that phones "make people live fake lives" where "It's more important to get a picture of you having a good time than actually having a good time." He also claimed phones "make people bullies. Angrier. More vitriolic. More racist online than they would ever dream of being if they had to say those things to someone's face. The phone made us passive-aggressive to our friends and hyper-aggressive to total strangers."
Apple's Child Safety measures have drawn criticism from some privacy advocates and security experts. Since the announcement, Apple has fervently defended itself, with software chief Craig Federighi admitting that the company wished its message had come across more clearly.
Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.
Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9.
At the end of the day, Apple isn't only going to perform on-device scans of its customers' images. It still has to scan for CSAM on its own iCloud servers, because customers can use a browser to upload images, photos, or any other data to their iCloud accounts, bypassing the device check entirely. So you have to ask yourself why scan on-device at all. Well, Apple did say that this on-device scanning can evolve over time, so it could start scanning users' Safari browser images down the road; if you really want to stop predators from using Apple products, you have to hunt for them. If they know Apple only scans images sent to iCloud storage, they can simply not save anything and browse to their heart's content in their iPhone's Safari browser. Right? So I can definitely see Apple scanning for a lot more things down the road, and maybe targeting only certain devices or people as well. Apple could even start scanning a user's device while they're sleeping. Apple has taken a turn for the worse, IMHO.
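The commenter's point, that a web upload which skips the device can still be caught on the server, comes down to the fact that the same known-image check can run at either end. A minimal sketch of that idea, using a plain SHA-256 digest as a stand-in (Apple's actual system uses a perceptual hash, NeuralHash, plus private set intersection; the blocklist entries here are hypothetical):

```python
import hashlib

# Hypothetical blocklist of digests of known flagged images.
# A cryptographic hash stands in for a perceptual hash purely
# for illustration; it only matches byte-identical files.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the blocklist.

    The identical check can run on-device before upload or
    server-side after upload, which is why skipping the device
    path (e.g. a browser upload) doesn't skip the server check.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

Run at upload time on the device or at ingest time on the server, the function is the same; only where it executes differs.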