March 14, 2019: Apple's new ad is all about the realities of virtual privacy
I love this ad. As I've said numerous times, it's easy to get a sense of how much we're paying for something when it comes to money — we see it leaving our wallets and our accounts. Even with time, we can see the clock ticking down. But no giant internet company is forced to show us how it vacuums up our private photos, personal messages, location, and activity. So, paying with data feels cheap. It feels free.
That's why I kinda love this new ad from Apple, which takes the real-world privacy measures we all understand and practice every day and places them in the context of our digital lives.
It may not convince everyone. We've gotten so used to being data rich and mortgaging our privacy to pay for this or that "free" service. But given everything that's been happening in the world, I think it's going to resonate with more and more people.
Security and the need for it online are now generally understood. Getting there was ugly, full of malware and malfeasance, but we got there. Now, it's all about privacy.
More ads like this, please.
January 13, 2019: Tim Cook: That's my information that you're selling, and I didn't consent
Apple's CEO, Tim Cook, in an essay on privacy, published today in Time magazine:
In 2019, it's time to stand up for the right to privacy—yours, mine, all of ours. Consumers shouldn't have to tolerate another year of companies irresponsibly amassing huge user profiles, data breaches that seem out of control and the vanishing ability to control our own digital lives.
This problem is solvable—it isn't too big, too challenging or too late. Innovation, breakthrough ideas, and great features can go hand in hand with user privacy—and they must. Realizing technology's potential depends on it.
Read the whole thing.
Tim Cook delivered the keynote speech at this year's International Conference of Data Protection and Privacy Commissioners, on Wednesday 24 October 2018. It's significant because Apple, as a matter of company policy, believes privacy is a fundamental human right. From Tim Cook at the very top to engineers on the front line, this belief permeates Apple and drives the company's product development process every bit as much as the technology itself. As much as Apple is designing for experience and for accessibility, the company is also designing for security and privacy.
Apple's belief in privacy is made manifest again today with the launch of an updated version of apple.com/privacy.
I had the chance to talk with Apple ahead of the new site going live, and once again I came away impressed not just by how much privacy matters to the company as an ideal, but by the depth and thoughtfulness with which privacy and security have been built into the design and development of every new and updated product from the very beginning.
New features aren't simply created and then handed off to a "privacy and security team" mandated to shellac a thin veneer of legal ass-coverage on top. Privacy and security are intrinsic parts of the product, from hardware to software, on-device and through Apple's servers.
Why Apple's stance on privacy matters
Because of their business models, Google, Facebook, and similar companies build and retain complex profiles on us, including our behavior and our relationships.
They claim not to share or sell that data, but through their advertising systems, they share and sell insight into it and into us. We can quibble about how much insight and what patterns can be derived from it, but when you see an ad for a product you'd previously searched for, or you see your photo used in an ad, you feel exposed.
What's more, the simple act of keeping all that information, whether for themselves or for their advertising customers, creates a vulnerability. The potential for abuse, no matter how unlikely, is staggering:
- "Which way are you going to vote? If you thought information about your web history would be made public, now which way would you vote?"
- "You are transacting business in our country. Your servers will be made available to us. You will show us all interactions between individuals on the following lists…"
- "Hey, let's see what your ex is up to…"
The examples above are extreme, and safeguards no doubt exist to try to prevent such exploits. But this isn't FUD. Google has stolen Wi-Fi data in the past. It has exposed the location of victims to abusers. Uber has tracked people without their knowledge. Government agency contractors have snooped outside the bounds of law and morality. This is a real, valid concern.
The only way to absolutely prevent the abuse and disclosure of personal data is not to store that data. Google, Facebook, and other large data harvesting companies simply can't do that. But Apple can.
Apple, because of its own business model, has no need to persist our data, our behavior, and our relationships on its servers. What's more, by virtue of the company's belief in privacy and security, it wants no part of our data. Instead, it collects no data unless it absolutely has to, collects the minimum amount possible when it does, anonymizes that data and doesn't associate it with user accounts unless it absolutely has to, encrypts the data end-to-end during any and all transmissions, and keeps it only as long as it absolutely has to.
Take features like Siri and Faces (in Photos) sync. For years, Apple stuck to on-device data because, by definition, it is more private and secure. But it's not more convenient, especially when you have to retrain Siri or re-identify photos every time you switch devices. So, now, Apple is doing secure, private sync between devices.
It's different from typical sync, where the single truth lives on the server and all your devices sync to that truth. That leaves the truth exposed and vulnerable on the server. Apple's implementation encrypts the data end-to-end and then transmits it via a secure version of CloudKit, the infrastructure behind iCloud.
Apple doesn't store anything and doesn't even "see" anything aside from the encrypted blobs going through the system. Only devices you've set up with your Apple ID have the keys to decrypt and make use of the data.
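To make the idea concrete, here's a toy sketch of why an end-to-end encrypted relay keeps the server blind: the server only ever handles opaque bytes, and only devices holding the shared key can recover the plaintext. This is illustrative only — the cipher below is a throwaway XOR-keystream built on SHA-256, not real cryptography (a production system would use a vetted AEAD primitive), and none of the names here are Apple APIs.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustrative only -- real systems use vetted ciphers like AES-GCM."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        stream.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

device_key = secrets.token_bytes(32)  # shared only among the user's own devices
nonce = secrets.token_bytes(16)
plaintext = b"face-match data for Photos sync"

# What the server relays: an unreadable blob plus its nonce.
blob = keystream_xor(device_key, nonce, plaintext)
assert blob != plaintext  # the server cannot read the contents

# A second trusted device holding the same key recovers the data.
assert keystream_xor(device_key, nonce, blob) == plaintext
```

Because the same operation encrypts and decrypts, any device provisioned with the key can participate in sync, while the relay in the middle learns nothing beyond the blob's size and timing.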
The services that Apple offers are also in silos, so data — and potentially identifying patterns — can't pass between them. It's part of what lets Apple scale technologies like differential privacy from detecting trends in the QuickType keyboard to detecting trends in bad websites on Safari and data types in Health.
Best of all, by continuing to publish privacy and security policies and white papers, and by rolling out technologies like differential privacy, which continues to improve with better algorithms and implementations, and neural networks for Face ID, Apple invites scrutiny. The more people examine, probe, push back, and find bugs, the more Apple has to live up to and improve. Standards are meaningless if you're not continually held to them.
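Differential privacy is a family of techniques; the simplest classic member, randomized response, gives a flavor of how a company can learn aggregate trends without being able to trust, or learn, any individual's true answer. This sketch is purely illustrative — Apple's actual implementation, described in its white papers, is far more sophisticated.

```python
import random

def randomized_response(truth: bool, rng: random.Random) -> bool:
    """Each user flips a coin: heads, answer truthfully;
    tails, flip again and report that random answer instead.
    Any single response is plausibly deniable."""
    if rng.random() < 0.5:
        return truth
    return rng.random() < 0.5

def estimate_true_rate(responses: list) -> float:
    """Invert the noise in aggregate: observed = 0.5*true + 0.25,
    so true = 2*observed - 0.5."""
    observed = sum(responses) / len(responses)
    return 2 * observed - 0.5

rng = random.Random(42)
true_rate = 0.30  # suppose 30% of users have the sensitive attribute
users = [rng.random() < true_rate for _ in range(100_000)]
responses = [randomized_response(u, rng) for u in users]
print(round(estimate_true_rate(responses), 2))  # close to 0.30
```

No single response reveals anything reliable about its sender, yet across a large population the aggregate estimate converges on the true rate — exactly the trade-off that lets trends be detected without profiling individuals.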
The downside of privacy-first
Everything comes at a price. Historically, it's taken Apple longer to implement secure, private versions of popular services like voice assistants or photo sync, and the company has yet to implement all the services Google and even Facebook offer "for free".
(I put "for free" in quotes because we actually pay for them in data, which is incredibly expensive. It's so valuable, big internet companies spend billions of dollars to harvest it and, instead of paying us for it, convince us they're doing us a favor by letting us give it to them for their services.)
Critics have said Apple's policies on privacy and security mean the company will never catch up to its rivals. Of course, critics said the same thing about artificial intelligence, and then Apple debuted A11 Bionic, silicon three years in the making.
The point is valid, though: Apple is deliberately sacrificing expediency for privacy. From Apple's point of view, privacy is inextricably linked to making the product great. And the company will take time to make what it believes are great products. That frustrates some but it comforts others.
If you do choose to use Google or Facebook, Apple still tries to help you maintain as much privacy as possible. For years, it's been providing tools that limit or prevent certain types of online tracking. This year, as part of iOS 12 and macOS Mojave, Apple has given Safari the ability to block social networking buttons and comment forms, which are used to track you across the web, and has made "fingerprinting", which tries to identify the unique characteristics of your computer setup, much, much more difficult.
Apple has also brought iOS-style permission prompts and protections to macOS, so apps and services have to ask before they can use your camera or microphone, or access your files or message databases.
There are still concerns around the third-party analytics some apps use, and what's done with that data, and I'd love to see the App Store permissions system expanded to include "Can we collect analytics data?" as well.
I still use Google for work and Facebook for real-life friends and relatives but I lock them down as much as possible. For all my personal stuff, though, I use Apple. I may be missing out on some features and conveniences, but it's currently worth it to me.
As someone who covers consumer technology, I'm glad there is that choice. As a consumer, I'm glad I have that choice.