Apple's CEO, Tim Cook, in an essay on privacy just published in Time Magazine:
In 2019, it's time to stand up for the right to privacy—yours, mine, all of ours. Consumers shouldn't have to tolerate another year of companies irresponsibly amassing huge user profiles, data breaches that seem out of control and the vanishing ability to control our own digital lives.
This problem is solvable—it isn't too big, too challenging or too late. Innovation, breakthrough ideas, and great features can go hand in hand with user privacy—and they must. Realizing technology's potential depends on it.
Thomas Brewster, in an article published just days ago in Forbes:
A California judge has ruled that American cops can't force people to unlock a mobile phone with their face or finger. The ruling goes further to protect people's private lives from government searches than any before and is being hailed as a potential landmark decision.
The need for both regulations and rights. It's exactly the conversation I wanted to help kickstart in the RIGHT TO REMAIN PRIVATE video I posted last weekend. In the age of ubiquitous data harvesting, it's one of the most important conversations all of us, here on this channel and out in the world, have to have.
I didn't intend to follow that video up so closely with another. But I don't always get what I want.
And, in this case, it's for a couple of damn good reasons.
Companies have utterly failed to regulate themselves. Over the last few years, and last year especially, we've had one incident of data abuse after another, one data breach after another. We've had our personal, private information bought and sold and leaked for politics and money and marketing.
So, the time has come to consider government regulation, with fines and jail time substantial enough that it becomes in the companies' best interests not to fail, not to avoid, not to abdicate, not to procrastinate, not to ship without privacy or try to bolt it on later, but to build products with privacy from the start.
Last year, Tim Cook addressed this in his keynote at the 40th International Conference of Data Protection and Privacy Commissioners. This year, he's doing it in Time Magazine.
And, in addition to the data minimization, transparency, accessibility, and security rights Cook enumerated last time, and I covered in the previous video, he added this:
Meaningful, comprehensive federal privacy legislation should not only aim to put consumers in control of their data, it should also shine a light on actors trafficking in your data behind the scenes. Some state laws are looking to accomplish just that, but right now there is no federal standard protecting Americans from these practices. That's why we believe the Federal Trade Commission should establish a data-broker clearinghouse, requiring all data brokers to register, enabling consumers to track the transactions that have bundled and sold their data from place to place, and giving users the power to delete their data on demand, freely, easily and online, once and for all.
There are a couple of factors when it comes to paying with data that I think are critical to recapitulate here:
- Everyone has different amounts of time and money, but we're all rich when it comes to data. Even though it's invaluable to the companies that spend billions to provide us with products designed to harvest it, we treat it as though it has no value to us.
- We see clocks ticking away and money leaving our wallets and our accounts. We don't see our personal messages and photos being sucked up into the cloud, or the big internet companies hiding behind our beds or following us from behind the bushes, so the cost doesn't register.
I'm not saying computers should have to animate our private photos going up to the cloud or have an icon of a little ninja peeping at us as we travel around. But, to Cook's point:
One of the biggest challenges in protecting privacy is that many of the violations are invisible.
We need to drop a can of paint on it, one way or another.
The second thing I covered was how we also needed protection from the government, both when it comes to evolving existing rights like the one against self-incrimination, and when it comes to extra-legal activities like warrantless surveillance or demands of back-door access into operating systems.
That brings us to the ruling of magistrate judge Kandis Westmore:
The order came from the U.S. District Court for the Northern District of California in the denial of a search warrant for an unspecified property in Oakland. The warrant was filed as part of an investigation into a Facebook extortion crime, in which a victim was asked to pay up or have an "embarrassing" video of them publicly released. The cops had some suspects in mind and wanted to raid their property. In doing so, the feds also wanted to open up any phone on the premises via facial recognition, a fingerprint or an iris.
Previously, some courts had ruled that passcodes couldn't be compelled because they were tantamount to testimony, which is protected, but that biometrics were physical and so didn't enjoy the same protection.
Why should a passcode, fingerprint, and facial scan be treated differently if they all accomplish the same thing?
That's just what Westmore focused on in her ruling. Declaring that "technology is outpacing the law," the judge wrote that fingerprints and face scans were not the same as "physical evidence" when considered in a context where those body features would be used to unlock a phone.
"If a person cannot be compelled to provide a passcode because it is a testimonial communication, a person cannot be compelled to provide one's finger, thumb, iris, face, or other biometric feature to unlock that same device," the judge wrote.
How much this weighs on future court rulings remains to be seen. But it's a precedent in the right direction, at the very least.
And yeah, as always, I'm also concerned with law enforcement having access to the evidence they need to solve the cases we need them to solve. In this specific case, the judge said other means were available, including getting the data from Facebook itself.
But all of this is going to have to be carefully balanced, so that what's in our minds, memories, and thoughts remains sacrosanct, even as the digital world extends and eventually integrates with them.
Tim Cook recently told CNBC:
"If you zoom out into the future, and you look back, and you ask the question, 'What was Apple's greatest contribution to mankind?' It will be about health," Cook said.
I think privacy will be right up there, though. Because, without privacy, Apple won't have the trust it needs to really change the healthcare world. And so many other worlds beyond it.