Apple is great on privacy, but what about App Store developers?

Apple Logo inside WWDC (Image credit: Rene Ritchie / iMore)

We're watching Facebook crash and burn following not only the Cambridge Analytica scandal — where the data of more than 87 million users was misappropriated and sold to vote manipulators in the U.S. and U.K. — but years of privacy violations and misconduct, including attempts at psychological manipulation using the News Feed. Google is once again in the news for once again trying to scrape data from users who've told them, repeatedly in many cases, they don't want to share it. Everyone from government agencies to ride-sharing services has been caught abusing data and illegally tracking people, and yet, who does Bloomberg think we need to worry about? Who do we — I mean they — desperately need to get into as attention-grabbing a headline as possible?

If you're saying… oh, oh, is it Apple? Is it Apple? It must be Apple!

You're right.

Is Apple Really Your Privacy Hero
The world's most valuable company is seen as a champion for your data. It should be doing more.

Now, the second part of that sub-hed is spot on. Apple should be doing more. Everyone should be doing more. Always. Do the most you possibly can today? Good. Two thumbs up. Now figure out how to do more tomorrow. And the day after that. And the day after that. And — you get the idea.

So, now that I'm drawn in, let's see where this goes!

But, Apple...

Apple Inc. has positioned itself as the champion of privacy. Even as Facebook Inc. and Google track our moves around the internet for advertisers' benefit, Apple has trumpeted its noble decision to avoid that business model. When Facebook became embroiled in a scandal over data leaked by an app developer, Apple Chief Executive Officer Tim Cook said he wouldn't ever be in such a situation. He framed Apple's stance as a moral one. Privacy is a human right, he said. "We never move off of our values," he told NPR in June.

Ok. Hold on. Long lead. Fast forward. And… there!

Bloomberg News recently reported that for years iPhone app developers have been allowed to store and sell data from users who allow access to their contact lists, which, in addition to phone numbers, may include other people's photos and home addresses. According to some security experts, the Notes section—where people sometimes list Social Security numbers for their spouses or children or the entry codes for their apartment buildings—is particularly sensitive. In July, Apple added a rule to its contract with app makers banning the storage and sale of such data. It was done with little fanfare, probably because it won't make much of a difference.

So, there it is. Apple may have industry-leading privacy policies, but it's not doing enough to stop evil developers from stealing your data. (I wonder if that's meant to include Facebook and Google…?)

Developers, developers, $%@& developers!

Cambridge Analytica did indeed involve a Facebook developer exfiltrating data from the service, but it came at the end of a long list of abuses by Facebook itself, including the aforementioned psy-ops. Time and again, Facebook would violate or misstep and Mark Zuckerberg would trot out his most exhausted "Aw shucks!" and then walk whatever was causing the controversy-du-mois back as little as absolutely possible. And when Cambridge Analytica hit, Zuckerberg did his best to lay blame entirely at the feet of developers and — in a stroke of true, cold-blooded genius — lock them out of data just as Facebook launched competing services while retaining full access to that data. Huh. Respect. Or fear. Mostly fear.

But with Apple considering privacy a human right, and going out of its way to make sure it can't abuse data by either encrypting it, never storing it, or divesting it as fast as possible, developers who are not Apple become the only possible target.

So, let's break that down.

Bloomberg's accusation is this, and I quote:

When developers get our information, and that of the acquaintances in our contacts list, it's theirs to use and move around unseen by Apple. It can be sold to data brokers, shared with political campaigns, or posted on the internet. The new rule forbids that, but Apple does nothing to make it technically difficult for developers to harvest the information.

Fast forward again…

Apple has the ingredients for a Cambridge Analytica-type blowup, but it's successfully convinced the public that it has its users' best interests at heart with its existing, unenforceable policies.

Here Bloomberg is right once again… but for entirely the wrong reason.

The policies are unenforceable not because Apple can't or won't enforce them but because locking down Contacts the way that's being suggested here would literally break the apps and functionality millions of users want and need.

The most obvious thing missing here is simply this: You hate Apple's built-in Contacts app so you go to the App Store and download a third-party Address Book replacement. It asks for permission to access your Contacts — which is really the contacts database behind the built-in app — and boom, all your information is there and ready to use.
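For context, here's roughly what that permission handshake looks like from the developer's side — a minimal Swift sketch using Apple's Contacts framework (CNContactStore). The point is that the gate is the user's tap on "OK": once access is granted, the app can enumerate every card in the database for the fields it requests. (As of iOS 13, Apple went further and put the Notes field behind a separate entitlement developers must request.)

```swift
import Contacts

// Minimal sketch: how a third-party app asks for, and then reads,
// the user's contacts database. The user's one-time approval is
// the only gate; after that, the requested fields on every card
// are available to the app.
let store = CNContactStore()

store.requestAccess(for: .contacts) { granted, error in
    guard granted else { return } // user said no; the app sees nothing

    // Declare which fields we want — names, numbers, emails.
    let keys = [CNContactGivenNameKey,
                CNContactFamilyNameKey,
                CNContactPhoneNumbersKey,
                CNContactEmailAddressesKey] as [CNKeyDescriptor]
    let request = CNContactFetchRequest(keysToFetch: keys)

    do {
        try store.enumerateContacts(with: request) { contact, _ in
            // A real Address Book replacement would populate its UI here.
            print(contact.givenName, contact.familyName)
        }
    } catch {
        print("Contacts fetch failed: \(error)")
    }
}
```

This is exactly why a replacement Contacts app "just works" after one prompt — and also why there's no technical barrier between a well-behaved app and one that uploads what it reads.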

For every action...

If Apple prevented developers from accessing your contacts, that new Address Book you downloaded would be empty. And you'd be pissed. No one expects to have to tediously copy and paste every single bit of information from every contact into every field in every new app. That's beyond broken.

Same for social and messaging apps. They use the contacts database to jump-start the network effect so, when you begin, they can let you connect to all the people you already know. If you opened your new wine-tasting network or emoji-only texter, and you were all alone, you'd, once again, be pissed, and it would seem broken.

(Which, not coincidentally, is how all of those apps would seem if you denied them permission to access your contacts, because they have to ask, and you have to approve, before they get absolutely anything from you. It's why LinkedIn is so freaking thirsty so freaking always. Seriously, chill.)

I'm not saying Contacts can't be misused or abused. Everything can. But I am saying you need to carefully consider user expectations and experience before screaming at clouds that you've been done wrong.

Security, not

As to this part:

According to some security experts, the Notes section—where people sometimes list Social Security numbers for their spouses or children or the entry codes for their apartment buildings—is particularly sensitive.

Bad security expert. Bad!

The proper response to being asked a question like that is: We recommend no one ever store sensitive information in contacts cards, the same way we recommend no one ever send credit card information or passwords over plain text email. Rather than suggesting app platforms and email app users are evil, and that mail access be carefully hall-monitored, we instead suggest everyone help better educate people about best privacy practices and operational security habits. Also: How did you get this number? Dammit, LinkedIn!

Apple's argument holds when it comes to tracking phone messages or the articles users read. Certain data are indeed safer from third parties when stored on a device. But when it comes to the app developer network, that's like a parent—in this case, Apple—claiming the developer kids are well-supervised. They're not. Once Apple reviews and approves independent apps, it can't see how the data they collect is used.

Apple shouldn't be in the business of infantilizing developers or, frankly, customers. I get that some people would love to see Apple wield the Sword of the Rivan King to smite Facebook or Google off the platform, or prevent me from downloading a Contact or Calendar or Mail replacement app, but at a certain point you can't be a helicopter platform and you have to simply provide tools and rules, trust your customers to make the right choices, and punish any cases of abuse that do come up.

Personally, I'd much rather Apple spend its time deleting the ridiculous scam apps that keep popping up, either as copies of real apps or as subscription-service abusers. Get on Apple's case about that. Write and scream about that. Because, damn.

Developers have access to dozens of different data points they can ingest whenever a user says yes. So the first step is obvious: Restrict them from getting any information from users' lists beyond phone numbers and email addresses.

Forgive me for saying this but it needs to be said repeatedly in the tech industry: Anything is obvious and easy if you're not the one in charge of actually doing it. You can blog or podcast or YouTube anything. It's obvious Apple should make the Lightning port dispense coffee or beer. There. Done.

But in the real world, every action has a reaction, every change a repercussion. Preventing apps from getting any information beyond phone numbers and email would kill any Address Book app in the App Store, including the one I use every day. It would also do very little to prevent actual abuse because all data brokers need to perform any advanced analytics on you, including social graph and market basket, is a single identifier. Phone and email are two.

Are three permissions twice as strong?

Now, it is interesting to consider Apple adding a third permission to contacts, similar to what it did with location — always, while using, or never. Contacts could have all, email/phone only, or none. But would it improve privacy or simply increase complexity and add confusion?
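To make that thought experiment concrete, here is a purely hypothetical sketch — no such API exists — of what a tiered Contacts authorization might look like if Apple mirrored Core Location's always / while-using / never model:

```swift
// Hypothetical sketch only: Apple's Contacts framework has no tiered
// authorization like this. It illustrates the "all, email/phone only,
// or none" idea from the text.
enum ContactsAccessLevel {
    case full             // every field on every card
    case identifiersOnly  // names, phone numbers, email addresses
    case none             // nothing at all

    /// Which contact fields an app at this level could read.
    var readableFields: [String] {
        switch self {
        case .full:
            return ["givenName", "familyName", "phoneNumbers",
                    "emailAddresses", "postalAddresses", "note"]
        case .identifiersOnly:
            return ["givenName", "familyName", "phoneNumbers",
                    "emailAddresses"]
        case .none:
            return []
        }
    }
}
```

Note that even the middle tier still hands over phone numbers and email addresses — which, as argued above, is all a data broker needs to key advanced analytics on you.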

That suggestion, and the article as a whole, also act as if contacts exist in a vacuum. What about Calendar apps? What good is locking down contacts if I give developers access to my Calendar? Or my email? What about cloud services? What if I connect an app to Facebook or Google? Is Tim Cook supposed to hike over and slap a padlock on other companies as well?

The next step is redesigning the controls of the list to allow users to encrypt or decline to share certain contacts. The names in a contact list could be benign, or they could be revealing—a doctor's patients, a dealmaker's network, a journalist's sources.

HIPAA and its international equivalents cover how doctors are required to handle confidential information, not Apple. And, yes, some people will have highly sensitive contacts that shouldn't be shared. And that means they shouldn't share them. Don't grant permission for other apps to access contacts, or download a second, secure contacts app or secure notes app for anyone and everyone sensitive.

You know, be a grown ass professional.

Let's argue

I don't mean to harsh on Bloomberg's privacy advocacy. I think it's super important that we all think about this all the time, and continue to demand more and better protections from all the technology companies, including Apple.

But there's many a slip twixt the topic and the click, and making users feel afraid of technology, especially enabling technology, just to get attention hurts them rather than helps them. And we should all be trying to help.

I very much hope Apple's security and privacy teams are working on better ways to help us control our data without pandering or limiting already limited functionality. I also hope Apple's App Store team does crack down on any and all abuses, whenever and wherever they find them. And I hope we're all working to better educate people about how to best use those controls.

Rene Ritchie

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.