A lot of people are getting a lot of things wrong about Apple's stance on privacy and security, and what it means for the future.

Last week Apple's CEO, Tim Cook, once again reiterated the company's belief that people have a right to privacy and security, and that the cost of free-as-in-paid-for-by-your-data services isn't always clearly understood. Those who believe the conversation about privacy and security is the most important of our generation appreciated someone as powerful and influential as Cook giving it the spotlight it deserves. Those who believe the advancement of technology requires relinquishing previously held beliefs about privacy and security, however, reacted harshly. The problem is, many of them also reacted in a way that's just plain wrong.

It's vital to understand that privacy and security, while often mentioned together, are not one and the same. Privacy demands security, but security does not demand privacy. Historically, privacy has often been violated in the name of security.

It's equally vital to understand that everything has a cost. That cost can be in money, or it can be in time, data, or attention. Apple products tend to cost money. That's easy to understand. Building something from scratch tends to cost time, which is also easy to understand. Giving up data and attention is different. There's no cash leaving a wallet, no clock ticking away to show us the value of the time we're spending. Maybe if we were forced to watch all our emails and phone numbers and URLs and credit card numbers scroll by as we "spent" them, it would be more apparent. But as it is now, we're really good at mortgaging our future security for our present convenience, and data and attention often seem like no price at all.

Here's a study from the Annenberg School for Communication about how marketers are misrepresenting consumers and opening us up to exploitation:

A majority of Americans are resigned to giving up their data — and that is why many appear to be engaging in tradeoffs. Resignation occurs when a person believes an undesirable outcome is inevitable and feels powerless to stop it. Rather than feeling able to make choices, Americans believe it is futile to manage what companies can learn about them. Our study reveals that more than half do not want to lose control over their information but also believe this loss of control has already happened.

By misrepresenting the American people and championing the tradeoff argument, marketers give policymakers false justifications for allowing the collection and use of all kinds of consumer data often in ways that the public find objectionable. Moreover, the futility we found, combined with a broad public fear about what companies can do with the data, portends serious difficulties not just for individuals but also — over time — for the institution of consumer commerce.

This debate isn't just raging over Facebook and Google either. How many people are aware that, unless location services are disabled, Instagram can make it trivial for someone to find out where you live? Or that Uber wants to start tracking you even when you're not using the company's app? Or that PayPal wants to opt you into spam and telemarketing whether you like it or not?

The truth is, you can have incredibly good, incredibly powerful services that are also completely secure and maintain our privacy in every way that matters. If a service is missing features or is buggy, that has nothing to do with privacy or security. That has to do with it missing features and being buggy.

If someone says Apple is buggy because of privacy and security, they're technically wrong. If someone says Apple won't be competitive in the future because of privacy and security, they're conceptually wrong. Privacy and security aren't a limitation. They're a foundation.

Look at Handoff. For years cloud companies have synced data. Draft an email on one device, and it would near-instantly be saved to drafts on every other device logged into the same account. Last year Apple went one better — they synced activity. Draft an email on one device, and the email client on every other device within reach would populate with that same email, in that same state.

With data sync, if you wanted to switch devices, you'd have to go get the other device, find the requisite app, navigate to the proper folder, open the email, and then scroll to where you left off. With Handoff, you'd just pick up the device, swipe/click, authenticate if needed, and keep on typing.
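The distinction between syncing data and syncing activity can be sketched in a few lines. This is a toy Python model of the idea, not Apple's actual Handoff implementation (which uses NSUserActivity under the hood); all names here are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class DraftActivity:
    """A snapshot of in-progress work: not just the saved draft, but its live state."""
    message_id: str
    body: str
    cursor: int  # where the user left off typing

class Device:
    def __init__(self, name):
        self.name = name
        self.current_activity = None  # what the user is doing right now

    def broadcast(self, activity, nearby):
        # Data sync would ship only the saved draft; activity sync ships
        # the editing state too, so another device can resume mid-thought.
        for device in nearby:
            device.current_activity = activity

phone = Device("phone")
laptop = Device("laptop")
draft = DraftActivity("msg-42", "Hi Rene, about that review", cursor=26)
phone.broadcast(draft, nearby=[laptop])

resumed = laptop.current_activity  # laptop picks up exactly where the phone left off
```

The point of the sketch: what travels is the state of the work, not just its saved artifact, which is why the receiving device can resume at the cursor instead of re-navigating to the draft.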

Because you have to be within reach (Bluetooth range), you don't have to worry about someone at work accessing the email you're drafting at home, or someone on one side of the house snooping on the web page you're browsing on the other side.

It's private, it's secure, and conceptually it's better than what the data-centric companies had offered. It also doesn't require that activity be sent to and stored on someone else's servers.

By contrast, when I first got my Nexus 5 and it asked if I wanted to use Google Now, I said "yes." Then it asked for permission to track my web history, and I said "no." At that point it told me I couldn't use Google Now. Which is BS. I could easily use everything about Google Now that doesn't require my web history, which is an incredible amount. But Google wants that data so, at least back then, it was all or nothing.

That's where I see the difference. Apple could provide similar services where, if I declined to allow access to any specific data, it would happily exclude that data and provide me whatever it still could based on whatever I was comfortable sharing.
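That kind of graceful degradation is simple to express. Here's a hypothetical Python sketch — the feature names and permission mapping are invented for illustration, not any real Google Now or Siri capability list:

```python
# Each feature declares the data it needs. Declining one permission
# disables only the features that depend on it, not the whole service.
FEATURES = {
    "commute_alerts": {"location", "calendar"},
    "meeting_reminders": {"calendar"},
    "topic_suggestions": {"web_history"},
}

def available_features(granted):
    # An all-or-nothing service would return nothing the moment any
    # permission is declined; here, everything still satisfiable survives.
    return sorted(name for name, needs in FEATURES.items() if needs <= granted)

# The user grants location and calendar access but declines web history:
usable = available_features({"location", "calendar"})
```

With this model, declining web history costs the user only topic suggestions; commute alerts and meeting reminders keep working, which is exactly the behavior the all-or-nothing prompt ruled out.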

What's more, just as fingerprints and credit card numbers never leave the hardware, any data I deem strictly private could stay on my device yet still be used by services on that device.

Apple has, in the past, been extremely reluctant to keep and operate on customer data on the company's servers. Yet because Apple understands the concept of "nearline," where local and online data can co-exist within the same service, they could apply that concept to customer data as well.

If I don't want to go to the cloud, the cloud can come to me.

If I don't want to share something with Apple's servers but it's on my phone, they don't have to bring me to the cloud. They can bring the cloud to me. If I don't want to share my web history, they can calculate the result on the cloud, then check my local device for matches right before displaying it on the screen.
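A toy sketch of that "bring the cloud to me" split, in Python — this is my own illustration of the concept, not how Apple or anyone else actually implements it:

```python
def cloud_candidates(query):
    # Server side: results ranked from public signals only.
    # The server never sees the user's browsing history.
    return ["apple.com/homekit", "example.org/hubs", "apple.com/privacy"]

def personalize_on_device(candidates, local_history):
    # Device side: pages the user has visited are surfaced first.
    # The history set stays on the phone; only the final ordering changes.
    visited = [c for c in candidates if c in local_history]
    unvisited = [c for c in candidates if c not in local_history]
    return visited + unvisited

history = {"apple.com/privacy"}  # private data that never leaves the device
results = personalize_on_device(cloud_candidates("homekit"), history)
```

The server does the heavy lifting on generic data, and the last, privacy-sensitive step — matching against my history — happens locally, right before display.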

None of this interferes with "machine learning." Apple already asks for permission to do just that with Siri and Maps and other services today, and they can ask to do it with future services tomorrow. They can just do it in a way that respects privacy and security, and with a business model that's funded by me directly, not by a third party because of me.

NSA whistleblower Edward Snowden recently said this to the New York Times:

Basic technical safeguards such as encryption — once considered esoteric and unnecessary — are now enabled by default in the products of pioneering companies like Apple, ensuring that even if your phone is stolen, your private life remains private.

Some people were recently irked that they'd have to buy new bridges or hubs for home automation, thanks to Apple's requirement for end-to-end encryption in HomeKit. I was irked my bridges and hubs weren't end-to-end encrypted from the start.

Now, thanks to Apple's stand on privacy and security, they will be. And I can't help but hope that thanks to Apple's stand on privacy and security in general, all services from all companies will be under immense pressure to be more private and secure as well.

And that benefits everyone.