That was a question posed by MarketWatch today. It's also an important question. Unfortunately, MarketWatch didn't treat the question that way. And that's a profound disservice to its readers.
The Apple Watch is, by Apple's own admission, the most personal, most intimate device the company has ever released. It tracks our health, handles our communications, controls our homes, and pays for our purchases. Security on the Apple Watch is something that's going to matter to everyone who uses it. The answer behind MarketWatch's sensational headline is that they don't know. And the follow-up is pure fear, uncertainty, and doubt. That's not only bad journalism, it's an actively harmful attack.
[Apple] has released little information thus far on the watch-that-knows-all's security features and told MarketWatch more information will come when the product becomes available on April 24.
The Apple Watch works in conjunction with the iPhone, and Watch OS is based on iOS. Apple has released an excellent guide to iOS security. It can serve as a great starting point for becoming familiar with the basics of how Apple handles end-to-end encryption and other related technologies.
Apple has also posted an open letter on security and privacy, along with an entire root-level section of the company's website (apple.com/privacy) that outlines the philosophy behind it. In short, Apple has made privacy and security a front-line feature for the company's customers. Again, an important starting point for this type of discussion.
"I don't know enough about what's in there. That's the common challenge of security researchers or anyone who wants to make security decisions about Apple products," said David Schwartzberg, a senior security engineer at MobileIron, a Mountain View, Calif.-based mobile security company. "They don't release enough information."
Sensors on the watch can detect when a user takes it off his or her wrist. Upon removal, the watch will put Apple Pay on lockdown. To unlock it, you'll have to enter the passcode for Apple Pay. So unless a thief also has your arm, the moment the watch comes off, the paying system shuts off. Though in theory, what stops a criminal from threatening you for the passcode?
Deplorable "Has your arm" hyperbole aside, "in theory", what stops a criminal from threatening you for your phone's passcode? Your wallet? Your car keys? What about any of that is unique to the Apple Watch?
Researchers at FireEye, a Silicon Valley security firm, said in a February report that hackers have learned how to bypass Apple protocols to publicly release malicious applications. And last year, another security firm found that a type of malware called WireLurker could have infected hundreds of thousands of Apple devices.
If you jailbreak, visit a pirate app store, and otherwise expressly override Apple's built-in protections, then security isn't the problem. You are. In other words, if you leave your car unlocked with the keys in the ignition, parked in front of a chop-shop, don't blame the locks.
- Why most people don't need to worry about WireLurker
- Masque Attack: Don't panic but do pay attention
"If somebody's able to get a piece of malware on a device like they have with the iPhone, iPods and iPads, if this watch is doing fitness data, they could tell when you're exercising. They could tell a lot of things about you," says Brian Markus, CEO of Aries Security.
For example, someone who figures out how to hack a slew of Apple Watches could begin email marketing relevant health products (bogus or legitimate) to those consumers, targeting individuals with spam or phishing scams based on their specific needs. And a stalker could use the watch as another way to track someone's location and movements.
The bigger and more realistic danger, by a near-infinite order of magnitude, is "security" sellers who goad "reporters" into harming their readership by publishing complete and utter bull, backed up by no hard data or realistic threat assessment.
Within just a few months of the rollout, fraudsters found a way to game Apple Pay, The Wall Street Journal reported last week. Banks responded by increasing verification measures to leave less room for crooks to upload stolen card information onto Apple Pay.
Except, no. They absolutely didn't.
Apple Pay wasn't gamed at all. It was and is so secure that all "fraudsters" could do was run old-fashioned social engineering attacks on banks, which admitted they not only chose not to invalidate stolen card data, but also authorized it for purchases.
Mobile security is lagging at a time when people carry nearly as much information on their smartphones as on their computers. Smartphones, across the board, don't by default include antivirus software, and many users don't install it onto their devices. And the Apple Watch is basically an iPhone for your wrist.
Which has among the best security in the industry. Would that MarketWatch aspired to such pride of craft.