Full disclosure: We have a sibling site devoted to all things Tesla. Some of my colleagues here either own or have placed a $1,000 reservation for a car sight unseen. Tesla is hot. Its CEO is hot. I do, however, have some cause for concern.
Recently we've seen reports of at least two crashes attributable to Tesla's "autopilot" or "autonomous" feature. Tesla says the feature is in beta. Beta software has a clear definition. Or it used to. Alpha code was buggy, and meant for internal use. Beta software was buggy too, but meant for outside users to test and report bugs back to the company. Gold master was what it implied: good to unleash upon the world. Times have changed, and the meanings have blurred. Beta software, though, still means "not finished." It still means "buggy."
Bugs are the new beta
Some products, like Gmail, were in beta for years. Now Tesla's autonomous mode is also in beta. Users can pay extra for the privilege of testing unfinished, buggy software. They need to acknowledge this through several screens that protect Tesla legally if the owner does something stupid, such as take selfie videos from the back seat while the car drives merrily along.
The problem I have is simple. If Gmail crashed in beta for you, my life wasn't affected. My life wasn't in danger. Not so much with Tesla and its autonomous mode (which really isn't autonomous). If you engage it, as many do, click through all the legalese, and then decide to take your hands off the wheel, that's now my problem too. You see, we share public roads. If you want to take your (really not) autonomous car off the road to a track, be my guest. I know a lot of folks who take their non-street-legal cars to do just that. No worries; we're not sharing the road.
Now, you might argue that a Tesla in autonomous mode is no worse than an inebriated driver or a driver who is busy texting and not steering. But those drivers are making conscious decisions to do something both stupid and illegal — to knowingly put the rest of us at risk. A beta tester of autonomous software may have no idea that autonomous isn't really that autonomous, or that it gives the rest of us one more reason to be careful around those drivers.
Yep, beta testing software that is truly mission-critical is reckless, no matter how many legal warnings a driver has to mindlessly click through. (Have you ever read your iTunes license agreement? I thought so. Me neither.)
Apple and autonomous driving
That brings me to Apple. Apple does not ship mission-critical beta software. In fact, when Apple does a "public" beta, there are a lot of warnings. There's also a lot of effort that goes into public betas — like the recent ones for iOS 10 and macOS Sierra — to make them as solid as gold masters. In fact, some beta software becomes just that.
The idea of Apple doing an autonomous car is fascinating. In fact, we discussed that on AppleTalk 3. It's hard to imagine Apple releasing any automotive feature that wasn't considered finished, though. Done. Complete.
That's probably the best reason to think we won't see an Apple car anytime soon. Sometimes it's really important not to ship something before it's ready. In the case of sorta, kinda autonomous cars, it's critically important.
I’ve covered the personal technology beat for more than two decades at places like Gartner, Jupiter Research and Altimeter Group. I’ve also had the fun of contributing my $.02 on the topic at Computerworld, Engadget, Macworld, SlashGear and now iMore. Most recently I spent a few years at Apple as Sr. Director of Worldwide Product Marketing. On Twitter I’m an unverified @gartenberg. I still own some Apple stock.