I love technology. It's our bridge to the future.
To build that bridge, however, some technology will be embraced and fuel us for years, whereas some will fall by the side, a brief flash along the way. When it comes to mobile, the iPhone is clearly an example of the former. When it comes to wearables, Google Glass is clearly an example of the latter. But what about the Apple Watch? By analyzing human behavior, and comparing how both Google and Apple went to market with their first wearables, can we gain any insight into its fate?
Psychologically speaking, there are very specific reasons why Google Glass failed. That Google didn't understand and foresee them is both predictable and surprising.
To be clear, Google Glass wasn't a retail product and wasn't sold in stores. It was an experiment and a very publicly positioned one. But that's what Google chose to go with first, and how they chose to go about it.
The early adopters of Google Glass—the "explorers"—were staunch technophiles and Google enthusiasts. They were the type of people who thrive on being at the cutting edge and don't mind investing their time, effort—and $1500 a pop—to use and be seen using Google Glass first. The social consequences, unfortunately, weren't on their radar any more than they were on Google's.
Through evolution, we have learned that living in social groups greatly increases our chances of survival. That's why our need to feel connected and accepted, physically and emotionally, is exceptionally strong. It's also why feeling ostracized from our social groups can be devastating. Studies have shown that people excluded from even minor social activities can express anger, anxiety, depression, and shame (Baumeister and Leary, 1995; Eisenberger et al., 2003).
Google Glass, for all its technological wonders, separated its wearer from social groups. Part of that was physical: it grafted itself to the wearer's face and intruded on our gaze. It made us look more like the Borg than like ourselves, and it was impossible not to notice. Part of that was emotional: the popular perception became that anyone wearing it could be creepily recording us at any time. It created disconnection more than connection.
The first part was the result of poor design decisions; the second, of media sensationalism. But as pictures of people wearing Google Glass spread, along with articles about establishments barring Glass users and altercations arising around Glass usage, the social stigma surrounding it grew.
That stigma transferred from what was being worn to the person wearing it, most infamously through the derogatory term "glasshole."
The eyes have it
Humans have a deep and abiding need for socialization. It's strongly linked to our feelings of happiness and wellbeing. When developing technology so intimate that it must stay connected to our bodies for long periods of time, developers need to be mindful of those dynamics, and the technology needs to be respectful of shared experiences.
That's why where an object is worn is so important, especially when that object contains new technology. It will have needs of its own, and those needs can't come before the wearer's. Google put Glass on our face and in front of our eyes because that was the most efficient, most logical place to put a screen and a connection to the internet.
But it was too soon. We hadn't gotten used to wearables in general, much less ones so prominently positioned.
With Google Glass, there was no way to not see it. It was a persistent visual barrier that directly interfered with one of the most primal and important ways humans interrelate—through the eyes.
The eyes are how we connect. We have specific neurons in the inferotemporal cortex that fire during facial recognition. They're integral to our social constructs and linked to our emotional intelligence. There's a reason we say "the eyes are the windows to the soul," and why Alien terrified us with face-huggers rather than wrist-huggers.
With Google Glass, instead of seeing the eyes and the face, we saw something strange and amazing. We noticed Google Glass before we saw the person behind it.
Decades ago, Harlow showed the necessity of social interaction, even over basic needs. He allowed baby rhesus macaque monkeys to choose between a mechanical "mother" that was warm and covered in cloth and a mechanical "mother" that was cold and metal but able to feed them milk. The babies chose to spend their time with the warm, cloth mother, reaching over for food only when absolutely necessary (Harlow, 1958).
Harlow also showed the devastating effects of social isolation and rejection on the monkeys. He found that monkeys deprived of positive interactions with other monkeys became increasingly withdrawn and isolated.
With Google Glass, the lack of positive interactions simply caused people to stop using or wanting it.
Watching the wrist
The Apple Watch is both similar to and different from Google Glass. It's similar in that it's the first major wearable from one of the biggest technology companies on the planet, and none of us yet know exactly where it will fit in. It's different in that Apple isn't starting with the face. Apple is starting with the wrist.
Some people will still buy the Apple Watch—or Android Wear, Google's more recent foray into wearables—for the same reason they bought Google Glass. They'll want to be the first to have, to try, and to show off the latest technology. But long term, early adopters will only keep using it, and the mainstream will only start adopting it, if it fits their needs and helps them live better lives.
The advantage the Apple Watch has is that it isn't on the face and isn't constantly in our lines of sight. It's on the wrist, which is a place people became comfortable wearing technology decades ago. When we look at someone who's wearing an Apple Watch, we may not even see it. But we will see them, unobstructed, as a person.
There will still be growing pains. The Apple Watch is still on our bodies. Holding it up for anything longer than a few seconds isn't ideal. Trying to use the small screen the way we've gotten used to using bigger phone and tablet screens isn't practical. We'll have to learn how to keep things brief and to use controls like the Digital Crown. If we like it, the Apple Watch has a real shot at becoming part of our lives. If we don't, it too will struggle.
The Apple Watch can still intermediate human connections, but only intermittently. Even more so than the phone, the watch is designed for brief interactions, for glances—not for anything permanent or persistent.
So far, engaging with someone wearing an Apple Watch feels far more comfortable than engaging with someone wearing Google Glass. The idea of having to interact with someone wearing an Apple Watch isn't a concern, whereas being put in the same position with someone wearing Google Glass still feels immediately stressful.
The difference between Google Glass and the Apple Watch may be one of impatience vs. patience, of face first vs. wrist first, of unavoidable vs. unobtrusive. Psychologically speaking, though, it's all the difference in the world.
For Google's first wearable, they shot for the moon and failed. For Apple's, they shot for the human and have a chance at succeeding. If the Apple Watch—or Android Wear—does prevail where Google Glass failed, however, it won't entirely be because of technology: It will be in part because of psychology.
Perhaps, in time, wearables will slowly move from the wrist to face—the same way Locutus of Borg was a horror and Seven of Nine, eventually, a hero.
The effects of social exclusion are devastating. Any company that wants to involve itself in how we interact with each other has to be mindful of that.
If we want to build a bridge to the future, it will take more than just technology: It will take patience and understanding of human psychology.