Matthew Panzarino, co-editor of TechCrunch, joins Rene to talk about why mobile is going wearable, how contextual information and sensors are evolving, and how Apple, Google, Samsung, and everyone else will try to sell them to us.
Yell at us on Twitter or leave a comment below.
Rene Ritchie: Matthew Panzarino, co-editor of TechCrunch, how are you?
Matthew Panzarino: Just fine, sir. How are you?
Rene: Very well, thank you. First, congratulations on the new gig. I know it's not brand brand-new, but it's still new-ish.
Matthew: Yeah, it's new-ish, for sure. I'm enjoying it. I'm just getting up to speed. Everything is flowing pretty well. It's a whole different ballgame, but I'm enjoying it. I'm enjoying it a lot.
Rene: We were talking before the show. The only metaphor I can give for people who aren't familiar with how this industry works is that you're running as fast as you can while your jacket is caught in a steamroller that is slowly working its way towards you from behind.
Matthew: Yeah, that's pretty much it. Every once in a while, you take your jacket off, put on a new jacket, and then it starts happening all over again.
Rene: There's always another steamroller.
Matthew: Yeah, absolutely.
Rene: One of the things I wanted to talk to you about is what's been in the news very recently. It's getting a lot of attention. That's this entire idea of wearables, whether it's stuff we already have, like Pebble watches, or stuff people are talking about, like iWatches. Is this a real trend, Matt, or is this something that we're sort of looking at Dick Tracy and the future, and just really wishing it would be here?
Matthew: Yeah, I think there's some of that wish fulfillment. I think that there have been wearable devices for a long, long time. You're going to get a lot of people digging into the past of wearables now that it's become such a hot topic, and obviously if you're a student of tech wearable history, you know about calculator watches, and the various audio and video products that we've strapped on to our wrists and faces.
The Sony Glasstron was supposed to be the next big thing in home entertainment. That was a set of glasses you strapped to your face that Sony made, that had the video screens in them. There's been a variety of these things that have popped up over the years.
I think that, obviously, certain trends like miniaturization, and very power-conscious microprocessors and coprocessors, and things like that, are leading towards an inflection point where we're starting to get wearable devices that are truly capable of handling multiple tasks, and multiple vectors of information gathering, versus glasses that you have to plug into an enormous battery pack, or your cigarette lighter while you're driving in the car so you can watch video in the backseat, or something, that very cumbersome feeling.
I think that there certainly is a technological inflection point that we're hitting now that will enable new experiences, but I think a lot of the stuff that has been produced so far is hung up on older paradigms that aren't forward thinking. They're not forward looking, and they tend to get a little hung up on what we wished was going to happen versus what we're actually capable of now that we have the modern technology.
People are trying to fulfill dreams that they had years ago instead of going, "That was interesting, but what's actually possible now? Now that we have a mini supercomputer in our pocket, what kind of other things can we do that aren't taken care of by that?"

That thing is our TV. We don't need a TV on our wrist, really, so what can we do with wearables now that we didn't imagine before, or didn't have a path towards before?
Rene: It's interesting to me, and this is going to sound silly. But in the original "Star Trek," they had the communicator, which now looks outdated next to modern phone technology. But in "Star Trek: The Next Generation," they had that little brooch on their chest that they could just tap.
They went from a handheld to a wearable and it just seems that even then they knew that the future would be even more personal technology.
But it's just like the shift from when you had a Mac, or Linux, or a Windows PC on your desktop. The Windows Phone, Android phone, or iPhone is nowhere near as powerful as those things, but we were willing to use them because they gave us so many more advantages in mobility and constant connectivity.

Does there have to be a similar shift in wearables? Because obviously, at least the first generation won't do as much as our phones, so they're going to have to offer us something in exchange for that.
Matthew: I think your example of moving to the brooch is an interesting one, the "Next Generation" stuff. Obviously, those guys are just spooling things out, "What will look good on TV?" and all of that.
But it turns out that there is a very similar thing going on where technology is moving towards proactive or passive information gathering with proactive elements. Anticipatory computing is kind of what I've been using to talk about it.
That involves devices like a wearable unit for instance with sensors on board and that mates with software to gather signals about your environment and gather information about you, and what you do, and where you are -- a.k.a. context -- and then it parses that to offer you information on the fly as you need it or more importantly, before you need it.
I think that there are some aspects of the current way that wearables work -- like the Pebble, for instance -- that work in a very much strictly passive way, where they will pass on information from a phone, for instance, or tell you what the weather is if you ask, or do things on a timed basis. But timing, and push notifications that are just being sent from your phone, those are all fairly passive things. They're not proactive things.
I think that as we pull the thread on wearables, they're getting more powerful as far as the signals that they can gather, like location, and spatial awareness, and motion, and that sort of thing. Then you mate that with the software properly, and you're going to end up with something that provides a real additional, contextual, proactive value to the user.
I think that's the Rubicon we've got to cross, and I think that's what, hopefully, the next generation of products is going to do: say, "This thing offers you a distinct value, a distinct life-improving value, over not having it." I think a lot of the discussion has been based around, "What does this offer you over your smartphone?" I think that's the wrong question to ask, because nobody's getting rid of their smartphone.
If you sell it as an accessory to a smart phone, I think you're already doing it wrong. I think that's the wrong tactic to take, and I think if you're developing the product with that mindset, it's going to lead you down the wrong path. You're going to think about it wrong from the very beginning, you're going to develop the product wrong, you're going to solve the wrong problems.
Instead, you need to think, "What can we make that is going to significantly improve people's lives enough to where they absolutely must have it on the wrist, and they must have this device on their face, or they must have this device clipped to whatever?" That is the important question, and if I'm reading the signals correctly, I think that's one of the questions Apple's asking themselves about the iWatch is that...
Apple's not really in the business, anymore, if you could ever argue that they wanted to be, of making niche products. This is a company that makes products for massive amounts of people, and I think that any new categories that they enter, they're going to want to seriously think about, "What's the addressable market of this particular device?" The addressable market of a device that is an accessory to an iPhone is pretty big.
They've sold a lot of iPhones, and a lot of iPads for that matter. But that still keeps it too small, I believe. I think that what they want to do is produce a device where the addressable market is everybody that has an iPhone, everybody that could gain additional value out of whatever this device can do for them, say health monitoring, proactive warnings, or proactive suggestions about activity, and health, and that sort of thing.
Then also say, "That is a reason for people to buy into that ecosystem." That's sort of what the iPhone did, it offered people a reason to buy into Apple's ecosystem like, "This is an awesome smart phone, you can get into that, and then you can buy a Mac, and then go from there."
Rene: There are a couple of things to break down in there. One is, when you talk to someone like Eric Migicovsky from Pebble, he says that we're still in the Palm V days of wearables, which will give a chuckle to anyone who remembers those days.
Apple is usually a really patient company. They wait until a market is established enough, like you said, that they can start selling hundreds of millions of products, and that the existing products suck, and they think that they can make a better product that would solve the problem in a much better way than what exists at the time.
Like that famous slide Steve Jobs put up of existing BlackBerrys and Treos, and I think it was a Windows Mobile phone, a Moto Q, or something of the time. Then they make the case for why Apple can do this better.
Do you think we're at the point where wearables suck enough, but are a big enough market, for Apple to enter in the same way they did with phones, or do you think we're still at the stage where, if they enter it, it will be more of an ecosystem-enhancing play, like an Apple TV?
Matthew: They could go either way. It is a choice available to them that they could approach it as a hobby, or as an ancillary business to the iPhone, but I think that the market forces at work, and their philosophy towards expanding outwards from their initial categories, or the categories that are in play right now, are going to force them..."Force," I guess, is the wrong word. They will encourage them to think of it as a larger item, a mass market item.
There are already some examples out there of what wearables can do. I don't think any of them are particularly terrible, the ones that are successful and that are widely adopted -- the Pebble, Fitbit, and the Nike FuelBand, to a degree. A couple of the other ones are pretty decent, but there are a lot of examples of some pretty rough stuff out there too.
The analogy to the smartphone market at the time that the iPhone came along is not exact, but it does bear some definite comparisons. A lot of the smartphones that were out before the iPhone was released did a lot of the things that the iPhone did, and more, but they did them in a way that was...
Matthew: Yeah. It wasn't forward thinking. I had a Windows mobile phone, several of them. I had a Palm and Treos and that sort of thing. Those devices were not bad to use. I liked my Treo a lot. The difficulties came when you took the operating system or software that was running on the device and then pushed it up against any real-world challenges.
If you used their initial software install and just used their calendar, their contact list, and preloaded whatever they decided to put on it, it was probably OK and you enjoyed it.
The second you started loading apps, trying to transfer files from one device to another, or basically matched it up against any cool real-life things you wanted to do, all of that stuff started falling down. They didn't have that holistic approach to getting everything to work right together, and being able to hold their horses until it did, until they could deliver, or at least try to deliver, on something that was a clean, simple experience.
I remember transferring a 20 or 30-second video clip from one Sony Ericsson T68i to another or something like that. It was like 35 minutes via infrared, and if you moved the phone out of alignment slightly, forget it. You're done.
Rene: ...contact was 35 minutes sometimes. [laughs]
Matthew: I know, exactly. That kind of stuff was painful. What we end up with in wearables is that people have learned a lot of lessons. I hesitate to say this. I'm not saying that Apple is the only company that could build good hardware, but a lot of these companies have learned some lessons from Apple in building polished, holistic products where the hardware and software work together toward a goal.
I don't want to make it seem like Apple is the only company that can do that. There are others doing it. But they were definitely one of the companies that set a standard there and put a stake in the ground. They set it a long, long time before it became evident that it was actually the right way to go.
The key to that switch being flipped, where all these companies started saying this is the right thing, is that all of the tolerances for crappy experiences and bad meshing of goals versus realities got tightened when smartphones became our primary computing platform, when that shift started to happen.
Of course, all the tolerances got really tiny. If something is annoying on a desktop, maybe something's out of alignment a little bit, you move the window a little bit. You've got a mouse. You've got a pointer. You just click and move. The tolerances are all very broad there.
It's like building a deck in your backyard. You hammer the boards in. If the gap is one-sixteenth larger on this end than it is on that end, no big deal. Maybe you drop a corn chip and it goes through on this end and gets stuck in a crack on this end. It doesn't matter. It's not a big deal.
But look at it in terms of putting a wood floor in your home. You've got the tongue and groove. The tongue slips inside the groove. The slat flaps down. You put the next one in, and you kick it in with a rubber mallet to make sure that there's no gap at all. When you're done with the nice wooden floor, if there's a one-sixteenth gap on one end, you're going to notice. That's a whole different matter.
That's what happened with mobile devices. You end up with a situation where the tolerances for sloppiness, or software and hardware not meshing quite right, were so much smaller that all of a sudden Apple's strategy, when it came to Macs, for instance, paid off immensely, very, very quickly, in a very obvious fashion. People started to adopt that and go for that.
What we have in wearables is, people have learned a lot of those lessons. They're trying really hard to create those kinds of experiences. They've got apps that work well with the device. You don't have to use some third-party app to interact with the device. Those device makers are making their own software and making their own hardware, which a lot of people take for granted now, but that was definitely not the case in the Palm days. Getting a third-party contact manager was an immense undertaking.
Rene: It would cost $30. It would crash when it launched, and it would crash when you closed it.
Matthew: Exactly. You had to give your credit card number to some crazy, random app-store thing on the Web. You never know what was going on with them and what you were going to get.
A lot of those lessons have been learned, so they're starting off in a much better place now with this first generation of wearable devices. Now, we are seeing the difference between something that's well executed...
Like a Fitbit, for instance. They execute really well. It's a nicely designed product. Almost nobody I talk to hates them. But I also talk to a lot of people who stopped using them. Usage falls off. Maybe it helps them achieve some sort of awareness of food intake and activity, and then they develop a habit. It takes about six weeks, they get into this pattern, and now they don't need or use it anymore. They don't need it to remind themselves to exercise or whatever.
Some folks I've talked to have even said that it becomes a burden. You always have to remember it. You're always checking it. Then you start trying to please the machine instead of actually being healthier. You're just trying to feed it steps and arm motions and whatever it wants.
The FuelBand is funny that way. It measures hand motion, so an activity like cycling doesn't get attributed, for instance. There are a lot of weird things that are happening with wearables. They don't create a desire to continue using them. They don't create that feeling of, this is improving my life, I need to have this, this is actually materially making things better for me.
I think that there are edge cases where that happens. A buddy of mine drives for a living. He's a sales guy and he drives for a living up and down the valley here. His Pebble is on his wrist all the time because he gets his texts and all that stuff, he doesn't have to look at his phone. His car doesn't have a fancy heads-up or anything that has text coming through on it. Instead he just twists his wrist and there it is.
That materially improves his life, because he doesn't have to take his eyes off the road, and it could improve his safety...
Rene: I have a friend who works in the hospital, the same thing. She can leave her phone in her office and just use a Pebble as she walks her rounds.
Matthew: Right. That's a whole other thing, because a lot of hospitals have rules against carrying phones and stuff. There are a lot of cases where something like a Pebble could be a real advantage.
But I think that those cases are too slim right now. What we need to see is somebody make a pitch, let's call it a pitch. By pitch, I mean bring a product to market and explain this idea.
Rene: Like the iPhone, when Steve Jobs made the case that it has to exist between your phone and your laptop, it has to deserve to exist.
Matthew: Right. It's not just because we could fill a product category. I think that's why a lot of people misunderstand why Apple's not delivering larger-screened phones and stuff yet. They definitely have the ability to do it, and they could do it. Maybe they were just not seeing that people wanted them, and they got surprised when people liked the Samsung stuff. I don't know. Maybe that's totally true.
But I think that there is a situation where you have to find the merit and then you have to deliver on it with whatever your given level of desire for polish is. The first Nest, for instance, I think that met their desired level of polish for a thermostat, but then the second one was a lot thinner. The joints were a little closer and a little finer tuned. All of that stuff was one step above the MVP.
But I think that the first one was very well done and it had a certain level of polish, but they could have gone 20, 30 percent worse than that first one and people still would have liked it.
I think that's the difference between a company like Apple or Nest or some of these other modern hardware companies that are developing stuff that's really neat and other companies is that they have the cojones to wait, to say, "We're going to not ship this until it gets to a certain level that we want." Whereas other companies will say, "Hey, let's just ship a bunch of stuff and see what people like."
I don't know that either one is evil. There's no black and white here about what's the better method. I think a lot of people will argue, like, I loved the fact that I could choose any screen size in a Samsung. I like their interface or whatever and I could choose from all of these 10 devices, and I think that's fine. But I think that focus also has a lot to do with it.
Bringing that back to the wearables thing, I think that if you end up in a scenario where you're shipping something in a wearable, you need to make sure that it's at a level of polish where it works absolutely wonderfully, because it's under even more of that same kind of scrutiny that we have with a smartphone. The tolerance is even smaller in something that we wear, even more than...
You've got the Mac at one end of the scale or a desktop computer at one end of the scale and then a smartphone at the other. Well, the wearable is even further along. It needs to be even tighter and even more refined.
Rene: That's actually interesting. If you look at something like, and I'll use them just because they're easy, the iPhone or the iPad, when they first came out, they were not the devices Apple wanted to deliver. They were the ones that they could realistically polish up to that level.
Then you have the 3G and the 4 eventually, and you have the iPad 2 and the iPad Air, which kind of fulfilled the original vision of those products, arguably the dream and the magic. The same thing for Android when you get to the Nexus One and you start to see what they really wanted to do. Can people afford to do that in wearables, or because they're so small and so personal, do they have to take it to that next level of product sooner?
Matthew: If you look at the FuelBand, for instance, if you just pull that one out, the FuelBand is a pretty good product. I've used mine pretty heavily. I haven't worn it recently for a variety of reasons, but I did wear it pretty heavily for a year or so. I think that it's a pretty good product, but it has some continuing issues. There are a couple of things that are interesting about it.
One, it's super thin and super light until you wear it a lot. Then it becomes heavy and bulky. It's one of those things where, obviously, you get acclimated to having it on your wrist. But I think it's still bulkier than it needs to be. Not needs to be by engineering standards, because I'm sure there are some very smart people working on it that made it as thin as they could.
But I think that the tolerances, once again, are very small there for something that's really bulky on your wrist. The latch is really well-designed. It's metal and it's got this nice click when you close it. I'm actually opening it and closing it right now.
You end up with this situation where sometimes, when you bang your wrist on something, because the latch is activated by inward pressure from the outside, it pops loose. I've literally been just walking around or whatever, and all of a sudden the band is loose on my wrist and floating around.
That situation is just a weird one. It's like, I didn't unlatch this thing. It happened on its own. I think that some stuff like that might not be an enormous thing in another product, like a bracelet or something like that, but once you put a $150 device on your wrist, you expect those things to be a little bit nicer, a little bit more thought out.
I think that this is one of those things where it's a company that has a lot of experience in wearables, so to speak. Nike's been building all kinds of wearable stuff for a long time, including "smart watches." Yet, they still have issues delivering a product that's really, really well done on all counts.
I think that the tolerances are going to be really, really tiny for anybody delivering any products from now forward. I think we're past the first flush of like, "Oh look, this is cool, I can wear it and it does stuff and it reminds me that I took so many steps," or whatever. I think we're well beyond that and I think whatever products get delivered from here on out are going to be pretty polished experiences to capture any meaningful share of a market.
Rene: Some people have asked or wondered out loud, our phones do so much now. For example, there's an M7 processor in the iPhone and you can get a Fitbit app. You can get @_DavidSmith's pedometer app. You can get a bunch of stuff that does a lot of what the dedicated wearable devices do. Since the phone is in your pocket already, do you really need a wearable to do it?
That leads into what you mentioned earlier with the contextual awareness and the sensors and things. Is there a possibility, for example...actually, let me break that down a little bit quicker.
I'm sure Apple is doing lots of fantastic things. I'm sure Google has a lot of plans. But they're not only thinking about the first release or the minimally viable product. They're thinking of the second, third, fourth generation. The stuff they're working on now, we might only see in a couple of years.
Is there still a case to be made for things like blood sugar reading or hydration level reading? Sticking your phone on your wrist is not the best solution, and there are better ways to deploy those sensors beyond the mobile phone.
Matthew: I think that's a very solid product argument for a wearable device: what can it give us that the phone can't? The difference between something being in your pocket and something attached to your body can vary from use case to use case. It doesn't matter if you're getting a call on something that's attached to your wrist or in your pocket. That's not necessarily a big deal.
As I mentioned, there are some cases where it's helpful to know, so you don't even have to touch your phone, who's calling you and that sort of thing. But that particular interaction, it doesn't make an enormous difference whether it's something that's in your pocket or something that's on your body.
But there are other things where it does make an enormous difference. Like, what are the things that a device that's attached to your skin or touching your skin could tell you that a phone can't unless you're holding it?
The theory is you could put a heart rate monitor or a temperature sensor in a phone that could tell you about your body if you held it. But that will offer you a snapshot.
What the device that's on your wrist can do is it could provide you contextual information about your body, including things like heart rate monitoring and temperature sensing and blood sugar and all kinds of other interesting stuff. If it's attached to you, it's doing it on a continuous basis. It's providing you with a historical chart of that information.
That, to me, is interesting, because it's not really about, like what can you do, because you could go to the doctor and get your heart rate taken and your blood pressure taken and all that stuff. What happens when you're able to access it continuously and that information is able to be charted over time, historically, minute-to-minute? Then you end up in a scenario where you've got a chartable plot of data, and then you could compare it.
Once you've gotten a certain amount of historical data, then you could start comparing it against current data. That provides you with a differential, percentage of change or a change in data points.
Now, you're looking at not just what is happening to me right now, but what has happened to me, and then what may happen to me. Like you're on course to lose five pounds over the next month, or you're on course to gain two pounds over the next two weeks or whatever the case may be.
That kind of stuff I think offers people access to data passively. They don't have to think about it, they don't have to enter their weight into a chart or enter their heart rate into a chart or do all that stuff manually. Instead they can allow a device to gather it for them over time.
Then the key is you've got to deliver it to them in a human way, which is sort of what Nike was trying to do with their Fuel, but I'm not really a fan of that, because they take it and boil it down into this Fuel number. That's not calories, it's not any particular measurement that mates up with traditional health measurements. It's its own thing, this count of Fuel.
But what they were trying to do is make it humane. You say like, "Look, here's Fuel. We're not going to put labels on it. We're going to tell you this," and then you could compare this against Fuel that you gain in the future, Fuel that you've gained in the past. Then it gets you an overall chart.
That's the key. It's got to be taking all these data points and then giving people actionable, human-parsable information that they can use to make their lives better.
I think that in that situation, having a device that's able to continuously monitor and continuously gather that information is very important, because it means that they can then provide you with very human, very parsable information about what the history of your health has been and where it might be going in the future. That proactive sense of what this context is giving you and where it's going to lead you in the future, I think, is the selling point or the pitch for a device that's wearable.
That has to be the pitch. It can't just be that it tells you how many steps you've gone, because most people have no context to talk about that and think about that. What does it matter? Instead, it's going to be about delivering on gathering that information, then comparing it, and then providing it to people in a very human, very understandable way.
Rene: Our ever-cynical Internet is arguing...maybe not arguing, but maybe asking, if we as a society or as a culture care enough about our health to make this a compelling feature for us, or if we'd be more likely to buy it if it simply spat out McDonald's coupons?
Matthew: [laughs] That's a good question. I don't know. I don't know if people care enough about their health to do it. I have at certain times in my life cared more or less. Sometimes I'm running after my daughter and I'm out of breath and I'm like, oh, god, I need to get back to the gym. You'll come to moments of awareness like that. Then other times, you'll be scarfing down a double double and I don't care, [indecipherable 0:31:52] good.
Rene: I follow your Instagram feed. I know exactly how good it looks.
Matthew: Yeah, yeah, exactly. I love food and I love eating and I don't like people reminding me all the time that what I'm eating isn't exactly super healthy. I don't know. It's going to be an interesting thing. I think that's part of the challenge is to say, look, everybody should care about their health in at least some fashion.
Balance is everything. I think a lot of people, especially on the Internet, value extremes. Like I would never or I would always. If you ever have read a single self-help book, it's like, "Never deal in absolutes because it'll disappoint you and destroy progress at all turns."
I think moderation is good for everything, but people have a hard time with that. Our nature is to be lustful after everything. Everybody loves to eat good food and loves to relax and loves to be hedonistic about things now and then. I'm not sure. I'm just not sure how the balance of people's lives and the way people think about their health, and we're all so busy. I'm not sure if we have room for that.
Rene: A depravity band might sell better than a health band.
Matthew: Exactly. Yeah, the sloth band, the gluttony band. Maybe, and maybe that's the pitch. Maybe the pitch is everybody needs to think more about health. We know you don't have a lot of time or necessarily the knowledge to interpret the signals that your body is giving you and that your lifestyle is giving you, so let us do that. Let us come in for $150 or whatever the case may be, strap us onto your wrist and we will help you with all of that.
So that you could get about your busy life, get an update once a day or week or an hour or a month or whatever the case may be, that tells you, look, this is the way everything is going for you. These are some suggestions about the way you could change things to make it better.
Maybe that's the pitch. Maybe that is the key: appealing to people's busyness, their lack of understanding, and their lack of time. Really, it all comes down to time...
Rene: Or it's just the gym card membership. Everyone will buy it because it makes them think they're doing something, even if they use it or not.
Matthew: [laughs] Right. The gym business is enormous. People spend billions of dollars. It is very, very common that people buy gym memberships and keep them and don't cancel them simply based on wishful thinking. That's an enormous part of that. Gyms factor that into their profit margin. If we can get people to subscribe, then they'll never cancel because of what they wished they were doing, but not what they are doing.
Rene: The deadbeat customers for them are the ones that show up.
Matthew: Exactly. Those are the people they resent. They have to clean the machines. They add up to wear and tear on the facilities. Yeah, absolutely.
Rene: One of the things you mentioned earlier is the sensors. That part fascinates me because arguably microphones were dumb for a long time, and then things like Siri and always-listening Google Now came along. Now they're context-aware, they can do sequential inference, they can do all these things that make them more convenient.
We had cameras, but now we have Kinect and maybe PrimeSense. Cameras can start to see who you are and how you're moving. The technology for wearables, maybe we had surfaces on all our devices but they were just surfaces, and maybe now they'll be able to read and tell things about us. How important do you think that sort of awakening, turning components into part of the processing system, is going to be for us?
Matthew: I think that's the lynchpin. I think you nailed it. That is exactly the building blocks that people are going to be using in the very near future to expand the way these devices work for us, or they're going to hope that's what they're going to do, anyway.
Really, in the end, everybody's just trying to sell more product. We can't attribute everything that they're doing, every technological advancement to some sense of altruism or advancement for humanity. But aside from the bottom line, let's just pretend for a minute.
I think that's the theory behind this next generation of advancements when it comes to contextual computing: we're going to be able to use the sensors on our devices, which are getting more powerful and more accurate with every generation, to provide us with a deep amount of contextual information that we'll be able to use to make our devices more intuitive, more friendly, more proactive about the information that they give us.
I think that obviously Google's been doing that on the data side for a while with Google Now, and I think they're doing an amazing job there. My Android device that I use is primarily a Gmail and Google Now machine. That's what it does. I prefer browsing on an iPhone simply because it's a better browser, for the most part. Chrome is decent but I still like the iOS browser, and there are certain experiences, app experiences, that you're only getting on iOS still.
But I think that there's just an immense amount of joy that I have from using, getting a notification from Google Now that anticipates my needs before I even...or I'm pulling up the phone to check and it's already there. That's great. It's just really, really a magical experience and great stuff, and stuff that only Google could do with their dataset.
Now, I think people are awakening to that. I think that it's going to become mandatory. I really don't think that contextual computing is something that's going to be the purview of one or two companies. I think literally every company that enters this space is going to have to investigate taking advantage of the sensors and the information gathered by sensors, and whatever data a user is willing to part with or share with them, to provide that anticipatory computing experience.
It's table stakes now. This is not an option. With something like Google Now, for instance, Google has a great fundamental building block for their next generation of mobile devices.
With Apple, I'm still waiting to see. They have some stuff, and I think they've been making a couple of small acquisitions that could help them in that space. The Today section, you could easily imagine how that could be fleshed out with a lot of contextual signals, especially if you use iCloud or Apple's native mail app or anything where you're passing large amounts of data through it.
It doesn't have to be an Apple mail account. They've got the data natively on the device there so they have signals that they can use to help you flesh out that section. They don't do a whole lot with it now. I think that's probably because iOS 7 was the big rush to just ship, and that's fine. But I think that there's a lot they could do with that and will need to do with that.
I know Microsoft just cut a deal with Foursquare for a lot of their data. They're going beyond the APIs to go deeper into Foursquare's anticipatory stuff when it comes to location. Where exactly you are, why you might be there. Intent-based computing is huge.
Knowing why somebody is where they are or why somebody is trying to get somewhere will help a lot, I think. Will alleviate frustrations that people have with using their devices and provide moments where they can be delighted by stuff.
The sensors really are the hardware component to anticipatory computing, to the contextual-based computing. That's that software side of stuff. I think we're going to see insane advances, and sooner than you may think, when it comes to sensors. We know that Amazon's working on a device, a phone with several cameras. They're playing with that in the labs and trying to get that to a shippable state that can do some spatially aware stuff.
Then you've got companies like PrimeSense, which Apple acquired, obviously. A lot of those companies are built around sensors that can gather more than just visual information. They can gather spatial information and do depth mapping. They can separate objects out from the environment and that sort of thing, so they can tell what's a room and what's in a room. Lots of indoor mapping stuff like that.
The major issue with a lot of those companies so far has been their power consumption. There's an enormous amount of power, in context, needed to run those devices. I mean they run off of USB, but we're talking like a full watt of power -- whereas you need a few milliwatts if you're going to be putting it on a mobile device with the batteries being in the state that they are now.
But I think those hurdles will get overcome. I think that people will find ways around that and find ways to create technology that enables them to put those devices right into a mobile phone. Kinect in a mobile phone, let's call it. I think we're going to see that stuff very soon. I think it's going to blow people's minds what it can do.
Now, where it goes from there, what experiences it provides and whether those are something that people quote-unquote need or want, I don't know. But I think that's the next big thing that everybody's going to be exploring. That's what I think this next 12 months or so is going to be about for a lot of these mobile device companies.
Rene: What's interesting too is if you start looking at the kinds of interfaces being built, whether it's the car, like Google Now's implementation in cars, or iOS 7 and the dynamic interface they're experimenting with, and iOS in the Car. That was bidirectional AirPlay, basically.
All the old interfaces, I would go to the home screen, look for an icon, tap the icon, or go to the home screen, look for a widget. It was always a pull process. I would have to go find the information. We're finally getting information that comes to us. If you can add context to that, the idea of a push interface suddenly gets really interesting, with that prescience that you mention.
Whether that card is coming up on my phone or if it knows I have a watch and that card is coming to my watch, it seems like we have the ability to take in that information, digest it, do something smart with it, but also we're getting the ability to give that back to us in a much more digestible form.
Matthew: Absolutely. I think a lot of that comes from the influence of apps. I think that the atomic units of information on the web were all over the place and still are. You don't know if it's going to be a feed or a stream or a module or an embed or a clip or whatever.
There are a lot of different atomic units of content on the web. I think that apps taught everybody that there could be a finite sort of module of content, that your rectangular device screen could present you everything in one frame. I think Apple had a lot of influence there with the way that they developed UIKit and made the screen a frame in which to paint pixels.
Of course, there are a lot of apps that still use scrolling and feeds and stuff like that. But I think that they actually painted a very distinct image of what's possible on a device screen and that you can treat that like a discrete element.
I think that the cards fed off of that. A lot of these cards interface stuff feeds off of that. They provide a distinct atomic unit of content, of presentation, that we can look at. I think that there were examples of cards before apps really came about and stuff, but I think that it really took off from that. That people understood it. It's easily parsable.
You don't have to worry about there being anything you can't see. It's not this large window. If you've ever used a remote desktop client on an iPhone, for instance, you know there's more out there and you're scrolling around with your little finger trying to click the reboot button on your server or whatever the case may be.
That feeling doesn't exist with mobile apps. What you see is what you get. You've got a window and you're being presented with all of that content right there, and it feels very finite. I think cards tap into that same kind of thing.
I think they provide easily parsable units of information that we can get sent on various devices and they still maintain cohesiveness. You could look at that card, like a Twitter card on the Twitter feed of your desktop and look at it on your phone and it's roughly the same exact atomic unit of content. I think that's a powerful thing for web companies, but it's also better for humans, because the cognitive load is lessened and your framework is set so you don't have to worry about the interface as much. That fades out, and you could concentrate on the content rather than, where are the buttons and where's the stuff and am I seeing it all?
Rene: In all the movies that we used to watch, there was this idea of artificial intelligence where this machine would be very human-like. But what we're seeing with these sensors is they're not artificially intelligent. They're just well-informed.
They're getting more of the contextual data you spoke of, and they're handing that data off. An M7 chip can hand that data off to an app that does pedometry or something, or a context-aware chip in a Moto X can hand that off to something that does natural language processing.
It's not the sort of scary Terminator coming to kill you sort of thing, it's this web of information that you can benefit from. It seems like the evolution is not the scary thing that we were afraid of, but much more of a, almost like a...I don't know what the right word is, but a very well-behaved butler who's just helping you through your day.
Matthew: I think the guys that worked on "Her" did a pretty good job of that stuff. Even if you just look at the trailer, you can see there's a vision there of computing that's significantly different than what we may have come up with a few years back.
Let's go back to the "Star Trek" thing. They have the lapel buttons or the lapel pips or the brooch for communication. The lapel pips, I think, were further on down the line in the timeline, but let's not go down the rabbit hole too far. They have the brooch that they could tap on.
But there's no discernible interface besides the action of tapping the button. It's all voice-activated. With that, they strip away all of the elements of interface and flash. In the same program, of course, though, they had the pads, which everybody likened to the iPad when it came out. Like, "Hey, look, there it is! This is our sci-fi tablet."
In a lot of ways, it is. That's exactly it. It's that thing where we all imagined that we would have something we would interact with and then we would tap on it and that would give us the information that we wanted at our fingertips. That's what we thought was the ultimate. Like, it's at our fingertips. How much better could it be?
I think that aspect of technology is...I don't think it's reached its peak yet. I think my daughter will grow up with a large amount of touch interfaces in her life, still. I think those will be around for a long time and I think they haven't reached their peak, the top of their bell curve.
Rene: The 5S and the Moto X are the worst phones she's ever going to know, Matthew.
Matthew: Yeah, exactly, exactly. I think it's just going to get better there for her. But like in the movie "Her," most of the interfaces are not seen. Most of the technology is unseen. There are devices that are needed for inputs and to carry around the computing power necessary. But most of it is, it fades away.
That's the next thing beyond this current wave of computing. The current wave of computing is everything at your fingertips, interacting like a physical object would but on a screen. Then the next one is not interacting with an interface at all, erasing the interface completely and utilizing your normal human interactions to get what you need out of a computing device. We could spool that thread out a long ways.
But I think that's what I'm seeing here is that our current generation of stuff, there's still a long way to go, a lot of refinement to do. Until I get an iPhone that's as thin as an iPod Touch, I'm not going to be too happy. I still see that they've got room to grow there.
But out beyond that, I think that the next iteration is really going to be about technology fading away completely and offering us just the information we need proactively. Saying, like, "Here's your appointments for the day. Here's how you get to where you need to go," without us having to go, "Hello, Google, how do I get where I need to go?" Not having to ask is the next step.
Rene: The obvious question then is how much...the price we pay for this is not really in money or time but in privacy. It's in getting past the idea that you control everything about yourself, because if you don't share that, you don't get the utility. That still seems to me like another part of the case that has to be made, or another hurdle that has to be overcome. I wonder if there'll be different cases for different companies.
For example, if Google can say, "We're giving you so much," or, "It's such a great experience, you're going to want to share your data with us." If a company like Apple can say, "We really want to make sure that you have the sense of privacy. We don't do as much as some other companies, but you can feel safe in what we do do." If there'll be different arguments that could be made around that.
Matthew: Obviously, privacy and security are never going to be too far away, once you start talking about any of this. Especially since we all became very well aware of how little privacy we actually do have. Most of us expected that, or suspected that, for years, but now we know for sure, in many ways, that nothing we put online is ever completely private.
I think that's an interesting way to live. My daughter will grow up in a world where she assumes that nothing she puts online is ever private. I'll teach her that, because that's what I've always believed. I've tried never to really put anything online no matter how private it seemed. But yet, I still do financial transactions online and all that stuff, and that's there. I'm no fool. I know that all of that stuff could be hacked, could be accessed, given the proper tools or time.
But I think that we definitely are going to...we're just really new to this whole thing. I think it's going to be very, very difficult to reconcile what we're going to be getting from these companies versus what we're giving. Any reasonable thinking human...I don't really think that we are ever going to be able to get value out of a company like Google that matches the value of the user data, of our digital souls, that we're giving them.
The question is really going to be how we can reconcile that. How we can say, "Look, I'm giving you way more than you're giving me, but I'm OK with that," I guess or I'm not.
Rene: The deal with the digital Devil.
Matthew: Yeah, exactly, and I think you're going to end up with a situation where there will be a major backlash against it. There already has been to a degree, but I think that what we're seeing right now is a lot of people seeing it as the tech companies against the government, and that they're on our side, so to speak.
I think we'll probably end up with a situation where, in a couple of years, there will be a significant amount of the populace that will have some sort of real aversion to that. I think we're already seeing some of it in this rise of ephemeral and similar private messaging services.
Consumer Internet is not always a direct indicator of what the public is thinking. Sometimes, it's just a fad. We'll have to ride this one out and see what Snapchat, and Whisper, and Secret, all this stuff is about.
I think that we walked into the Internet thinking that there was only one way to really do things.
Rene: We were naïve.
Matthew: Yeah, we were. That's not a bad thing. We were ignorant. Everybody was ignorant, really. Everybody was creating the rules as we went along. There wasn't really anybody to tell us otherwise.
I think we walked into this idea that everything we do online is permanent. Everything we do has to leave a trail and a record and will be parsed and used to service advertising, and all this stuff.
That's the Internet that we grew into. Maybe it's boiling the frog. Maybe we should have known sooner or should have reacted sooner and the Internet would be significantly different now. I don't think it would be as big. Now, we're seeing, I think, the early twinges of the next generation of people.
A lot of these apps are very popular with the young, like Snapchat, who are becoming cognizant of the fact that they don't want who they are, when they're 14, to be as easily accessible as who they will be when they're 25.
It's definitely an interesting point in the way the world is online. I think that big companies like Google and Apple will have a lot of hard questions to ask about how they want to approach these things, what the value is that they're offering the user, and what risks they're putting people at with how much data they do collect and keep.
I think that right now, Apple and Google have significantly different philosophies because Apple is a hardware company and does not need user data to make money. I mean credit cards are sure nice. That's user data.
Mailing addresses, and names, and credit cards are the backbone of iTunes. Apple has more payments power, nascent payments power, than any other company in the world.
When they leverage that, they're sure going to love the fact that they have that user data. I don't care if they are making most of their money off of hardware. If they're able to become a payments processor and make the three percent that Visa's making, off of every transaction, all of a sudden that user data is very, very important to them.
I respect people at Apple. I know people that work there. I don't have any sense that there's any nefarious ways that they think about user data. I get the sense that they respect user data. Historically, they have because they don't make their money that way.
There are possibilities for their business, in the future, that do involve them utilizing some voluntarily provided user information to make a lot of money.
Potentially even more than they make from an iPad or a Mac. Let's not talk about iPhones because they're just making so much money off of them now. It's definitely an interesting thing.
I'm not the kind of person that can ever look at Apple and go, "this company will never use user data," because A, they already do, in some ways, and then B, I don't necessarily think it's a bad thing.
It's a morally tricky situation. I don't think Apple is going to be able to take the high road forever, in terms of, "Oh, we don't use user data," because they have said publicly that they don't use data to serve ads, so they're essentially a better person than a company like Google is.
While I don't personally believe that, it doesn't matter really. That's the stance that they're taking because they sell hardware. I don't think that they're going to be able to maintain that position for long. Then you've got pretty much all the major companies win...
Excuse me, you've got all these major companies, almost all using user data to provide experiences, then you start asking those really hard questions about how much worth we're getting versus how much we're giving. I think it's going to be a very, very interesting series of questions to ask.
Rene: One of the other interesting things I want to pick up on that you said was you're going to have to explain this to your daughter, and it made me think: regardless of which company gets into wearables, right now Google Glass is a niche product, Samsung Galaxy Gear is a niche product, the Pebble is a niche product, the iWatch doesn't really...No one's shown an iWatch off on stage yet. But if and when these products start becoming mainstream, the case is going to have to be made at retail, or in some way to consumers.
One of the things you mentioned previously was that the stores, the end points, whether it's an Apple retail store, or a big box store, or we know Samsung is trying to get into retail, and Google had some barges, I'm not sure what happened with those -- maybe Jaws.
But they're going to require a way to make this stuff accessible and understandable to the mainstream, and that may not be the old model where they just had computers on shelves.
Matthew: Yeah, absolutely. I think that Apple's decision to go after Angela Ahrendts and hire her has a lot to do with how they're going to be positioning themselves for the future. And, unlike a lot of people, I think that some folks, when they write about the Ahrendts hire, are a little optimistic about how much input she's going to have in the product development process. That's just...
Rene: Yeah, they won't be any [indecipherable 0:58:26] iPhones.
Matthew: No, that's just not really how Apple works. I'm sure she'll have input as an SVP, but I just don't think that it's going to be her in the lab with Jony Ive as much as people think.
Aside from that, though, she's got a lot of qualifications that stand on their own. She's a great CEO, she pioneered growth, she pioneered growth in China. She has a lot of great qualifications that are totally separate from Burberry, from the fact that she was the CEO of a fashion company.
But let's put those aside and say, "OK, that's great." She would look great on paper regardless, but now you've got an additional layer of context when it comes to Apple and what they're selling. Over 50 percent of Apple's profits come from the iPhone now. Don't quote me on that, it's either profits or revenue, whatever. It's over 50 percent what they make...
Rene: They make a ton of money, a ton of their revenue.
Matthew: ...in general comes from...Yeah, the revenue is probably what I'm talking about, comes from the iPhone. Anyhow, they make an immense amount of money from the iPhone. The iPhone is a very personal computer, and I use that in the colloquial personal, not "PC, personal computer," but it's a computer that is personal to us. That is a very important factor when it comes to selling iPhones.
I think products like the iPhone and iPad, which make up the majority of Apple's sales now, have been adopted by the retail stores on the fly. They've bolted on the ability to sell those devices to consumers, to stores that, while great, and while they are wildly successful, and sell more per square foot than Tiffany's, they still were not designed to sell iPhones and iPads. They've undergone some revamps over the years.
The tables have gotten squarer, and the boxed accessories shelves have gotten smaller, or whatever the case may be, but there's never been a real rethinking of Apple's retail business centered around selling personal, with the lowercase P, computers. I think that selling the iPhone and iPad, and selling, say, an iWatch, are going to require different ways of thinking. Selling them into the future is going to require different ways of thinking about Apple's retail business.
I think that's one of the reasons that she was hired is because they...The computers that we're using now are extensions of ourselves, and they are very, very personal devices that we use, that we have in our pockets, that we touch, and hold, that we wake up and grab first thing in the morning, before anything else, before coffee, sometimes even before our glasses, and then we realize our mistake or whatever the case. We touch them constantly, we feel them, we hold them. They are an extension of us. People put skins on these, and cases, they want theming on them.
A lot of Android users love to theme and customize their devices because they feel it makes it more personal, and iPhones have famously had a very limited set of options for people wanting personalization, which is why the gold one is so popular, because everybody will know it's the new one, and it's different.
You get in this situation where perhaps people want these personal devices to be extensions of themselves, and to reflect their personality, and you need somebody that understands that in order to sell those devices. Yet, at the same time, you also need to have somebody that understands the value of a discrete, understandable brand identity, which Ahrendts did at Burberry very well, where they had a lot of copycat products on the market. She was able to clear a lot of that up and restore some prestige to the brand, and that sort of thing.
I think this definitely dovetails nicely with Apple's efforts to make their retail environment and their retail efforts more suited to selling personal computing devices like the iPhone, like the iPad, and like a wearable device such as the iWatch.
Rene: Is this something that's going to have to play out across the industry? We saw, I think, Samsung bought Carphone Warehouse in the UK. I'm really hoping they buy The Shack, because that place needs to change.
Do the companies that have these consumer products, are they going to have to have one-on-one, face-to-face relationships, or do you think that enough of our consumer habits are moving to the web that...?
Retail might be fine for Apple, but a Google, or a Samsung, or a Lenovo, or a whomever, Microsoft, doesn't necessarily need that.
Matthew: That's a good question. I think there are a lot of people suited to answer that question better than me, but I'll take a minor crack at it. You've got products that are very personal, very critical to our daily lives, and so I think there will always be a need to have your hands on them, and have somebody to talk to about them, and have that kind of face-to-face interaction.
I don't think that Google, for instance, could survive forever on selling online only Nexus devices, if they want to pursue that route. If they want to pursue direct sales, they're going to need a retail presence of some sort, at some point, before they turn that corner to really having a brisk retail business, if they decide to pursue that. They're in a really tricky situation when it comes to [indecipherable 1:04:02] partners, so we'll see how that plays out.
But if you've got companies like Samsung investing a lot of money in establishing retail environments, I think it's self-explanatory. Yes, they feel that is necessary. I feel that it's necessary as well.
If you have a device that you buy online, and you buy it because of reviews, or whatever the case may be, that's one thing. But being able to walk in and pick up a device, and use it, and hold it, is an enormous sales tool, even if you end up doing the final order online.
It's very, very difficult to describe some of the intangibles of [inaudible 1:04:41] it. I worked in retail for 10 years, and I sold...
Rene: Why I'm asking you. [laughs]
Matthew: ...digital cameras, and...Yeah, [laughs] yeah. It's hard to describe the difference between getting something online, and picking it up in a store, and having somebody there to just guide you through the process as long as they're a decent salesperson, and relatively knowledgeable. But it's very, very important, and it really makes a difference between a sale and not a sale a lot of times.
I think that's underestimated by a lot of people, because they see Amazon just owning these commodities markets. Why go to Target to buy toilet paper if it's the same price with free shipping at Amazon?
Rene: Yeah, your hundredth roll of toilet paper is way different than your first.
Matthew: [laughs] Right, exactly, exactly. I think that that gets confused a lot with buying a phone, a very personal device.
I think that it's definitely something that all of these companies are going to have to look at and explore, and I think that it's questions Apple's going to have to answer about how it wants to change its retail business, if its primary business is going to be personal computers, with that lowercase P once again, like the iPhone.
I think it really needs to reevaluate how it sells those devices, and how it treats training, and sales, and all that stuff, and how it presents it in its environment. It's going to be an interesting time I think. I think she's going to have an interesting road ahead of her.
Rene: 2014 is going to be a hell of a year.
Matthew: [laughs] Yeah.
Rene: If people want to read more of your writing on this stuff Matt, or they want to follow you, where can they go?
Matthew: They can go to Techcrunch.com, I do write as much as I can, so they can read me there, or they can read all of the other lovely writers that we have on our staff.
Rene: The Twitter and the Instagram?
Matthew: Oh Twitter is Panzer, P-A-N-Z-E-R, and Instagram, if you want to look at food, is MPanzarino.
Rene: There are a couple of cute daughter pictures once in a while.
Matthew: Yes, yes, also my very lovely daughter.
Rene: All right, Matthew, thank you so much. I really appreciate you joining us.
Matthew: Thank you, sir. I appreciate you having me.
Rene: That was awesome.