The contextual awakening: How sensors are making mobile truly brilliant

When Steve Jobs introduced the original iPhone in 2007, he spent some time talking about its sensors — the capacitive multitouch sensors in the screen that let you use your bioelectric finger as the best pointing device ever, the accelerometer that let the interface rotate with the phone, the ambient light sensor that adjusted brightness to fit the environment, and the proximity sensor that turned off the screen and capacitance to save power and avoid accidental touch events when the phone was held up to a face. Over the course of the next year, Jobs also introduced Wi-Fi mapping and then GPS so the iPhone could chart its location, and later still a magnetometer and gyroscope so it could understand direction, angle, and rotation around gravity. From the very beginning, the iPhone was aware.
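For a sense of how apps tap that awareness, here's a minimal Swift sketch using CoreMotion, the framework Apple eventually exposed for the accelerometer, gyroscope, and magnetometer; the 60 Hz update rate and the printed fields are illustrative choices.

```swift
import CoreMotion

// A minimal sketch: read the fused orientation that the accelerometer,
// gyroscope, and magnetometer make possible. The 60 Hz rate is illustrative.
let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion else { return }
        // Attitude is the phone's orientation relative to gravity,
        // the same data that rotates the interface with the phone.
        let attitude = motion.attitude
        print("roll: \(attitude.roll) pitch: \(attitude.pitch) yaw: \(attitude.yaw)")
    }
}
```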

The awareness was bi-directional as well. As the iPhone connected to Wi-Fi networks, so could it be used to map more networks. As the iPhone bounced signals off cell towers and satellites to learn its location, so could its location be learned, and information derived from it, like traffic conditions. As devices got smarter, so too did the networks connecting them. It was one, and it was many.

Siri: Some things you can ask

Microphones had been part of mobile phones since their inception, transmitting and recording the sounds of the world around them. They improved with noise cancellation and beam-forming, but they came alive with Voice Control and Siri. Originally an App Store app and service, bought by Apple and integrated into the iPhone in 2011, Siri made the microphone smart. All of a sudden, the iPhone could not only listen, but understand. Based on something previously said, it could infer context and carry it through the conversation. Rather than simply listen, it could react.

Google Now lacked Siri's charm but was far more brash about its scope. Hooked into the calendar and web browser, email and location, and a bevy of internet information sources, it wouldn't wait for a request; it would push data when time or conditions made it relevant. Thanks to contextual and natural language coprocessors, it could listen constantly for queries and parse them locally for better speed and better battery life. For Apple, processing so much of our data on its servers and "always listening" to what we say no doubt set off numerous privacy alarms, but for Google, and for those willing to make that deal, it enabled an entirely new level of functionality. The phone could now be there, waiting not only for a touch, but for a word.

M7 motion permission

Apple introduced its own coprocessor in 2013 as well: the M7 motion chip. Not only did it let the existing sensors persist, recording motion data even while the phone's main processor slept in a low-power state, but through that persistence it enabled new features. Pedometer apps, for example, could start off with a week of historical data and no longer had to rely on external hardware for background monitoring. Moreover, the system could realize when a person switched from driving to walking and record the location where they parked, making the car easier to find later, or realize when a person fell asleep and reduce network activity to preserve power. It could also pause or send alerts based not only on rigid times but on activity — for example, telling us to get up if we'd been stationary for too long. It meant the phone not only knew where and how it was, but what was happening to it.
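Here's a rough sketch of what that week-long head start looks like in code, using CMPedometer, the successor to iOS 7's CMStepCounter; the seven-day window mirrors the example above, and the print statement is purely illustrative.

```swift
import CoreMotion

// Query the motion coprocessor's stored history. The chip recorded
// these steps even while the app wasn't running.
let pedometer = CMPedometer()

if CMPedometer.isStepCountingAvailable() {
    let now = Date()
    let weekAgo = now.addingTimeInterval(-7 * 24 * 60 * 60)
    pedometer.queryPedometerData(from: weekAgo, to: now) { data, error in
        guard let data = data else { return }
        print("Steps over the past week: \(data.numberOfSteps)")
    }
}
```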

The iSight camera has been slowly evolving as well. Originally it could simply see and record images and video. Eventually, however, it could focus on its own and automatically adjust white balance and exposure. Then it could start making out faces, telling the humans from the backgrounds and making sure we got the focus. Later, thanks to Apple taking ownership of its own chipsets, the image signal processor (ISP) could not only better balance, expose, and focus images, but could detect multiple faces, merge multiple exposures into high dynamic range (HDR) images, expose dynamically, and reduce instability and motion blur in both the capture and the scene. The software allowed the hardware to do far more than optics alone could account for. Moreover, the camera gained the ability to scan products and check us out at Apple Stores, and to overlay augmented reality to tell us about the world we were seeing.
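That kind of face detection has long been available to third-party apps as well; here's a minimal sketch using CoreImage's CIDetector, with the input image left up to the caller and the accuracy option an illustrative choice.

```swift
import CoreImage

// Find faces in an image, the same basic capability the ISP uses to
// drive focus and exposure toward the humans in a scene.
func detectFaces(in image: CIImage) -> [CIFaceFeature] {
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    // Each feature carries a bounding box (plus eye and mouth positions)
    // that a camera app can use to prioritize faces.
    return detector?.features(in: image) as? [CIFaceFeature] ?? []
}
```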

Microsoft, for their part, is already on their second-generation visual sensor, the Kinect. They're using it not only to read people's movements around them, but to identify people, to try to read their emotional state, and even some amount of their biometrics. Google has experimented with facial recognition-based device unlocking in the past, and Samsung with things like pausing video and scrolling lists based on eye tracking.

Apple has now bought PrimeSense, the company behind the Xbox 360's original Kinect sensor, though their plans for the technology have yet to be revealed. The idea of "always watching" is as controversial as "always listening", if not more so, and comes with the same type of privacy concerns. But what Siri did for the iPhone's "ears", these kinds of technologies could do for its "eyes", giving them a level of understanding that enables even better photography, security, and more.

Touch ID

Touch ID, Apple's fingerprint identity system, is already doing that. It's taken the Home button from a dumb switch to a smart sensor. Instead of verifying us based on a passcode we know, it identifies us based on who we are. Apple has hired other biometric sensor experts as well, though they haven't yet announced just what exactly they're working on. Yet the idea of devices even more personal than phones — wearable, capable of tracking not only fitness but health — is compelling. Pair them with the concept of trusted Bluetooth — something you have — to verify identity, and one day what knows us on a biological level could be used to unlock the technological world around us.
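Apple hadn't yet opened Touch ID to third-party apps when this was written, but the LocalAuthentication framework it later shipped looks roughly like this sketch; the reason string is a placeholder.

```swift
import LocalAuthentication

// Ask the fingerprint sensor to verify who we are, not what we know.
let context = LAContext()
var authError: NSError?

if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &authError) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your data") { success, error in
        if success {
            // Identity confirmed biometrically; no passcode typed.
            print("Authenticated")
        }
    }
}
```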

That's the next great frontier. Phones now know and understand more than ever about their own place in the world, and their owners', but the world itself remains largely empty and unknowable. iBeacons, also introduced in 2013, could help change that. While satellites orbit the earth, cell towers dot the landscape, and Wi-Fi routers speckle homes, schools, and businesses, iBeacons are meant to fill in all the spaces in between, and to provide information beyond just location. Connected via Bluetooth 4.0 Low Energy, they could eventually guide navigation everywhere from inside stores, schools, and buildings to vast wilderness expanses. iBeacons promise a network as rich as the world around it.
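In code, listening for an iBeacon is CoreLocation region monitoring; here's a minimal sketch, with a hypothetical UUID and identifier standing in for a real deployment.

```swift
import CoreLocation

// Monitor for one beacon region. The UUID and identifier below are
// hypothetical placeholders, not a real deployment.
class BeaconListener: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "com.example.store-entrance")

    func start() {
        manager.delegate = self
        // Region monitoring requires "always" location permission.
        manager.requestAlwaysAuthorization()
        manager.startMonitoring(for: region)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // Fires when the phone comes within Bluetooth LE range of the
        // beacon, filling in the spaces GPS and Wi-Fi can't reach.
        print("Entered beacon region: \(region.identifier)")
    }
}
```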

Thanks to the "internet of things", where every device with a radio can also be an iBeacon, including our phones and wearables, and can also sense and transmit its understanding and capabilities, everything could eventually tie together. Nest already makes connected thermostats and smoke detectors. Nexia already makes connected door locks and security systems. Almost all car companies offer connected automotive options. Eventually, everything that controls or senses our environment will be able to hand over that understanding and control. Apple doesn't need to make most or any of this stuff; it just needs to be the most human, most delightful way of connecting it all together.
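Apple later shipped HomeKit for exactly that connective role; here's a short sketch of what discovering a connected home looks like, with the logging purely illustrative.

```swift
import HomeKit

// Enumerate the connected home: thermostats, locks, lights, and more,
// from any vendor, behind one common interface.
class HomeBrowser: NSObject, HMHomeManagerDelegate {
    let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        for home in manager.homes {
            let accessoryNames = home.accessories.map { $0.name }
            print("\(home.name): \(accessoryNames)")
        }
    }
}
```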

CarPlay is an example. The opposite of a wearable is a projectable. Apple did that early on with AirPlay and the Apple TV. They don't have to make a television; they can simply take over the screen. They don't have to make a car; they can simply take over the infotainment system. How many screens will one day be in our lives? Imagine iOS understanding most or all of them and presenting an Apple-class interface, suitable to the context, that's updated whenever iOS is updated and gets more powerful and capable whenever iOS devices get refreshed. It might take the development of dynamic affordances, push interfaces, and other more malleable concepts, but one day the phone in our pocket, the device we already know how to use and that already knows us, could simply exist on everything we need to interact with, consistent and compelling.

iOS 7 preview: iOS in the Car

It won't be The Terminator or The Matrix. These won't be AIs out to destroy us. It will be Star Trek or JARVIS from Iron Man: devices capable only of helping us.

Traffic will take a turn for the worse. We'll glance at our wrist, noting we need to leave for our appointment a few minutes earlier. The heat at home will lower. Our car will start. It'll be brand new, but since our environment is in the cloud and our interface projects from the phone, we'll barely notice the seats moving and heating, and the display re-arranging as we get in. The podcast we were listening to in the living room will transfer to the car stereo even as the map appears on screen to show us the way. The garage door will open. We'll hit the road. The same delay that made us leave early means we'll be late for our next meeting. Our schedule will flow and change. Notifications will be sent out to whomever and whatever needs them.

We'll arrive at the building and the gate will detect us, know our appointment, and open. We'll be guided to the next available visitor's parking spot. We'll smile and be recognized and let in, and unobtrusively led to exactly the right office inside. We'll shake hands as the coffee press comes back up, our preference known to the app in our pocket and our beverage steaming and ready. We'll sit down, the phone in our pocket knowing it's us, telling the tablet in front of us to unlock, allowing it to access our preferences from the cloud, to recreate our working environment.

Meeting done, we'll chit-chat and exchange video recommendations, our homes downloading them even as we express our interest. We'll say goodbye even as our car in the parking lot turns on and begins warming up, podcast ready to resume just as soon as we're guided back to it and into listening range. On the way down, we'll glance at our wrist again and note we need to eat something sweet to keep our energy in balance. A vending machine on the way will be pointed out, and the phone in our pocket will authorize a virtual transaction. A power bar will be extended out towards us. We'll grab it, hurry down, and be on our way to work even as our 4K displays in the office power up, the nightly builds begin to populate the screen, and the tea machine starts brewing a cup, just in time, ready and waiting for us...

Right now our phones, tablets, and other mobile devices are still struggling as they awaken and drag themselves towards awareness. It's a slow process, partially because we're pushing the limits of technology, but also because we're pushing the limits of comfort. Security will be important, privacy will be important, humanity will be important. If Apple has had one single, relentless purpose over the years, however, it's been to make technology ever more personal, ever more accessible, and ever more human. Keyboards and mice and multitouch displays are the most visible examples of that purpose. Yet all of those require us to act — to go to the machine and push it around. Parallel to input methods, there's been a second, quieter revolution going on, one that feels and hears and sees, and that ultimately not only senses the world and network around it, but is sensed by them. It's a future that will not only follow us but, in its own way, understand.

Update: CarPlay, just announced, has been added to this article.

Rene Ritchie

Editor-in-Chief of iMore, co-host of Iterate, Debug, Review, Vector, and MacBreak Weekly podcasts. Cook, grappler, photon wrangler. Follow him on Twitter and Google+.



Reader comments


[Continuing to imagine]
- Traffic relay nodes to help auto-adjust cruise control speeds when traveling.
(I can just hear Siri telling me I should slow down now.)
- Provision awareness to indicate you should get dog food while traveling home.
- Detecting injured iOS device users, assessing the situation for emergency assistance, and notifying family or loved ones.
- Auto capturing and broadcasting to your Apple ID account the image and location of a thief who's attempting to access a stolen iPhone/iPad.
"Oh, the possibilities."


I went to Microsoft Build over the summer and one of their researchers did a presentation on ubiquitous computing. He specifically mentioned the iPhone and called it a "miracle" of modern day computing. Thought it was very interesting. I can't add a link directly in this comment but if you Google search "Designing for Ubiquitous Computing", the first result will contain the video presentation as well as the PowerPoint slides.

I forgot to ask... It looks like you're entering a spaceship... Is this an actual place or is it composite wizardry?


Nice write up Rene. Lots of exciting possibilities coming. I'm just now researching home automation products. Prices are coming down so more people will adopt. Hopefully that leads to a ton of new ideas and new tech.


I've been thinking about that subject for a couple of weeks now, and your prediction makes total sense; besides, it seems so easy to fulfill. My only regret is that I might not be around long enough to experience it all.

I'm pretty excited about the number and prices of the home automation products that are out now. I think within 10 years, we'll be living in a science-fiction world.

Do you really think the constant misdirections in Maps are accidental?
Siri has a twisted sense of humor!
xD


Great read. It's both terrifying and exciting to think about all the possibilities that technology can bring to us in the future.
Terrifying in the sense that, little by little, it seems we're losing something in exchange for these new technologies. Then there's the more common issue of privacy.


The technology is indeed very cool and potentially very useful, but it makes me wonder how, in another generation, it will be used by noncommercial entities. In other words, if we're complaining about how much we're being tracked now, what will it be like with all of these sensors on board, with us wherever we go?

okay putting my aluminum foil hat back on

A discussion of iBeacon should also mention Gimbal, which Qualcomm announced a year ago but only started to roll out last week. The beacon hardware is cheaper, and, to the end-user iOS app, it should perform indistinguishably from an iBeacon; theoretically it will be cross-platform as well. (iBeacon would definitely have the advantage that the iOS device itself could be a transmitting beacon.)

I don't see a single thing here that is worth suffering even a handful of the obvious downsides.

This reminds me of how in 1999 we were saying that by 2005 there would be no more grocery stores because everybody would be ordering their groceries over the Internet. I mean — duh! Groceries! So simple and easy for genius programmers to conquer with technology.

There are just so many really bad things that can happen to you when your life is being recorded like you suggest, and you're doing it so you don't have to leave your desktop computer powered on all the time, or wait 8 seconds while a video spools up for immediate streaming?

Far from just being the beginning of sensors and tracking, I think we are right before the crash and the backlash already. The NSA revelations alone are enough to make any idea of a benevolent sensor future absurd. We don't have reliable encryption right now. The amount that we've already built out this digital/network/sensor society is already too much. The encryption foundation turns out to be rotten and made of sand. We're not going to go from brick houses to skyscrapers on that foundation.

And nerds have to understand, when my girlfriend upgraded from an iPhone 3G to an iPhone 4S and discovered her 3rd party apps were now doing things in the background, she was HORRIFIED. She wanted to know where she can turn that feature off. The feature of course was Multitasking, which nerds had been demanding at that point for like 3 years, like they were ready to storm Apple's gates if they did not get it. Did any nerds ever stop to ask if maybe some users don't want apps doing things behind their backs? No. Because nerds have a complete and total disregard and even disdain for people who aren't nerds. All nerds want from the rest of us is to subsidize their chosen computer platform with our numbers. That is why it is so disturbing that even Apple has fallen into the trap of nerds building for other nerds, which is why iOS 7 makes the gadget blogs so damned happy and all my friends so damned unhappy. So I understand how great this sensor vision sounds to some people. But please, understand how much, if not most of the world does not want this vision.

Some parts of the universal sensor experience that I think you left out:

“About once a month, you are arrested because suspicious patterns of movement have been identified in your sensor data by law enforcement algorithms. You remember reading about the first time that happened to someone, back in 2009. You are held each time for 8 hours and then released. You have been raped twice while in lockup, and beaten up 9 times. The most recent time was the worst, because the pharmacy refused to fill your pain medication prescription: after 9 brutal beatings, you have taken so much pain medication that your biometric sensor data showed you were in danger of becoming addicted.

“Your tax burden goes up 25% per year as governments find new ways to tax you based on what is revealed in your sensor data. You receive $1000 in traffic fines every month for things like 65 incidents of wavering out of your lane, 25 incidents of driving more than 5 kph below the speed limit, 18 incidents of driving more than 10 kph below the speed limit, and 11 incidents of parking at an imperfect angle. You have also been fined repeatedly for eating a diet that is too high in sugar, as well as for not taking your cholesterol medication. These fines are conveniently taken out of your paycheck before it even reaches your bank.

“About 25% of your city is now off limits to you, because law enforcement is notified when someone in your income bracket or lower enters those neighborhoods, and you are pulled over immediately, your car searched, and then you are turned back or arrested. Your vote hasn't been counted in the last 2 elections because a mysterious bug in the voting machine's biometric sensors causes it to fail to register the votes of people with faces that show any trace of African heritage. A similarly mysterious bug has also made it impossible for you to enter certain malls and retail stores, where the security cameras are now flagging all non-European faces as shoplifting suspects.

“You have been mugged 14 times by thieves who identified you as a juicy target by hacking the sensors in the devices you were carrying. But that wasn't nearly as bad as the 3 times you've had to change your name due to identity theft, losing all of your property each time as well.

“You always have a feeling of panic and despair no matter what you are doing. You never stop dreaming about an island somewhere and building your own house out of logs.”