Person-centric computing: The future beyond iOS and OS X

This is the dream: You pick up a phone or a tablet, you sit down at a laptop or desktop, you walk up to a display of any kind, and all of your stuff is just there, ready and waiting for you to use or enjoy. It's the future of decoupled computing, where intelligence is independent of environment, and where the device-centric world gives way to the person-centric experience. It's a future where iOS or OS X, cloud or client, are abstract terms no mainstream human ever has to deal with or worry about. It's getting a lot of attention lately thanks to comments on convergence from Apple executives in an interview with Jason Snell of Macworld, and the catchy "iAnywhere" label from a couple of analysts. What's more interesting to me is not so much the idea (computing as we know it will obviously continue to evolve) but the implementation. How could Apple make the person-centric experience a reality?

Here's what the analysts said about iAnywhere:

This "next big thing" for Apple would essentially be a "converged platform" featuring the company's Mac OS and iOS operating systems in which an iPhone or an iPad would be able to "dock into a specially configured display to run as a computer."

It's far from an original idea, of course. Apple itself filed patents for roughly similar technology, a portable that docks to become a desktop, going back at least as far as 2008:

A docking station is disclosed. The docking station includes a display and a housing configured to hold the display in a manner that exposes a viewing surface of the display to view. The housing defines a docking area configured to receive a portable computer. The docking area is at least partly obscured by the display when viewed from the viewing surface side of the display at an angle substantially orthogonal to the viewing surface.

OS X merging with iOS has likewise been a popular discussion topic since OS X Lion brought iPad ideas "back to the Mac". It made sense at the most superficial design and naming level for comfort and consistency, but never at the deeper levels, and certainly not for anything approaching convergence. It's the "toaster-fridge" Tim Cook made fun of. Or, as Apple's SVP of Software Engineering, Craig Federighi, said to Snell:

"The reason OS X has a different interface than iOS isn't because one came after the other or because this one's old and this one's new," Federighi said. Instead, it's because using a mouse and keyboard just isn't the same as tapping with your finger. "This device," Federighi said, pointing at a MacBook Air screen, "has been honed over 30 years to be optimal" for keyboards and mice. Schiller and Federighi both made clear that Apple believes that competitors who try to attach a touchscreen to a PC or a clamshell keyboard onto a tablet are barking up the wrong tree.

"It's obvious and easy enough to slap a touchscreen on a piece of hardware, but is that a good experience?" Federighi said. "We believe, no." [...] "To say [OS X and iOS] should be the same, independent of their purpose? Let's just converge, for the sake of convergence? [It's] absolutely a nongoal."

The important thing to remember here is that the Mac didn't always run OS X, and in another 30 years, hell, another 10 or even less, it might not run OS X anymore either, nor might iOS as it currently exists be found on any hardware. That's why what Federighi is really saying is that the traditional WIMP (windows, icons, menus, pointer) interface shouldn't be jumbled together with multitouch. They're different paradigms for different hardware.

What's not being said is whether one will eventually supersede the other, the way punch-cards and, to a large extent, command lines, have been superseded by the GUI and multitouch, or whether they'll all be replaced by something else entirely, like natural language or the cloud. Examples of all of them have either already been tried, are in progress, or have been talked about for years.

Layering interfaces together

Microsoft tried to layer multitouch and desktop interfaces into Windows 8. It was pitched as "no compromises" but has generally been rejected by the market as the ultimate in compromise. Instead of the best of both, it ended up the worst of both. It may not have been obvious from the start. Many people still ask for touchscreen Macs or OS X on large tablets, for example. Hell, I used to think I wanted an iOS layer on top of OS X, the way Front Row or Dashboard works, or even Launchpad. Not anymore, and it's likely no coincidence that Front Row is now gone, Dashboard is basically abandonware, and only Launchpad remains. That's because it was obvious to Apple, and hopefully to everyone in hindsight: as much as the medium is the message, the interface is the experience. Bifurcate one and you bifurcate them both.

Although multitouch and desktop interfaces haven't layered well together, natural language is currently working in exactly that way. Siri sits "on top" of iOS and Google Now sits "next to" Android. Both are only a press, a swipe, or an utterance away. That makes a lot more sense. Neither can replace the multitouch interface right now, but both can enhance the experience.

Taking it to the cloud

Google's Chrome OS is the best real-world implementation of the thin-client model we've seen so far. All your computing lives in the cloud. You log into a device, conceivably any device, and it becomes yours. Google's only doing it on laptops and desktops for now, but it's not hard to imagine that tablets, and even phones and wearables, can and will follow. I don't think the Google Now card interface was chosen accidentally or haphazardly, nor do I think Google services have been abstracted from Android's core solely to shape the current market.

Yet the pure cloud approach has many limitations as well. Connectivity remains an issue, as do limitations on software power and performance. Concerns about privacy and security remain, too. For many people and many activities, however, convenience could well win out.

Apple's iCloud, by contrast, stores and syncs account information between both iOS and OS X, but only backs up iOS settings and data, and needs apps and media to be re-downloaded to local devices. It's a cloud solution from a device company, and while it can and will improve, the idea of everything in the cloud may not fit Apple's vision for computing.

Making the brain mobile


Last April, former BlackBerry CEO Thorsten Heins caused some controversy by saying tablets weren't a long-term business. He was ridiculed because, well, iPad. His vision, however, was the internet of things, where a phone could be the brain that powers multiple different computing end points, including tablets and traditional computing displays (with mouse and keyboard). Here's what I wrote about the idea back then:

The futurist in me wants to take that a step further, to where the computing is decoupled from device, and the "brains" are a constant thing we always have with us, hooked in everywhere, capable of being expressed as a phone or tablet or laptop or desktop or holodeck for that matter. All my stuff, existing everywhere, accessible everywhere, through any hardware interface available.

It's not dissimilar to the vision of computing shown off over a decade ago by Bill Gates during one of his impossibly forward-thinking CES keynotes. Unfortunately, just like with tablets, Microsoft couldn't get past their Windows-everywhere, PC-centric point of view, and so a decade later they're no further ahead than anyone else when it comes to actually delivering it.

What's interesting about the device-as-the-brain, however, at least compared to cloud-as-the-brain, is that it's not entirely reliant on someone else's server. Because it doesn't have to depend on the internet, it can sit in your pocket and make its own ad-hoc, direct networking connections as well.

Palm's Foleo laptop companion died on the vine, and the original purpose of the BlackBerry PlayBook was never allowed to be. Both of those things were for good reason. They simply weren't the right implementations, and they absolutely weren't at the right times.

The sum of the parts

What's also possible, perhaps even more likely, is some combination of all of the above: a device with multitouch, natural language, and sensor-driven interface layers that connects to the cloud for information and backup, but also serves as the central point for identity and authentication.

I've mentioned this kind of person-centric future briefly in a previous article, The contextual awakening: How sensors are making mobile truly brilliant:

We'll sit down, the phone in our pocket knowing it's us, telling the tablet in front of us to unlock, allowing it to access our preferences from the cloud, to recreate our working environment.

That, I think, is the key to making this work. Rather than docking, projecting device trust and even interface is more forward-thinking. Understanding the context not only of us, but of the screens around us, simply makes the most sense. Rather than old-style UNIX user accounts on iPads, for example, Touch ID, iCloud, and device trust could do the same thing. Projecting from phone to wearable, or from phone to tablet or computer, could do even more.
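To make that concrete, here's a minimal sketch of what projecting device trust could look like: a phone that has already verified its owner answers a nearby screen's challenge, and the screen unlocks. Everything here is hypothetical, the Phone and Tablet classes, the shared pairing key, and the HMAC challenge-response are illustrative stand-ins, not anything Apple has described:

```python
import hashlib
import hmac
import os


class Phone:
    """Hypothetical 'brain' device holding a secret established at pairing time."""

    def __init__(self, pairing_key: bytes):
        self._key = pairing_key

    def answer_challenge(self, challenge: bytes) -> bytes:
        # Sign the other device's challenge. In a real system this would only
        # happen after a local owner check, e.g. Touch ID.
        return hmac.new(self._key, challenge, hashlib.sha256).digest()


class Tablet:
    """Hypothetical nearby screen that unlocks when a paired phone responds."""

    def __init__(self, pairing_key: bytes):
        self._key = pairing_key
        self.unlocked = False

    def request_unlock(self, phone: Phone) -> bool:
        challenge = os.urandom(32)  # fresh nonce per attempt, so replays fail
        response = phone.answer_challenge(challenge)
        expected = hmac.new(self._key, challenge, hashlib.sha256).digest()
        self.unlocked = hmac.compare_digest(response, expected)
        return self.unlocked


key = os.urandom(32)  # shared once, at pairing time
phone, tablet = Phone(key), Tablet(key)
print(tablet.request_unlock(phone))  # a paired phone unlocks the tablet
```

The point of the sketch is that no account login is involved: the phone in your pocket is the credential, and any screen that shares a pairing relationship with it can light up as "yours".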

Some of the technology for this already looks to be underway. Here's what I wrote last year on the concept of "iOS everywhere" branching off from iOS in the Car:

Traditionally Apple hasn't done as well when they have to depend on other companies, but the potential of iOS in the Car seems to go further than just the car. Indeed, it could provide our first hints of iOS everywhere, and that's incredibly exciting for 2014, and beyond. [...] And, of course, seeing Apple project iOS interface beyond just TV sets and Cars, but onto all manner of devices would be fantastic as well. Apple doesn't make the range of products a Samsung or LG make, nor do they have any interest in licensing their operating systems the way Microsoft, BlackBerry, and Google do. However, taking over screens neatly sidesteps both those issues, and keeps Apple in control of the experience, which they're fond of. So we'll see.

Imagine it: You walk into your house with your iPhone, and your lights come on, and so do your TV and home theater, exactly where you left off. Your iPad and/or your iMac also turn on, your current activity locked and loaded. And all of it happens not because of a fiddly dock or an inconvenient login, but because they know you, they know your stuff, and they know what you want to do with them next.

Beyond iOS or OS X, beyond "iAnywhere", that's the dream many of us have been waiting for.

Rene Ritchie

Editor-in-Chief of iMore, co-host of Iterate, Debug, Review, Vector, and MacBreak Weekly podcasts. Cook, grappler, photon wrangler. Follow him on Twitter and Google+.



Reader comments


27 Comments

That's why I prefer the idea of person-centric computing, where your identity tracks between machines and interfaces, rather than glomming those machines and interfaces together.

Agreed, but Apple is in the business of selling hardware and person-centric computing is not something that fits their current priorities.

Interestingly enough, what you are talking about is what Scott McNealy was pushing when Java first came around.

Yup, and later Ellison as well. I think Apple could and would do it, because the way I envision it it would play to their strengths. Great hardware, great ecosystem, with authentication and identity pushing it all together, creating both better value and experience.

i totally agree with rene's vision about the strength a great ecosystem could play in a strategy i call iCompute rather than iAnywhere. iCompute weaves the devices i own into a network of authorization which isn't dependent on cloud services i don't own. but iCompute opens up iOS and OS X to a point where they could benefit from each other's features. think using the iPhone's gyroscope as input for OS X.

I agree. In fact, I'd posit that Apple may be the only company currently capable of doing it (short of multiple companies working together), but unfortunately they probably won't, as it conflicts with hardware profit.

What if it could enhance hardware profit? You'd need Apple versions of anything they make (iPhone as brain, iPad, iMac as projection options), and they could make deals like they did for iOS in the Car for anything they don't make?

iOS in the Car is vaporware at the moment, just as hands-free Siri is DOA (people forget that had the same hoopla). That alone tells me that Apple isn't really interested in pushing such an idea. Hell, I'd be happy with something small, like multiple users per iOS device. Another indication (to me) was Apple's complete disinterest in the acquisition of Nest, as there are strong hints that Fadell proposed a buyout from Apple and Apple said no. Nest seemed to be the perfect gateway to home integration for Apple.

That said, I like this train of thought. I imagine someone at Apple is at least thinking of stuff like this and doing prototyping. Oddly enough, Samsung would be a good partner for this as they make a ton of home appliances/electronics.

Apple buying Nest makes zero sense. They focus on very specific things and add products very slowly, typically by annexing whatever's next to the last product they released.

Apple won't be making thermostats any time soon.

Also, Fadell doesn't really have a place in the modern Apple. Google needs him more. Apple needs whatever the server-side equivalent is of a Fadell...

And iOS in the Car is a proof of concept for bi-directional AirPlay. Tons of possibilities in that, and in areas Apple does focus on...

"iAnywhere equates to bad times for iOS and Apple."

Not if Apple can monetize that iAnywhere-ness. They're already ramping up their content + services revenue to replace lost revenue from declining hardware pricing in the future, and putting all the pieces in place for new revenue streams (retail, television, etc.)

"... projecting device trust and even interface seems more forward thinking. "

Yes, but only for the next 10 years or so. Beyond that we'll see less and less chrome. Displays will be cheap and generic, especially pocket-sized displays. But they won't be used for input much. Your main device's CPU will be just an earpiece, it will be constantly connected to the cloud, and it will use whatever display is in front of you. Voice will become more pervasive in all aspects of computing.

In other words, pretty much like in "Her."

Unless some form of local priority has value, especially security and privacy value. That becomes your token/"something you have", and the hub of the projected experience. If Google has the trust store on the cloud, but Apple keeps it local in your hand, that could be a differentiator that matters to people?

that's exactly the point: most people i speak to about the future of computing don't really trust the cloud - they wanna keep a feeling they own it, as it's local to their hand, home etc.

I, too, feel the same. Especially since there are many moments when network accessibility is impaired. I want all my stuff in my pocket.

Sent from the iMore App

There's an app for android called "Tasker" that allows for many of these functions, some in a limited way while others more extensive.

Biggest drawback is the crazy amount of setting up you have to do to get it all functional.

Sent from the iMore App

Great article and forward thinking Mr. Ritchie!

I don't see nearly as many holes in this argument as I do with iAnywhere, Windows Everywhere... or even Google's services and cloud everywhere approach. Apple will do just fine on the hardware front as you suggest above, if not only for security purposes. I think individual optimized apps, on whichever screen/device the iPort is in vicinity to connect securely to, will be with us for some time to come.

IMHO, Apple knew at the patent date, and was already planning for, the day in the near future when sensors, chips, and ubiquitous wireless connections would be the norm sooner rather than later. They took a lot of flak for not including a universal USB or SD card slot for moving data across devices. The haters chalked it up to Apple protecting its store. I always maintained that those connections would not be needed in the near future, so why include them, just to enter a PR s***storm when you drop the connectors at a later date... that was not far off.

They didn't include USB or SD (other than the adapters) for the same reason they didn't include floppy disks on the iMac. The past is a retardant to the future. :)

Unfortunately, we will continue to use USB and SD for some time to come, and people will use DVDs a lot longer. Just as people use (Apple) stores to purchase products instead of ordering everything from the internet. The floppy disk was replaced because a different item existed that we could use instead; Apple replaced USB/HDMI/SD at the peak of their popularity with what, exactly? Yes, I still want those in my tablet; life is easier with them and I like to have options. The cloud is fun, but switch off the web connection and your device becomes a brick.

As someone who works in IT, what I want to see is the hardware become irrelevant. When computers break down, the first thing for me is to find a temporary computer so the user isn't sitting around. Productivity is the first rule, and that means my role is to keep the users productive. The problem with grabbing another computer is that the user's settings aren't there, applications have to be loaded, and data may have to be migrated from the ailing computer if possible. So I spend half a day just getting a temporary computer ready. Microsoft has an interesting idea where the computer is on a USB stick. The hardware becomes irrelevant because you plug the USB stick into a computer and it becomes "your" computer. It's sort of the thin client without the need for connectivity. It's not the best implementation, and, like tablets, in a few years someone like Apple or Google might make the idea actually work, but for me that's what I want.

The USB stick is interesting but likely just as easy to lose or error-out. Google's cloud store, if you trust it, is likely more reliable.

The idea of a device-brain is also subject to loss or data error, but can also be backed up to the cloud for restoration.

Interesting times!

I'm not sure the Cloud is the answer. I think Apple is better to envision a world where devices communicate with each other. Google is far more cloud-centric but it needs to be. Google is concerned with what you're doing so that Google can mine you for data. Apple has always been about your experience. It makes more sense that if you interact with devices that these devices interact as well so that they can share data for you, not the Cloud. I'd prefer a world where the Cloud is a device and not the "brains." It's safer and gives me more control.

Really nice read Rene. I couldn't agree more with many of the points you've raised. I prefer to have all of my stuff on my singular go-to device that enables me to access any of my stuff at any time w/out regard to network accessibility.

Sent from the iMore App

Nice article. If I may add one idea to the conversation: the hardware/software seems to be there already. iBeacons are being used in MLB ballparks as we speak. Location-based handshakes and other login info could be easily (mainly because I don't have to do it) integrated in a merge of iOS and OS X. By location, it could activate the ??? file/website/game/etc. on the device being used, to its full potential (limited for iOS, but full-on for OS X), and let iOS and OS X do their things independently. Pages on iOS is the limited version, and the full set is on OS X. I do this all the while in Keynote.

I think that Microsoft has almost nailed it with Windows 8.1. Of course it was rough around the edges (storage problems, apps, etc.), and everything wasn't as smooth as it should have been (how much time did the initial setup of the Surface take out of the box?), but if it were something made by Apple, it would have been embraced as the only sensible way. Apple would ridicule any other approach if Surface were their device: a touch interface that fully suits your needs on top, and the familiar classic desktop system in the background (which you can pretend isn't there if you don't want to use it). I hate the desktop UI, but unfortunately it's still a necessary evil for managing your files, while the constant switching of apps (iOS style) to check/copy/move something is no way at all.

I think this idea of "iPhone as Brain, tablet or laptop as destination for projecting interface" doesn't go far enough. Why does the "Brain" part have to be a device with an interface, and what makes the "smallest screen" (the iPhone) distinct from the larger screens (iPad, MacBook, TV)?

Applications are already designed using the Model-View-Controller principle. Imagine applications designed with a single piece of Model code (the business-logic bit) that runs on the Brain device, plus multiple decoupled View-Controller modules, customized for the UI in question (mouse/keyboard, multitouch, voice, whatever), that are either run on the Brain and projected to the Interface device, or run on the Interface device directly. Your "phone" would then be just another multitouch-display Interface device.
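The split described above, one Model on the Brain, interchangeable View-Controllers per screen, can be sketched with a simple observer pattern. All the names here (Model, TouchView, DesktopView) are hypothetical illustrations of the commenter's idea, not any real framework:

```python
class Model:
    """Single piece of business logic, imagined as running on the 'Brain' device."""

    def __init__(self):
        self.items = []
        self._observers = []

    def attach(self, view):
        # Any screen that comes into range gets a projection of current state.
        self._observers.append(view)
        view.render(self.items)

    def add(self, item):
        self.items.append(item)
        for view in self._observers:  # every projected interface stays in sync
            view.render(self.items)


class TouchView:
    """View-Controller tuned for a multitouch screen (hypothetical)."""

    def render(self, items):
        self.output = " | ".join(items)  # sparse, tappable layout


class DesktopView:
    """View-Controller tuned for mouse and keyboard (hypothetical)."""

    def render(self, items):
        self.output = "\n".join(f"- {i}" for i in items)  # dense list layout


model, touch, desktop = Model(), TouchView(), DesktopView()
model.attach(touch)
model.attach(desktop)
model.add("Mail")
model.add("Calendar")
print(touch.output)    # Mail | Calendar
print(desktop.output)
```

The same state produces a different presentation per interface, which is the point: the Brain owns the data and logic, and each screen only owns how it looks.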

The Brain could be the ultimate "wearable": a piece of clothing you have against your skin that harvests energy from you and has sensors monitoring you and checking that you are you (a biological version of Windows XP's Product Activation). Trust and authentication sharing could use your skin's conductivity to transmit the necessary handshake when you first touch an Interface device, before handing off to Wi-Fi (this exists already; see IEEE 802.15.6). In effect, your hands become your keys, the most powerful NFC chip ever.

Sent from the iMore App