Apple has acquired PrimeSense, a leading developer of 3D sensors best known for creating the technology that powers the Kinect interface for Microsoft's Xbox 360 video game console. Does this mean Macs or iPhones will be getting their own Kinects? Unlikely. But the PrimeSense acquisition may give us some insight into the future of gestures on the Mac and iOS platforms.
First, let's understand what Apple just bought for a price speculated to be around $360 million.
PrimeSense is a semiconductor company. They've developed 3D sensor technology and put it on silicon so it can be incorporated into other products. PrimeSense is, specifically, a fabless semiconductor maker - that is, they outsource the actual manufacturing of their chip designs to semiconductor foundries, much like Apple does with the A7 processor used in the new iPads and the iPhone 5S.
PrimeSense's technology enables computers to look at a scene in three dimensions. What makes it unique is PrimeSense's patented "Light Coding" technology, which uses a combination of near-infrared sensing and off-the-shelf image sensors like what you'd find in a digital camera. PrimeSense's technology can distinguish the dimensions of a room, the location of a desk within it, and the movements of people through the environment.
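PrimeSense's patented approach is more sophisticated than this, but depth cameras of this general kind ultimately rely on triangulation: project an infrared pattern, observe how far each projected dot shifts on the image sensor, and recover depth from similar triangles. A minimal sketch of that last step (illustrative only - the focal length, baseline, and disparity figures below are made up, not PrimeSense's specs):

```python
def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Depth in meters of a point via triangulation: Z = f * b / d.

    focal_length_px: sensor focal length, in pixels
    baseline_m:      distance between IR projector and sensor, in meters
    disparity_px:    observed shift of the projected dot, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical numbers: a 580 px focal length, 7.5 cm baseline, and a
# 20 px dot shift put the surface a bit over two meters away.
z = depth_from_disparity(580.0, 0.075, 20.0)
print(round(z, 3))  # 2.175
```

Repeat that calculation for thousands of dots per frame and you get the depth map that lets the sensor tell a person apart from the furniture behind them.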
You can see why this is perfect fodder for a video game system - if you've never used Kinect, the games made to work with it treat your body like a giant controller. Dance games track your movements to make sure you're in time with the song and matching the steps, for example.
While video games are the largest market for PrimeSense's products, they're not the only one. PrimeSense's products also have applications in healthcare: doctors and nurses can use them for patient monitoring, making sure a patient isn't in distress. Touchless displays allow for data input without ever touching a physical surface like a keyboard or glass screen - more hygienic. The same qualities make PrimeSense a great option for the classroom and boardroom, too, where teachers or managers might be using whiteboards.
PrimeSense has already sold millions of these systems to Microsoft, but their next-generation technology points in a promising new direction, including an embedded reference design created for use in tablets, TVs and PCs. They've already shown it working in an Android tablet. It's this new technology, and whatever else PrimeSense has cooking, that has me most interested in what Apple might do.
Touch-driven interfaces like iOS, Android and Windows Phone 8 have practical limits to their usability. Hybrid PC/laptops haven't exactly set the world on fire - why dirty up your laptop screen with finger smudges?
Imagine, then, Mac laptops that let you gesture in the space above the keyboard instead of using the trackpad. Data visualization, 3D modeling, and other tasks we use our computers for could be very different. So could the essential desktop user interface metaphor, if the desktop itself becomes virtual.
If this all sounds too "Minority Report" to be real life, you're wrong. You can already do this on a Mac or PC using the $79.99 Leap Motion Controller, and more than 100 apps have already been adapted or written to work with it.
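Under the hood, apps like these turn a stream of 3D hand positions into discrete gestures. Here's a toy sketch of that idea (entirely hypothetical - this is not the actual Leap Motion or PrimeSense API, just one way a swipe detector might look):

```python
from typing import List, Optional, Tuple

# A tracked hand position: (x, y, z) in meters, sampled over time.
Point3D = Tuple[float, float, float]

def detect_swipe(path: List[Point3D],
                 min_travel_m: float = 0.15) -> Optional[str]:
    """Classify a hand's path as a horizontal swipe.

    Returns "left" or "right" if the hand's net horizontal travel
    exceeds min_travel_m, or None for anything else.
    """
    if len(path) < 2:
        return None
    dx = path[-1][0] - path[0][0]  # net displacement along the x axis
    if dx >= min_travel_m:
        return "right"
    if dx <= -min_travel_m:
        return "left"
    return None

# A hand sweeping 20 cm to the right above the keyboard:
samples = [(0.00, 0.12, 0.05), (0.07, 0.12, 0.05), (0.20, 0.13, 0.06)]
print(detect_swipe(samples))  # right
```

Real gesture recognizers are far pickier - they also check velocity, duration, and which fingers are extended - but the basic shape of the problem is the same.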
The Leap Motion Controller is a three-inch-long box that sits in front of your keyboard. It'd be even better if it were built into the bezel of your MacBook's screen, or into the keyboard of the iMac or Mac mini you're using.
Not to mention the inevitable flood of apps that would follow, through the respective platforms' App Stores.
But the killer app, as far as I'm concerned, is the application for this technology in the Apple TV. Again, going back to how the Kinect works with Microsoft game consoles, navigating the menus of your home entertainment system using touchless technology is certainly a lot better than fumbling for the tiny remote control that ships with the thing.
Gesture and movement-based games on the Mac aren't a new thing. iSight cameras have been available for the Mac for a decade now. Shortly after they were introduced, a company called Strange Flavour published a game called ToySight that presaged what Xbox 360 owners would be doing with their Kinect years later - you could play games, interacting with objects on your Mac's screen using a mirror image of yourself.
Those first iSight cameras were bulky, barrel-shaped devices that you'd affix to the top of your Mac's screen using sticky tape or clamps. Now iSight cameras are standard issue on most Apple devices (the Mac mini and the forthcoming Mac Pro are the two lone holdouts, since they don't have built-in screens), and they're so tiny and integrated that they're easy to miss unless you know what to look for. PrimeSense's next generation technology could also be integrated unobtrusively.
iSight cameras could become that much smarter through the integration of PrimeSense technology - the Mac or iOS device would have a better understanding of what you look like and how you're moving in your environment. So there are practical applications for this that have nothing to do with games or modeling software - PrimeSense technology could simply make Macs and iOS devices better to use, too.
If Apple's past acquisitions are any indication, it'll be some time before we see any practical applications of PrimeSense's technology hit the Mac or iOS platforms. In the interim, we can fantasize and speculate a bit about what our favorite applications might be.
So fire away in the comments - I'd like to hear how you imagine PrimeSense tech might work on Apple products in the future.