Big Apps Table: iPhone Sees What's Beneath Microsoft's Surface

Scratch Microsoft's Surface and what do you find? More Surface. Or, more accurately, run an x-ray app on an iPhone, place said iPhone on said big a$$ table (YouTube link), and you find some pretty amazing-looking technology.

Now, the iPhone uses capacitive sensing for its multi-touch, and the Surface uses -- I believe -- infrared video cameras, so the interaction is all the more impressive. How are they exchanging data and coordinating image display, rotation, scaling, etc.? Is the edge-detection and outlining being done on the Surface side and transmitted, or crunched on the iPhone side? And are there any practical uses for this, other than ZOMG! cuil vidz!? (Though that's clearly enough for us!)
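Neither side's protocol is public, so this is pure speculation, but one plausible split is that the Surface's cameras detect the phone's outline, compute its position and rotation, and stream that pose to the phone (say, over Wi-Fi) so the phone can render the matching "x-ray" view itself. A minimal, hypothetical sketch of the table side -- all names and the corner-detection input are assumptions, not anything from either device:

```python
import json
import math

def phone_pose(corners):
    """Hypothetical: estimate a phone's position and rotation on the
    table from four detected corner points (x, y), in table pixels.
    Corners are assumed to be ordered around the outline, long edge first."""
    cx = sum(x for x, _ in corners) / 4.0
    cy = sum(y for _, y in corners) / 4.0
    # Rotation: angle of the long edge between the first two corners.
    (x0, y0), (x1, y1) = corners[0], corners[1]
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
    return {"x": cx, "y": cy, "rotation": angle}

# The table would serialize this pose and send it to the phone,
# which would draw the "x-ray" slice at the matching offset and angle.
pose = phone_pose([(100, 100), (300, 100), (300, 200), (100, 200)])
message = json.dumps(pose)
```

The appeal of this division of labor is that the table only ever ships a few numbers per frame, while all the pixel-pushing happens on whichever screen is actually showing the image.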

(Thanks to Phil from WMExperts for sending this our way!)


Rene Ritchie

EiC of iMore, EP of Mobile Nations, Apple analyst, co-host of Debug, Iterate, Vector, Review, and MacBreak Weekly podcasts. Cook, grappler, photon wrangler. Follow him on Twitter and Google+.


Reader comments

Surely this is pointless. If the Surface has edge detection and a display, couldn't you just program it to recognise some transparent frame and project the "x-ray" images inside it? I get the cool factor, but come on.