Scratch Microsoft's Surface and what do you find? More Surface. Or, more accurately, run an x-ray app on the iPhone, place said iPhone on said big a$$ table (YouTube link), and you find some pretty amazing-looking technology.

Now the iPhone uses electricity and capacitance for its multi-touch, and the Surface uses -- I believe -- infrared video cameras, so the interaction is all the more impressive. How are they exchanging data and coordinating image display, rotation, scaling, etc.? Is the edge detection and outlining being done on the Surface side and transmitted, or crunched on the iPhone side? And are there any practical uses for this, other than ZOMG! cuil vidz!? (Though that's clearly enough for us!)
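If we had to guess at the plumbing, the simplest split would be: the Surface's cameras spot the phone's outline on the table and push its position, rotation, and zoom to the phone over Wi-Fi, and the phone renders the x-ray overlay itself. Here's a minimal sketch of what the receiving end of that might look like -- purely our speculation, not anything Microsoft or the app's developers have described. The host name, port, and message fields (`TablePose`, `surface.local`, port 9000) are all made up for illustration.

```swift
import Foundation
import Network

// Hypothetical pose message the Surface might push after outlining the
// phone with its infrared cameras: where the device sits on the table,
// how it's rotated, and how far to zoom the overlay.
struct TablePose: Codable {
    let x: Double        // table-frame position
    let y: Double
    let rotation: Double // radians
    let scale: Double    // zoom factor chosen on the Surface side
}

final class SurfaceLink {
    private let connection: NWConnection
    var onPose: ((TablePose) -> Void)?

    // Host and port are invented placeholders for this sketch.
    init(host: String = "surface.local", port: UInt16 = 9000) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
    }

    func start() {
        connection.start(queue: .main)
        receiveNext()
    }

    private func receiveNext() {
        // Each UDP datagram carries one JSON-encoded pose update.
        connection.receiveMessage { [weak self] data, _, _, _ in
            if let data = data,
               let pose = try? JSONDecoder().decode(TablePose.self, from: data) {
                self?.onPose?(pose)   // hand the pose to the view layer
            }
            self?.receiveNext()       // keep listening for the next update
        }
    }
}
```

On that guess, the heavy lifting (the x-ray imagery) stays on the phone and the table just keeps telling it where it sits -- but it could just as easily run the other way, with the phone streaming frames to the Surface. That's exactly the question we're asking.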

(Thanks to Phil from WMExperts for sending this our way!)