Photos under glass and the future of design and touch interaction

Former Apple engineer Bret Victor over at Worry Dream has an interesting article up about the future of touch technology. The main focus of the article is how we interact with devices like the iPhone and iPad using our hands. His main point is that our hands are meant to interact with objects and feel things in a more tactile way than simply touching "photos under glass".

While companies like Apple have made great strides with devices like the iPhone and iPad, he believes that using our physical hands is the future. That's how we should interact. He argues that using your hands to touch and feel real objects is not the same experience as using devices such as the iPad, where everything you're "touching" is under glass. And the sooner we get away from this mindset and move forward, the better.

This technology is a long way off, but given how far Apple has already brought us, I can't help but agree. We're at the very beginning of a really exciting future.

Source: Worry Dream

Allyson Kazmucha

Senior editor for iMore. I can take apart an iPhone in less than 6 minutes. I also like coffee and Harry Potter more than anyone really should.



Reader comments

3 Comments

Good point. I watched that video a few days ago and thought it was a great glimpse of what things could be, but this article has a very interesting perspective, one I think only someone who had that type of job would really be able to see. One interesting thing I kept thinking while reading this insightful piece was how the Kinect system on the Xbox lets you do this in a way. It's interesting that Microsoft's Office and Xbox divisions haven't formed a symbiotic relationship for this exact purpose. Really, if implemented correctly, Kinect could work for every purpose a computer is used for: work, play, communication, etc.
 
I work at an architecture firm, primarily using AutoCAD, and others and I have often discussed using Minority Report-style technology to edit and view CAD drawings of buildings with one's hands. Utilizing Kinect-esque technology and implementing it the right way, this is doable!

It'll be possible in the far future, I'm sure. In fact, computer vision has made some significant breakthroughs recently due to advances in machine learning. However, there are still decades of research to be done before targetless tracking of 3D spatial position is accurate enough to simulate things like picking up objects. That doesn't even consider the fact that optics hasn't moved forward in a few decades. So overall, the feasibility of something like Minority Report happening within the next couple of decades (read: the majority of a professional life) is small. Connecting today's infrastructure and mindsets to a vision of the future in small steps is what Apple has learned (the hard way, because it failed when it tried to reach too far), and this lesson has made Steve Jobs and Apple successful.