Tonchidot: Visual Tagging for the iPhone

Real working app? Proof of concept? Science fiction? All of the above? Don't care. Want it now. This amazing demo shows an app that uses the iPhone's camera to record and display visual tagging information in real time, right in front of you. Drool.

Want to see more? Check out their long (18-minute) presentation from TechCrunch50. Amazing stuff.

(Via TWiT)


Rene Ritchie

EiC of iMore, EP of Mobile Nations, Apple analyst, co-host of Debug, Iterate, Vector, Review, and MacBreak Weekly podcasts. Cook, grappler, photon wrangler. Follow him on Twitter and Google+.





Reader comments



this can't be real. i mean, i really want it to be, but it just can't be real. and even if it is, i think it won't come out for like 10 years lol. someone start a blog. lets do this shit.

Are you sure Apple will allow it??? Doesn't it duplicate the super duper built-in iPhone camera's functionality???? Oh wait a minute, it totally blows away what the iPhone's camera can do! Still doesn't mean Apple will allow it, haha. It might confuse their users. Sorry, this rant is due to Jobsy dissing developers as of late.
On the serious side, an app like this would ROCK!!!

this is ridiculous. who would be providing all the data necessary for this visual tagging, much less the processor speed necessary?

Cool in theory, but I imagine 75% of the tags would be advertisements, 20% crap tags from idiots, and 5% useful tags.
Google will probably acquire this company and any IP it may have within a few months, then release it for WinMo, iPhone, Android, and Symbian.
After release, advertising companies will make a land grab and start sending crews out to virtually tag the city for ad purchasers.
BTW: Can you imagine all the people bumping into each other, cars, and lamp posts because they were staring into their phones?

Or it could be another viral marketing campaign for the next J.J. Abrams project.
Watch out for Tonchidot on Fringe or Lost...

Ok, so here's the deal. I watched the 18-minute version, and the concept seems very cool, but not very practical.
The developers could not answer one practical question about the camera, and somehow, because they couldn't speak English very well, the audience treated it as funny and not that big of a deal. Well, it kind of is a big deal. One question brought up was: if a phone store moves the phones to a different part of the store, what happens to the tags? The developers said that the tags were location-based, not done through the camera. They couldn't answer that question at all. They obviously haven't thought through a lot of problems.
Secondly, we all know the GPS in the iPhone isn't that great. A lot of times, I'll have the actual GPS running, and I'll be on the wrong street, or across the highway. If this whole concept is based on location, how can you possibly distinguish tagging one item in a store from another? I don't know... cool concept... hope it works... but it seems like they've got a lot of work ahead.