What is Augmented Reality on mobile devices?

I wrote this as a companion piece for an article that may show up in the next issue of iPhone Life, but I thought I would include it here and now since AR is getting more relevant.  Also, one of the things I said NEEDS to be part of the AR toolkit for the iPhone (mobile) got a write-up today in RRW.

This is still in draft form, so I’ll edit it again at some point.

Augmented reality for the iPhone

A description/definition of Augmented Reality

According to Wikipedia, augmented reality (AR) is a term for a live
direct or indirect view of a physical real-world environment whose
elements are merged with, or augmented by, virtual computer-generated
imagery, creating a mixed reality.  The funny thing is, we have
already had a simple example of AR on the iPhone. Consider the
application iHUD.  In a way, this could be considered an AR
application: we have virtual information overlaid on top of reality,
and as our position changes in the real world, it is reflected in the
overlaid virtual information. However, for this article, let’s refine
the definition of AR just a bit more to require that the virtual
information contain a spatial component dependent upon the view in
question.  In other words, as the reality shown on the iPhone screen
changes, the information should change accordingly. This makes AR
a much harder challenge, but one that opens up even more interesting
opportunities.
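To make that spatial requirement concrete, here is a minimal sketch (plain Python, not Apple’s SDK) of the core computation this kind of AR app has to do: given the user’s GPS position, compass heading, and the camera’s field of view, figure out where on the screen a point of interest should be drawn. The function names and the field-of-view and screen-width values are my own illustrative choices.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user (point 1) to a POI (point 2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(poi_bearing, heading, fov_deg, screen_width):
    """Horizontal pixel position for a POI, or None if it is outside the camera's view."""
    # Signed angle between the POI and the center of the view, normalized to [-180, 180)
    offset = (poi_bearing - heading + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None  # POI is off-screen; don't draw its label
    return (offset / fov_deg + 0.5) * screen_width
```

A POI dead ahead lands in the middle of the screen; as the user pans the phone, the label slides across and eventually drops off the edge, which is exactly the view-dependence that separates this from an iHUD-style static overlay.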

A discussion of the enhancements to OS 3.0 and the 3GS that make it possible.

In order to employ AR, a mobile device must support four key functions:
1) A video camera of high enough quality to detect the necessary
details in the scene and maintain a sufficient refresh rate to track
and measure movement in the scene.
2) A compass sensor with 1 degree of resolution accuracy or better.
3) Sufficiently accurate GPS.
4) Access to geo-resolved data accurate enough to align with what the
user sees on the display.
The iPhone is not the first commercial mobile device to display AR
capabilities; that honor goes to the Android phone and the Nokia N97.
However, both lack the iPhone 3GS’s video capabilities, so even though
we have seen some great apps on those platforms, I’m hopeful for even
more ambitious projects for the iPhone. The iPhone 3G is not as well
suited to AR tasks because it lacks a compass sensor and a
high-enough-resolution camera.
It should be noted that the current crop of AR apps for the iPhone are
not fully legal.  The SDK does NOT include the ability to access the
video feed, so the few AR apps that have snuck through Apple’s review
process – I’m looking at you, Yelp – have jumped the gun a bit.  But
who can blame them? It’s simply too tempting.  However, by the time you
read this article, iPhone OS 3.1 should be out, which will provide the
proper access via the SDK.
Those who have started to develop products, and those of us who have
tested them, have been hit square in the jaw with an awful truth: if
any of the four elements I listed above degrades, so does the
experience.  If the alignment of the virtual content to reality is off
by even a small degree, the AR experience can be worthless or, worse,
provide false information.  This is a concern, since I have
experienced my fair share of connection problems and GPS errors as I
travel around San Francisco. As for Yelp, the latitude/longitude
database could use some work.
There is a solution, however, and it’s not an easy one: analyze the
video that you see through the iPhone’s camera for clues as to where
you are. This may seem like an impossible task, but perhaps it is not
as hard as one might think. For example, if we know a Starbucks is at
a certain street corner (almost every street corner in some cities),
we can verify our position when the user puts the Starbucks store sign
in the video camera’s view. OCR (Optical Character Recognition)
software would ‘read’ the sign and confirm or correct the known
position.  Using video recognition as a geospatial input could be the
5th element we need to have robust AR applications on mobile devices
with less than perfect GPS tracking.

A concluding discussion of possible future uses of augmented reality apps.

If we dismiss the limitations of the iPhone in certain locations, and
also assume that some smart programmers begin to employ some forms of
video geo spatial recognition, we could see a bevy of iphone
application beyond anything we has seen so far.
Sure, we will have a plethora of direction applications, subway routes
(as we have seen), landmark identification, and geo caching will be taken to
the next level.  But lets imagine a bit further.
Perhaps you can play cards with your friend on a table without any
cards. Prices and additional information for goods in the supermarket.
Not only can we employ a iHUD like trick to get an overlay on the
windshield, but we could calibrate the display to coordinate to the
road such that we would have directional arrows appearing on the road.
Perhaps the next time you and your friend are separated in a crowd,
you get a real life position tracker to take  you right to them!  Not
sure who is who in a room?  With facial recognition or badge ID
patters, you will get a personal ID popup. Seen the latest trick where
you put a pattern card down and then a 3D image pops up?
(http://www.hitl.washington.edu/artoolkit/) There’s already an app for
that. Walk by a building site and see the proposed new structure as if
it was already built. How about a virtual dog that can detect the
ground and objects, and walks along with you as you walk? As you can
see, only our imaginations will be the limit. And I know that before
the year is over, there will be a number of iPhone apps we just could
not have imagined.

About perivision

All you need to know you can get here or google. http://www.perivision.net