Learn how to implement the first two elements of an Augmented Reality engine (the camera and the compass) on Android.

Augmented Reality (AR) seems to be on everyone's radar. For the two of you who haven't watched a YouTube video about it: Augmented Reality is the ability to overlay location data points on the live view of a mobile device's camera. In a sense, AR allows the phone to become a window into a slightly different, data-driven world.

To me, however, AR appears to be little more than a gimmick at this point. Before you fill my inbox with hate mail, let me quickly explain why:

The sensors involved (compass, accelerometer, and GPS) aren't nearly accurate enough to do the kind of real-time tracking required for a usable AR app.

No currently available data set is best presented in this medium. That is to say, any data set you can show in AR today would look better in something like a Google Maps view.

However, any good programmer knows that the best time to enter a market is before a killer application makes its debut, rather than when the market is already proven. With that in mind, DevX will publish a two-article series to set you on the path to building your own Augmented Reality engine on Android. Building an AR application requires a dash of math and four technological pieces. This first article covers the first two pieces: the camera and the compass; the next article will cover the other two: the accelerometer and GPS.
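To give a flavor of the "dash of math" involved: at its core, an AR engine compares the device's compass heading against the bearing from the user to each point of interest, and draws a point only when that angular difference fits within the camera's field of view. The sketch below (plain Java, with hypothetical class and method names, not code from this series) illustrates that angle arithmetic.

```java
public class BearingMath {
    // Normalize an angle in degrees to the range [0, 360).
    static double normalize(double degrees) {
        double d = degrees % 360.0;
        return d < 0 ? d + 360.0 : d;
    }

    // Signed smallest difference between two headings, in (-180, 180].
    // A point of interest is "on screen" when this difference is within
    // half the camera's horizontal field of view.
    static double headingDelta(double deviceAzimuth, double targetBearing) {
        double delta = normalize(targetBearing - deviceAzimuth);
        return delta > 180.0 ? delta - 360.0 : delta;
    }

    public static void main(String[] args) {
        // Device faces 350 degrees, target lies at bearing 10 degrees:
        // the two headings are only 20 degrees apart, not 340.
        System.out.println(headingDelta(350.0, 10.0)); // prints 20.0
    }
}
```

Handling the wrap-around at 0/360 degrees correctly, as `headingDelta` does, is exactly the sort of detail that trips up a naive subtraction of compass readings.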