CES 2013: The Dream of Indoor Navigation is Real

An early demonstration of a technology that could change the way we move through indoor environments.

Certain technologies become so integrated into our lives that we become dependent upon them almost to a fault. Cellphones and GPS have emboldened people to the point where someone can simply walk out the door to meet a friend with nothing more than a rough idea of where they're going. Communication and navigation technology will figure it out for them en route.

We've become so used to GPS navigation that we become frustrated at the point where it abandons us—the moment we walk indoors. The lack of technological options for indoor navigation is particularly apparent at events such as CES, a colossal convention teeming with people and filled with thousands of difficult-to-find booth locations. So CES felt like an entirely appropriate location to test emerging "indoor navigation" technology.


Indoor navigation—or, at least, reliable indoor navigation—is a tough technological nut to crack. Because GPS signals don't penetrate deeply into buildings, it is largely useless indoors. But there are other options. Sensor networks embedded in buildings can communicate with a smartphone to give pedestrian directions inside. In a similar vein, phones can do rough navigation by tracking known Wi-Fi hotspots. But embedding sensors in a building is labor-intensive and expensive, and with the proliferation of mobile hotspots, many indoor spaces are so crowded with Wi-Fi networks that interference is inevitable.


So the most promising technology available is inertial navigation, the same process used for spaceflight guidance. Inertial navigation systems use MEMS (microelectromechanical systems) sensors such as accelerometers, magnetometers, and gyroscopes, now commonly embedded in smartphones, for dead reckoning with a healthy dose of error correction. Inertial navigation is much harder with stumbling, teetering humans than with flying rockets, which tend to travel in a more or less predictable trajectory, but it could be our best hope.
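At its core, pedestrian dead reckoning is simple arithmetic: count steps, assume each step covers a fixed stride length, and add up the direction of each one. The sketch below is a bare-bones illustration of that idea, not Movea's actual algorithm; the fixed `stride_m` assumption is exactly what real systems spend their effort correcting.

```python
import math

def dead_reckon(steps, stride_m=0.7, start=(0.0, 0.0)):
    """Accumulate a position from per-step compass headings (radians).

    steps: iterable of headings, one per detected step.
    Every step is assumed to cover exactly stride_m meters --
    the simplification that makes drift inevitable.
    """
    x, y = start
    path = [(x, y)]
    for heading in steps:
        x += stride_m * math.sin(heading)  # east component
        y += stride_m * math.cos(heading)  # north component
        path.append((x, y))
    return path

# Ten steps due north, then ten due east:
path = dead_reckon([0.0] * 10 + [math.pi / 2] * 10)
```

In a real phone, the headings would come from fusing the gyroscope and magnetometer, and the step events from peaks in the accelerometer signal.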

In the Las Vegas Hotel (formerly the Las Vegas Hilton), I got a demonstration of how this could work from Dave Rothenberg of Movea. Movea is what's known in the technology business as a "technology solutions provider": they don't make the underlying MEMS hardware, nor do they plan to create the finished consumer product, but rather create the software and do the integration work that makes future products possible. Companies such as Movea are a good place to see future technologies in a nascent stage.

Rothenberg's demo was simple in premise: He was going to use an ordinary Samsung Galaxy S III smartphone to guide us from the lobby of the hotel to the sixth floor, with all the maps, arrows, and turn-by-turn directions you'd expect from a modern navigation system. Sounds easy, but the phone would be doing the guidance with no GPS and very little in the way of an exterior frame of reference. It's the technological equivalent of trying to walk down a hallway after closing your eyes—your first step is confident, your second step less so, your third step even less so.

Rothenberg spent plenty of time before our walk explaining everything that could go wrong, and indeed, some initial glitches forced us to restart the demo once. The biggest problem is drift. You can pre-program the guidance application with information about your stride length—an estimate it makes based on your height, weight, and age—but because the system is tracking your location based on assumptions about your average movements, each step makes it a little more likely to get you off track (nobody takes a perfectly average step every time).
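The trouble with that stride-length estimate is that even a tiny error compounds on every single step. A toy calculation (my own, with made-up numbers) shows how quickly a two-centimeter misjudgment adds up:

```python
def position_error(n_steps, assumed_stride=0.70, true_stride=0.68):
    """Error (meters) after walking in a straight line for n_steps,
    if the app's assumed stride length is slightly off.
    The error grows linearly with every step taken."""
    return abs(assumed_stride - true_stride) * n_steps

# After 500 steps (about 340 meters of walking), the estimate is
# already roughly 10 meters off -- several hotel rooms away.
position_error(500)
```

This is why inertial systems lean so heavily on independent error correction: without an occasional outside fix, the drift never stops growing.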

Still, in this highly controlled demo, the system was surprisingly accurate and useful. Rothenberg tells me that Movea's guidance is helped tremendously by a technique known as map-matching. They integrated a floor plan of the hotel into the software, and the application is programmed to assume we won't walk into walls or other barriers (presumptuous, maybe, in my case). If it appears we're about to do that, it adjusts our estimated position back to a more logical track.
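A crude way to picture map-matching, assuming nothing about Movea's implementation: model the walkable areas as simple rectangles and, whenever the dead-reckoned estimate lands outside all of them, snap it to the nearest walkable point. Real systems match against a full floor-plan graph, but the principle is the same.

```python
def map_match(estimate, corridors):
    """Snap a dead-reckoned (x, y) estimate to the nearest point inside
    a known walkable corridor, modeled as axis-aligned rectangles
    given as (xmin, ymin, xmax, ymax) tuples."""
    def clamp(v, lo, hi):
        return max(lo, min(v, hi))

    x, y = estimate
    best, best_d2 = None, float("inf")
    for xmin, ymin, xmax, ymax in corridors:
        # Closest point inside this rectangle to the estimate.
        cx, cy = clamp(x, xmin, xmax), clamp(y, ymin, ymax)
        d2 = (cx - x) ** 2 + (cy - y) ** 2
        if d2 < best_d2:
            best, best_d2 = (cx, cy), d2
    return best

# An estimate that has "walked through a wall" gets pulled back:
corridor = [(0.0, 0.0, 2.0, 50.0)]   # a 2 m wide, 50 m long hallway
map_match((3.1, 10.0), corridor)     # -> (2.0, 10.0)
```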

The app even knew when we were in an elevator; the pressure sensor in the Galaxy S III can detect the slight changes in atmospheric pressure that come with an increase in altitude. It tracked our progress from the first floor to the sixth. Then Rothenberg used a simple virtual reality trick to recalibrate its position once we exited the elevator (he used the phone's camera to take a picture of a photo tag linked to a known location—taking advantage of another common smartphone technology for error correction). Then he guided us to a room down the hall.
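The floor-counting trick relies on a well-known physical fact: near sea level, atmospheric pressure falls by roughly 0.12 hPa for every meter of altitude gained. A back-of-the-envelope sketch (my own numbers, not the demo app's logic) of how a barometer reading could be turned into a floor count:

```python
def floors_traveled(p_start_hpa, p_end_hpa, floor_height_m=3.5):
    """Estimate floors climbed from two barometer readings (hPa).
    Near sea level, pressure drops about 0.12 hPa per meter of
    altitude; floor_height_m is an assumed story height."""
    HPA_PER_METER = 0.12
    altitude_change_m = (p_start_hpa - p_end_hpa) / HPA_PER_METER
    return round(altitude_change_m / floor_height_m)

# Lobby to the sixth floor is five stories up (~17.5 m, so the
# barometer should read about 2.1 hPa lower at the top):
floors_traveled(1013.25, 1011.15)
```

In practice the raw signal is noisy (opening a door or the building's HVAC can shift the reading), so a real app would smooth the readings and look for sustained changes rather than trusting a single sample.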

It was impressive, but that's not to say there aren't a boatload of caveats. We were working within a very controlled environment—I never got to see what happens if you simply step off the prescribed route and start walking off in the wrong direction. Also, Rothenberg had to adjust his own stride at least once to catch up to where the app thought he was, which would be an absolute deal-killer in a consumer product. Still, the technology is at the point where I can definitely see the potential. From what I learned at CES this year, the MEMS sensors are reaching a turning point in accuracy and precision. Rothenberg estimates we should see large advances reach consumer products in just three years.

The impact could be huge—on par with that of GPS itself. Just think how valuable it would be to a store to be able to guide customers not just to the door but to an individual product. Imagine your shopping list turning into turn-by-turn guidance through the grocery store. As for my holy grail—easy navigation through conventions such as CES—I'd be willing to wait three years for that.