How to See the Invisible

Everybody’s amazed by touch-screen phones. They’re so thin, so powerful, so beautiful!

But this revolution is just getting under way. Can you imagine what these phones will be like in 20 years? Today’s iPhones and Android phones will seem like the Commodore 64. “Why, when I was your age,” we’ll tell our grandchildren, “phones were a third of an inch thick!”

Then there are the apps. Right now we’re all delighted to do simple things on our phones, like watch videos and play games. But the ingredients in the modern app phone—camera, GPS, compass, accelerometer, gyroscope, Internet connection—make it the perfect device for the next wave of software. Get ready for augmented reality (AR).

That term usually refers to a live-camera view with superimposed informational graphics. The phone becomes a magic looking glass, identifying physical objects in the world around you.
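The basic trick behind these overlays can be sketched in a few lines. This is a minimal illustration, not any app's actual code; the function names, the 60-degree field of view and the 320-pixel screen width are my own assumptions. The idea: use GPS to compute the compass bearing from you to a point of interest, compare it with the heading the camera faces, and convert the difference into a horizontal screen position for the marker.

```python
import math

def bearing_to_poi(lat1, lon1, lat2, lon2):
    """Initial compass bearing, in degrees, from the phone's GPS fix
    (lat1, lon1) to a point of interest at (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def screen_x(bearing, heading, fov_deg=60, screen_w=320):
    """Horizontal pixel position for a marker, given the compass heading
    the camera faces and its horizontal field of view (both assumed
    values). Returns None when the point is outside the camera's view."""
    # Signed angular offset from the center of the view, in -180..180.
    delta = (bearing - heading + 180) % 360 - 180
    if abs(delta) > fov_deg / 2:
        return None
    return screen_w / 2 + (delta / fov_deg) * screen_w
```

A point due east of you (bearing 90) lands dead center when you face east, and disappears off-screen when you turn north; real apps add the accelerometer and gyroscope to handle tilt, but the compass-and-GPS geometry is the heart of it.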

If you’re color-blind like me, then apps like Say Color or Color ID are classic examples of what augmented reality can do. You hold up the phone to a piece of clothing or a paint swatch—and it tells you by name what color the object is, like dark green or vivid red. You’ve gone to your last party wearing mismatched clothes.

Other apps change what you see. When a reader sent me a link to a YouTube video promoting Word Lens, I wrote back, “Ha-ha, very funny.” It looked so magical, I thought it was fake.

But it’s not. You point the iPhone’s camera at a sign or headline in Spanish. The app erases the original text and replaces it with an English translation, right there in the video image, in real time, matching the original angle, color, background material and lighting. (There’s an English-to-Spanish mode, too.)

Some of the most promising AR apps are meant to help you when you’re out and about. Apps like New York Nearest Subway and Metro AR let you look down at the ground and see colorful arrows that show you which subway lines are underneath your feet. Raise the phone perpendicular to the ground, and you’ll see signs for the subway stations—how far away they are and which subway lines they serve.
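That look-down-then-raise-the-phone behavior is a natural job for the accelerometer, which always knows which way gravity points. Here is a hedged sketch of how such a mode switch might work; the 45-degree threshold and the mode names are my assumptions, not taken from either app.

```python
import math

def display_mode(ax, ay, az, threshold_deg=45):
    """Pick an overlay mode from the accelerometer's gravity vector
    (ax, ay, az, in m/s^2, with z along the screen's normal).

    Phone roughly flat (gravity mostly along z): show the top-down
    'arrows on the ground' view. Phone raised upright: switch to the
    live-camera signpost view. Threshold is an assumed value.
    """
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    tilt = math.degrees(math.acos(abs(az) / magnitude))
    return "map" if tilt < threshold_deg else "camera"
```

With the phone lying flat, gravity reads roughly (0, 0, 9.8) and the tilt is near zero, so the map view wins; held perpendicular to the ground, gravity shifts into the x-y plane and the camera view takes over.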

When you’re in a big city, apps like Layar and Wikitude let you peer through the phone at the world around you. They overlay icons for information of your choice: real estate listings, ATM locations, places with Wikipedia entries, public works of art, and so on. Layar boasts thousands of such overlays.

There are AR apps that show you where the hazards are on golf courses (Golfscape GPS Rangefinder), where you parked your car (Augmented Car Finder), who’s using Twitter in the buildings around you (Tweet360), what houses are for sale near you and for how much (ZipRealty Real Estate), how good and how expensive a restaurant is before you even go inside (Yelp), the names of the stars and constellations over your head (Star Walk, Star Chart), the names and details of the mountains in front of you (Panoramascope, Peaks), what crimes have recently been committed in the neighborhoods around you (SpotCrime), and dozens more.

Several of these apps are not, ahem, paragons of software stability. And many, like Layar, are pointless outside of big cities because there aren’t enough data points to overlay.

As much fun as they are to use, AR apps mean walking through your environment with your eyes on your phone, held at arm’s length—a posture with unfortunate implications for social interaction, serendipitous discovery and avoiding bus traffic.

Furthermore, there’s already been much bemoaning of our society’s decreasing reliance on memory; in the age of Google, nobody needs to learn the presidents, the state capitals or the periodic table. AR apps are only going to make things worse. Next thing you know, AR apps will identify our friends using facial recognition. Can’t you just see it? You’ll be at a party, and someone will come up to you and say, “Hey, how are you—” (consulting the phone) “—David?”