Have you ever thought about all the content and media out there on the Web? Information that we use daily to help make decisions, like reviews of the best restaurant for chicken parmesan or the t-shirt shop perfect for a vintage G&R t-shirt? We all know you can find nearly anything on the Web in the form of blogs, websites, user reviews, tweets, etc.

But the Web is changing. In the past we have been forced to ‘disassociate’ all this content from its physical context. Sure, you can write a review of that great B&B. But once you do, it escapes into the ether and loses its connection to the real world – the place that gave you the idea to write it in the first place. We thought there was probably a better way to reconnect all this data with its home, to provide greater context, and ultimately to help you use it to make better decisions about things you’re trying to get done in real life, not just on a search engine.

We began talking about this idea of Spatial Search back in December when we launched our new version of Bing Maps, and today at TED, Blaise Aguera y Arcas, Bing Maps architect, will be unveiling more work that demonstrates how we’re reuniting data with context. Some of what we’re showing is in the research stage, some you can use today, and some you’ll be able to use shortly. But before we geek out on the features, it might be helpful to frame it in context. So, let’s do that…

The idea behind Spatial Search came from looking at human psychology and trying to understand how we as humans make decisions. We use all our senses: sight, touch, smell, sound, and taste. Today’s blue-link model doesn’t do a good job of tapping any of those senses; instead, it makes you recreate visual models in your head to get through complex tasks. In other words, when you read a review of that French bistro, you have to manufacture everything: the street location, the ambience, where exactly in the city it is, and more. While we’re not working on smell-o-search (yet!), we do think we can do a better job with the ‘sight’ and ‘touch’ senses you rely on every day.

So when we think about Spatial Search, we think about the modes you all go through when you’re out and about, interacting with people and places rather than machines. First, you Explore: you orient yourself. You get a feel for what’s around and figure out your environment. Next, you Discover: using your senses and visual cues, you try to make sense of your surroundings. You look at storefront signage, street signs, and other clues that tell you where you might want to go to get your task done. Finally, you Decide: you take in and process all of the input around you to make the best decision for the task at hand. Maybe it’s the opening hours of the dry cleaner (oops – it’s closed) that send you to the bakery you can smell around the corner, where you hop on the wi-fi network and get some work done until the dry cleaner opens. The challenge for Spatial Search is to use technology to augment all those tasks and bring you context, so that information transforms into knowledge – and knowledge leads to actions that make your life easier and more informed.

For Exploring, we’ve made a number of updates to the Bing Maps platform to bring this idea to life through high-resolution imagery, from outer space all the way down to the front door of the bakery. Today at TED, we’re announcing the next step in making this imagery more useful and interactive with the release of a technology preview of the Streetside Photos application. This tech preview mines geo-tagged photos from Flickr and relates them to our Streetside imagery, showing each image matched to its original spatial context. Why is this cool? You’re now able to see what that club looks like at night (is it really THAT scary?), see whether you’re really going to get a good sunset at that B&B you’re looking to book, check out the crowds on a Saturday morning at Pike Place Market in Seattle, or get a view of the same market from decades prior. As more people share imagery, our challenge is to reunite those photos with where they were taken – again, providing context to the data in the ether. Watch Blaise’s demo to see Streetside Photos in action.


But we’re not just stopping at the street. Today, we’re also excited to demonstrate integration with the WorldWide Telescope, a project out of Microsoft Research. Once it launches, you will be able to walk outside in Streetside mode, look up, and see what’s above – way above – right now, where you’re standing. Constellations come to life as you pan, and you can even set the time of day to preview the sky at 9pm – great for exploring with your daughter and getting her ready for what she’ll see when the sun goes down.

At the same time that we’re getting more “universal” with the WorldWide Telescope, we’re also getting more intimate. At TED, Blaise showed the first results of our indoor panoramas work. This will provide an experience identical to Streetside, but won’t be limited to places you can take a vehicle. Whether you’re exploring Seattle’s Pike Place Market or your favorite theme park, Bing Maps will give you the most immersive experience of the place. We’ve already given you a taste of this with the integration of Photosynths into Bing Maps (18,000 of them and counting), and you can expect Photosynth and Streetside to converge in a way that allows all of us to document the important places in the world – indoor or outdoor – and explore them in a completely natural way.

When it comes to Discovering, we launched our “Map Apps” gallery to bring that disconnected data home. From our Local Lens to Twitter, we’re bringing data back to where it can help you discover what’s in a physical area. And sometimes, there’s no substitute for absolute real-time. At TED we also demonstrated live webcam feeds, which enable real-time video to be overlaid seamlessly on street-level imagery, adding another dimension to the mapping experience. Imagine – you can see how long the line is at Five Guys before you head over for a burger. In the coming year, we think you’ll be pleasantly surprised by how far Bing takes this new technology. Stay tuned.

Finally, when it comes to Deciding, we’ve just scratched the surface with Bing Maps. We introduced innovations around our Opinion Index, which lets you see just how good that Vietnamese pho restaurant is based on what the community is saying. Our next step is to continue to augment the Spatial Search experience with these types of data. The potential for ‘augmenting’ your physical world with data pulled from everywhere, in real time and in context, is exciting. The chance of getting lost diminishes greatly. You may never again enter a fabric store only to realize you should have gone somewhere else for your remnant. The ability to re-route before hitting a road closure due to construction will be right at your fingertips. Building the technology to enable the Deciding pillar of our Spatial Search strategy is daunting – but it’s a key element in our quest to better understand how we naturally process information to make decisions. So that’s where we’re headed.

It’s going to take a little while to get there. But we believe our focus on your needs as searchers and more importantly as people will ultimately help us build technology to reconnect the wandering data to where it wants to be – the real world.