While I contact my lawyer about possible Google royalty checks, it is great to see this capability brought to life. I really like the blend effect presented when you click between each immersive sphere - it gives the sense of warping / traveling between positions.

What would be really fantastic would be for Google to release information on how this is done - particularly if you have an Immersive Media setup of your own - but even if you just have an IPIX system, it would be extremely helpful.

However, right now it is still more like a peapod than a hotdog - you can jump from pea to pea and turn the sphere into a Mercator-projected photo, but the true geometry collected in each sphere is not yet being projected out to the correct positions to create the solid hotdog.

How can we make that work? Is a new PhotoOverlay geometry type needed?

Sunday, January 6, 2008

On my way down to Miami to watch Virginia Tech hand Kansas University a victory in the 2008 Orange Bowl, I set my GPS logger up near my window seat and periodically snapped some shots out of the window.

I was pretty pleased with the results, particularly the shots coming in from the Atlantic Ocean, over Miami Beach and into MIA!

Hmm... sort of like Microsoft's Bird's Eye imagery...

And, it got me thinking - could you imagine if instead of 88 shots over 2 hours from one window, I had the ability to record 30 frames per second from 12 lenses 360º around the plane for the full 2 hour flight?

That would give me 2,592,000 georeferenced images worth of data.
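The back-of-the-envelope arithmetic behind that figure, using the frame rate, lens count, and flight time assumed above:

```python
# Frame count for one instrumented flight:
# 30 frames/sec per lens, 12 lenses, 2-hour flight.
frames_per_second = 30
lenses = 12
flight_seconds = 2 * 60 * 60

total_images = frames_per_second * lenses * flight_seconds
print(f"{total_images:,} georeferenced images")  # 2,592,000 georeferenced images
```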

Then I started thinking about an NPR piece that I heard back in November that mentioned the scale of the number of flights flown each day across the US.

(Watch a beautiful video below by Aaron Koblin depicting US air traffic)

[youtube=http://www.youtube.com/watch?v=dPv8psZsvIU]

So those million-plus georeferenced images would come from just one of roughly 20,000 flights flown over the US each day... imagine if each of those flights carried a similar system!

I think this would be fairly easy to accomplish if you took something like the Immersive Media Dodeca system and distributed the sensors around the airplane (perhaps as simply as splitting the camera in half along its equator and mounting one half on top of the plane and one half on the bottom, or distributing the individual lenses around the fuselage).

As far as data processing and storage are concerned, I'm not even sure that capability exists right now. But imagine all of this data available to update Virtual Earth or Google Earth with near-real-time immersive imagery from multiple angles and altitudes - all time-stamped so that you could observe changes in construction, population movement, traffic, etc. over time.
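A rough sketch of the scale of that storage problem. The per-frame size here is purely a guessed assumption (not a real Dodeca spec); the frame rate, lens count, and daily flight count come from the figures above:

```python
# Back-of-the-envelope storage estimate for the imagined system.
# ASSUMPTION: roughly 1 MB per compressed frame (a guess, not a spec).
mb_per_frame = 1
frames_per_flight = 30 * 12 * 2 * 60 * 60   # fps * lenses * seconds in 2 hours
flights_per_day = 20000                     # rough daily US flight count from above

per_flight_tb = frames_per_flight * mb_per_frame / 1e6
per_day_pb = per_flight_tb * flights_per_day / 1e3

print(f"~{per_flight_tb:.1f} TB per flight")  # ~2.6 TB per flight
print(f"~{per_day_pb:.0f} PB per day")        # ~52 PB per day
```

Even under that conservative frame-size assumption, one day of fleet-wide collection lands in petabyte territory, which is why the processing and storage question above is the real bottleneck.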

Would the data be collected when the plane lands, or transmitted in flight to a central processing station via some sort of data link?

The cost of doing something like this would at first seem incredibly high - the electronics and sensors required currently aren't cheap, but purchased at this scale they might not be too outrageous. However, the platform (commercial airliners) is already flying en masse every day - there is practically no cost associated with the actual flight of the sensors, only the initial equipment purchase and the infrastructure / storage / processing system.

Would it be worth it to the US to try to organize such an effort?

How much is currently spent on similar data collection by the USGS and other national agencies - and could this capability provide more accurate and denser data than traditional satellite-based sensor systems?