Google has been adding new imagery to Google Earth, but has not updated historical imagery for over a month, so we are unable to map the new imagery or see any of the additions that are not in the default layer.

We had a look at different ways to use AI image recognition with Google Earth imagery. There are a lot of interesting potential applications using the satellite and aerial imagery as well as Street View imagery.

We had a go at watching sand dunes move with the aid of historical imagery. The biggest difficulty we had was finding suitable locations, as most deserts have very little historical imagery. We were, however, able to find some examples of sand dunes moving.

We had a look at the last ten years of imagery updates for the continental US and for Europe and used this to estimate how frequently imagery updates can be expected in those regions (roughly every three years for the continental US and every five to seven years for Europe).

We found some multi-coloured patches of snow in various places that have been introduced as part of Google Earth’s new global mosaic. We believe it is caused by a bug in the way the imagery is processed for the transition between the global mosaic and the higher-resolution imagery displayed when zoomed in.

With the arrival of NASA’s Juno probe at Jupiter, we discussed why a ‘Google Jupiter’ would not work the same way as Google Earth or Google Mars. Jupiter simply doesn’t have a mappable surface, so Google Jupiter would be closer to a weather map than a ground map.

We created a tool that makes use of the Google Earth API to check whether placemarks have imagery after a given date. This is useful if you have a large number of placemarks and you want to check for recent imagery in their locations.
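The first step for a tool like that is simply reading the placemark locations out of a KML file. Here is a minimal sketch in Python using only the standard library (the imagery-date check itself relies on the Google Earth API and is not shown; the sample KML is illustrative):

```python
import xml.etree.ElementTree as ET

KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

def placemark_coords(kml_text):
    """Yield (name, lon, lat) for every Placemark that has a Point geometry."""
    root = ET.fromstring(kml_text)
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name_el = pm.find("kml:name", KML_NS)
        coord_el = pm.find(".//kml:Point/kml:coordinates", KML_NS)
        if coord_el is None:
            continue  # skip placemarks with other geometry types
        # KML coordinates are "lon,lat,alt"; altitude is optional
        lon, lat = map(float, coord_el.text.strip().split(",")[:2])
        yield (name_el.text if name_el is not None else "", lon, lat)

SAMPLE_KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
  <Placemark><name>A</name>
    <Point><coordinates>-122.08,37.42,0</coordinates></Point>
  </Placemark>
</Document></kml>"""
```

Each (name, lon, lat) tuple can then be handed to whatever imagery-date lookup you have available.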

A number of readers reported that the Weather layer in Google Earth is broken. Only the ‘Conditions and Forecasts’ sub-layer is affected. It is still broken, showing exactly the same data for the places we checked when we wrote the post.

The last few years have seen major advances in computer artificial intelligence (AI). One area where AI is starting to show practical use is image recognition. Google Earth and Street View imagery combined with image recognition has a wide range of possible applications. We have in the past had a look at Terrapattern, an experimental search engine for aerial and satellite imagery. It is adding new areas over time, so be sure to keep an eye on it.

We recently came across this story about a Caltech researcher who is helping the city of Los Angeles count its trees with a combination of Google Earth imagery and Street View. In this case they are trying not only to count individual trees but also to identify the species.

The idea of using imagery for surveys of vegetation is of course far from new. Google Earth Engine, for example, is designed around such large-scale analysis. If you simply wish to determine whether there is vegetation cover, or to assess the overall health of the vegetation, a much better option than Google Earth imagery is false-colour imagery – and satellites are typically designed with this in mind.
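As a sketch of what false colour buys you: the standard Normalized Difference Vegetation Index (NDVI) combines the near-infrared and red bands, exploiting the fact that healthy vegetation reflects strongly in near-infrared while absorbing red light. A minimal NumPy version (the reflectance values below are illustrative, not from any particular satellite):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red).

    High (toward +1) over dense, healthy vegetation; near zero over
    bare ground; typically negative over water."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    out = np.zeros_like(denom)
    # Avoid division by zero where both bands are dark
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Illustrative pixels: vegetation, bare soil, water
vals = ndvi([0.50, 0.30, 0.05], [0.10, 0.30, 0.10])
```

Applied per pixel across a whole scene, this turns a pair of bands into a vegetation-cover map directly.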

There is also this project, which uses Street View to geolocate an image. You could potentially take a photo with your mobile phone camera and the system could tell you where you were with accuracy similar to GPS. At present, this sort of thing is often done by crowd-sourcing rather than an automated system. Such automated systems have both clear benefits and serious privacy concerns.

Google itself applies some image recognition to Street View. The best known is identifying licence plates and faces, which are blurred for privacy reasons. However, it also reads house numbers and various street signs, and this information is used to improve Google Maps.

If Google were to add infrared to their Street View cameras, maybe it would make it easier to distinguish between faces of people who need privacy and faces of statues who need publicity.

Yesterday we had a look at the orbits of imaging satellites from the perspective of a stationary earth. Today we are having a look at the same orbits, but showing how each orbit is actually a circle with the earth rotating inside it.

We found a model of Landsat 7 in the SketchUp 3D Warehouse and have created a tour showing what Landsat 7’s orbit looks like. The satellite is not shown to scale, but the orbit should be approximately correct. Landsat 7 completes an orbit every 98.83 minutes, crossing the equator from north to south at about 10:00 am local solar time on each pass (yes, that sounds confusing, but local time varies with longitude). Its orbit covers the entire earth every 16 days and then repeats.
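Those two figures fit together neatly: at 98.83 minutes per orbit, a 16-day cycle works out to the published Landsat figure of 233 orbits, and while each orbit completes, the earth rotates beneath the satellite, shifting successive ground tracks westward. A quick back-of-the-envelope check (using a 24-hour day for simplicity rather than the slightly shorter sidereal day):

```python
# Figures from the post: nominal orbital period and repeat cycle
PERIOD_MIN = 98.83
REPEAT_DAYS = 16

orbits_per_day = 24 * 60 / PERIOD_MIN                  # about 14.6
orbits_per_cycle = REPEAT_DAYS * 24 * 60 / PERIOD_MIN  # about 233

# Earth rotation during one orbit: how far west each
# successive ground track lands relative to the last
shift_deg = 360 * PERIOD_MIN / (24 * 60)               # about 24.7 degrees

print(round(orbits_per_cycle))  # 233
```

That roughly 24.7-degree westward shift per orbit is why the ground tracks interleave over the 16 days rather than retracing the same line.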

One problem we encountered is that animating a model across the antimeridian does not work correctly in Google Earth. We have not yet found a work-around, so you will notice the model appears to jump occasionally when crossing the antimeridian. Another bug is that the background of stars shakes around when playing the tour. The stars should be stationary relative to the view, as the satellite’s orbital plane is nearly fixed with respect to the stars, precessing approximately 1 degree per day (360 degrees per year).

Here we see Landsat 7’s orbit over the course of 24 hours:

You can view it in Google Earth with this KML file. For best results, turn on sunlight (the rising-sun icon on the toolbar). The KML includes the orbit for both 24 hours and the full 16 days.

Note that Landsat 8 shares the same orbit but with an 8 day offset.

This is what its 16 day orbit looks like relative to the earth:

We couldn’t record the full 16-day orbit as a tour because Google Earth couldn’t handle it. We believe it is possible to use the KML Track feature (gx:Track) to improve performance, but we have not yet figured out how to do that.