While preparing for an upcoming presentation at the annual meeting of the American Geophysical Union (AGU) I came across a topic that I thought might make an interesting blog post. The presentation is about using data from the Sentinel-1 mission for Earth Science applications. The Sentinel-1 spacecraft are C-band SAR systems launched and operated by the European Space Agency (ESA). An innovative aspect of this mission is that the collection scenario devised by ESA is systematic and very broad in coverage. With Sentinel-1 we can monitor Earth with SAR data like never before.

The cloud has made it easier to process large amounts of data, and satellite imagery processing benefits from this too. One cloud service that offers access to satellite images, along with the ability to process them in the cloud – no more need to download them to your computer and process them there – is Amazon Web Services (AWS). If you’ve never worked with cloud processing, getting started with AWS can be a bit daunting. This tutorial gives beginners an introduction to accessing satellite images – Landsat and Sentinel-2 – on AWS.
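As a small taste of what such access looks like, the sketch below builds the S3 key prefix under which a Landsat 8 scene's band files live. It assumes the historical layout of the public `landsat-pds` bucket (`c1/L8/{path}/{row}/{scene_id}/`); bucket names and layouts change over time, so check the current AWS Open Data registry before relying on this.

```python
# Sketch: construct the S3 key prefix for a Landsat 8 scene on AWS.
# Assumption: the historical "landsat-pds" public-bucket layout,
# c1/L8/{path}/{row}/{scene_id}/ - verify against current documentation.

def landsat_scene_prefix(path: int, row: int, scene_id: str) -> str:
    """Return the key prefix under which a scene's band files are stored."""
    return f"c1/L8/{path:03d}/{row:03d}/{scene_id}/"

scene = "LC08_L1TP_139045_20170304_20170316_01_T1"
prefix = landsat_scene_prefix(139, 45, scene)
band4_key = prefix + scene + "_B4.TIF"  # red band GeoTIFF for this scene
print(band4_key)
```

With a prefix like this, any S3 client (e.g. `boto3`) can list or fetch the individual band files without downloading the whole archive.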

Sentinel-2 is the optical satellite mission of the Copernicus programme. It can be compared to Landsat, although it has a better resolution of 10 to 20 meters. We’ll be using it for crop monitoring with simple vegetation indices.
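The simplest of those vegetation indices is the NDVI, computed from Sentinel-2's red (band 4) and near-infrared (band 8) channels, both at 10 m resolution. A minimal sketch, using toy reflectance arrays in place of real band rasters (in practice you would read the JP2 band files with a library such as rasterio):

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids division by zero

# Toy 2x2 reflectance values standing in for Sentinel-2 bands 4 and 8.
red = np.array([[0.10, 0.08], [0.12, 0.30]])
nir = np.array([[0.40, 0.35], [0.45, 0.32]])
print(ndvi(red, nir))  # healthy vegetation approaches 1, bare soil stays near 0
```

Tracking how this per-pixel value evolves over a growing season is the core of simple crop monitoring.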

SAR sensors can see through clouds and in darkness, and their images are therefore very useful for operational monitoring of our seas. Detecting ships, icebergs, wind patterns, and oil spills is daily business in Europe with the Sentinel-1 satellite. Want to see for yourself how to extract information from a SAR image? In this tutorial, we’ll use the SNAP toolbox for Sentinel-1 to extract information on the number of ships at sea.
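The underlying idea is simple: ships appear as bright point targets against the darker sea surface, so thresholding the amplitude image and counting the connected bright blobs gives a ship count. A toy sketch of that idea (SNAP's actual Ocean Object Detection uses an adaptive CFAR detector; the fixed global threshold here is a simplification for illustration):

```python
import numpy as np
from scipy import ndimage

def count_ships(amplitude: np.ndarray, threshold: float) -> int:
    """Count connected groups of pixels brighter than the threshold."""
    mask = amplitude > threshold          # bright pixels = ship candidates
    _, num_objects = ndimage.label(mask)  # connected-component labelling
    return num_objects

# Synthetic "SAR scene": calm sea backscatter with two bright targets.
sea = np.full((50, 50), 10.0)
sea[10, 10] = 200.0
sea[30, 40] = 200.0
print(count_ships(sea, threshold=100.0))  # → 2
```

A real detector must also set the threshold adaptively from the local sea clutter, which is exactly what the CFAR step in SNAP does.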

The Earth observation mission Sentinel-2, consisting of the twin satellites Sentinel-2A and Sentinel-2B, began with the launch of Sentinel-2A on June 23rd 2015 from the space centre in Kourou, French Guiana. The mission is part of ESA's Copernicus programme. The liftoff was recorded by ESA and can be watched here. Sentinel-2 will complement the multispectral satellites of the Landsat mission and SPOT. In combination, all these observation systems achieve a higher temporal resolution through shifted orbits and large swath widths (Sentinel-2: 290 km). Sentinel-2 has a revisit time of 10 days, Landsat 8 of 16 days, but in combination with Landsat 7 an 8-day…

Since we are still in the International Year of Soils, another Digital Soil Science post is ready. Some of you might already know that I’m addicted to soil. As a guy who works in the field of soil erosion modelling, any kind of soil data is interesting to me, and some of it is quite relevant: soil texture and soil moisture in particular are crucial for predicting soil stabilization. I showed a global soil database in a previous post, which contains a lot of high-resolution soil data. But guess what: more soil data is coming. This time the data source is…

The White House announced on September 23rd, during the United Nations Heads of State Climate Summit in New York, that the high-resolution SRTM data will be released globally. Until now, the high-resolution data was only available for US territory; all other countries had lower-resolution data. With this release, the 30 m x 30 m data will become available globally and replace the 90 m x 90 m data. After its release the data will be accessible on EarthExplorer, run by the US Geological Survey (USGS). Unfortunately they didn’t announce an exact release date (only “will be released globally over the next year”). That…

The Soyuz TMA-13M launched on May 28th 2014 from Baikonur to bring the German Alex Gerst, the Russian Maxim Suraev and the American Reid Wiseman to the ISS (International Space Station). Besides conducting experiments on the ISS, the crew shows the public how vulnerable and beautiful our blue planet is. They do this in different ways: regular public calls and interviews with the crew members bring the Earth's population into contact with them. If you’re following one of those members on Twitter or Facebook, I’m sure you’re impressed by the pictures…

When it comes to data, we are almost drowning in it nowadays: we can acquire more and more data for an area of interest and find answers to our questions. Principal Component Analysis (PCA) can help you enhance your understanding of your data and reveal underlying factors that influence it fundamentally. A dedicated plugin for QGIS has recently become available that lets you compute principal components from your data.
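To make the idea concrete, here is a hedged sketch of what a raster PCA does internally (the QGIS plugin's own implementation may differ): flatten the bands into a (pixels x bands) matrix, centre it, and project onto the eigenvectors of the covariance matrix, largest variance first.

```python
import numpy as np

def raster_pca(bands: np.ndarray, n_components: int) -> np.ndarray:
    """PCA on a multiband raster of shape (n_bands, rows, cols)."""
    n_bands, rows, cols = bands.shape
    X = bands.reshape(n_bands, -1).T          # (pixels, bands)
    X = X - X.mean(axis=0)                    # centre each band
    cov = np.cov(X, rowvar=False)             # band-to-band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigh: cov is symmetric
    order = np.argsort(eigvals)[::-1]         # sort by descending variance
    pcs = X @ eigvecs[:, order[:n_components]]
    return pcs.T.reshape(n_components, rows, cols)

# Toy 4-band raster; in practice the bands would come from a GDAL/QGIS layer.
rng = np.random.default_rng(0)
bands = rng.random((4, 20, 20))
pcs = raster_pca(bands, n_components=2)
print(pcs.shape)  # (2, 20, 20)
```

The first component captures the largest share of the variance across bands, which is why PCA is popular for compressing correlated multispectral imagery.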

The sub-title to this post could easily be “reasons to love DLR #123 (transparency)”. You’ll see why in a moment. In this post I am going to give a very brief review of the civilian hyperspectral (or imaging spectroscopy) missions, operating or planned for the near future, that will provide Earth Observation (EO) data. Let’s start in the past. Hyperspectral imagers (HSI) were first launched into space at the start of the millennium: first NASA launched Hyperion on the EO-1 platform in 2000, then ESA launched CHRIS on Proba in 2001, kicking off an exciting new millennium of advanced EO platforms.…

Following sequestration and bitter budget wars in Washington, NASA finds itself in a difficult position. NASA has proposed a budget of $17.715 billion, down from $17.893B last year. The House passed a bipartisan bill this last week: slight increases in Earth Science are proposed, as are cuts in Planetary Science. Exploration receives more than $300 million more in FY2014 than in FY2012; Space Operations slightly less in FY2014 than in FY2012. The International Space Station continues to consume more than $3B a year (rising to $3.5B by 2018). Planetary scientists are not happy with the budget. Really not happy (though it should be noted that…

Since the start of Google Earth in 2001 (it was re-released in 2005 after Google acquired it from Keyhole, Inc.), remote sensing images have made their way into everyday life. But look closely at the images shown on Google Earth or Google Maps: you will see historic data, and it strongly depends on companies like DigitalGlobe, BlackBridge (a.k.a. RapidEye) or Astrium and their contracts with Google whether you will see new or old data.

In a previous post I talked about some of the remote sensing themes during the first half of the American Geophysical Union Fall Meeting. Here I want to talk about the latter part of the week at AGU and the headline remote sensing topic: vegetation time series. In the early 1970s Compton J. Tucker investigated the spectral reflectance of prairie grass, leading to a host of publications in the mid to late 1970s on the spectral reflectance of vegetation, the use of multispectral satellite imagery for vegetation monitoring and, ultimately, the Normalized Difference Vegetation Index (NDVI). At AGU we saw that NDVI…

The Fall Meeting of the American Geophysical Union is one of the largest annual gatherings of scientists. Some 22,000 have descended on San Francisco to discuss topics from the magnetosphere and heliosphere to river erosion mapping using airborne laser scanning and the use of GRACE for water budget estimation in the Colorado River basin. Here are some of my highlights from the first half of the meeting: Google is coming (cheer or run for cover, as appropriate). Google have laid out their new Earth Engine, an online, in-the-cloud satellite image processing system. Yup, they really are taking over the…