
November 19, 2018

We are not on track to constrain global temperature rise to 2°C, let alone 1.5°C, so new technologies are needed to get us back on track. At the Geography2050: Powering our Future Planet conference at Columbia University in New York, several presentations described innovative, potentially disruptive technologies that could dramatically change how electric power is generated and transmitted.

Wireless power transmission

General Rick Deveraux of Viziv Technologies described a way of transmitting electric power wirelessly using a technology called a Zenneck surface wave, which provides a direct wireless connection from a generator to a load. It supports much higher field strengths than conventional Hertzian waves and follows the Earth's curvature. Viziv is currently working on a global demonstration project that involves building a fiberglass transmission tower and field intensity monitoring stations around the globe. An advantage of this technology is that it could deliver power to people who are currently off grid. Testing of this configuration is to start within 30 days, with actual power delivery next year.

On-demand emissionless power generation

Christofer Mowry of General Fusion described how nuclear fusion can provide energy-dense, low environmental impact, manufacturable, and dispatchable power generation. In other words, a fusion installation requires very little land and can be placed near sources of load, avoiding long-haul transmission lines. There are no emissions and very little waste. It does not require fuels that are constrained by supply, as fission does. And it can supply large volumes of power on demand. Investment in fusion has increased dramatically since about 2007.

Cell phones not requiring recharging

Frank Prautzsch of Velocity Technology Partners described a potentially disruptive technology that could affect everyone carrying a cell phone. Thermionic energy conversion (TEC) provides a way of generating power from ambient thermal energy using nanotechnology. Developed by Birmingham Technologies, the Nano-Boxx consists of two plates of dissimilar metals placed less than 10 nanometers apart with a nanofluid in between. An electric current is generated when electrons from one plate vaporize and collect on the other plate. The device that was shown was the size of a postage stamp, and by stacking these units the technology can scale from milliwatts to megawatts: eight or nine will power a cell phone, 800 to 900 a satellite. It can produce power for about eleven years without recharging. To date it has been tested by powering an LG Nexus cell phone for the past few years. It is cheaper to produce than a lithium-ion battery, has 40% more energy density, and has no emissions.

Powering the world for a million years

Kevan Weaver of the Idaho National Laboratory outlined the results of calculations showing that advanced reactors would enable depleted uranium stockpiles and known reserves of uranium to supply 80% of the world's electric power demand for about 2,000 years with no carbon emissions. If you add the uranium in the oceans, there is enough to provide a source of power for a million years. China and India are investing heavily in fission power generation. Micro-reactors and small modular reactors can provide safe power to data centers, remote off-grid locations, and locations where fossil and other fuels are expensive. The latest reactors (Generation III and III+) are much safer than first- and second-generation reactors like those at Fukushima, primarily because they do not require an external source of power for cooling in the case of an emergency shutdown.

May 30, 2018

Chris Holmes of Planet Labs, in his insightful talk at FOSS4GNA (Free and Open Source Software for Geospatial North America) in St Louis about the application of deep learning to geospatial data, identified a challenge in making this technology open. Deep learning algorithms have largely been developed in academia, and as a result the code is for the most part open source. For example, U-Net, a deep neural network model originally developed for medical image segmentation, is open source and has been applied to identifying building footprints (a minimal sketch of the architecture follows below). However, the successful training of deep networks requires thousands of labeled training samples. Labeling involves people manually ground-truthing land use types and other features so that the deep learning algorithms can learn what to recognize, and at present this training data is typically not open. In their presentation at the same conference, Jason Brown and Courtney Whalen, both data scientists at astraea, showed how deep learning trained on publicly available labeled data was applied to track deforestation and reforestation in Mato Grosso, a state in central Brazil.
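
For readers unfamiliar with it, U-Net is an encoder-decoder convolutional network whose skip connections carry fine spatial detail from the encoder over to the decoder, which is what makes it effective at pixel-level segmentation. Below is a minimal Keras sketch of the shape of the architecture; the patch size, layer widths, and class count are illustrative assumptions, not the configuration of the original paper or of astraea's model.

```python
# Minimal U-Net-style encoder-decoder; widths and depth are illustrative.
from tensorflow.keras import layers, Model

def tiny_unet(input_shape=(128, 128, 3), n_classes=2):
    inputs = layers.Input(shape=input_shape)
    # Encoder: two downsampling stages, keeping feature maps for skip connections
    c1 = layers.Conv2D(16, 3, activation="relu", padding="same")(inputs)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(32, 3, activation="relu", padding="same")(p1)
    p2 = layers.MaxPooling2D()(c2)
    # Bottleneck
    b = layers.Conv2D(64, 3, activation="relu", padding="same")(p2)
    # Decoder: upsample and concatenate the encoder feature maps (skip connections)
    u1 = layers.Concatenate()([layers.UpSampling2D()(b), c2])
    c3 = layers.Conv2D(32, 3, activation="relu", padding="same")(u1)
    u2 = layers.Concatenate()([layers.UpSampling2D()(c3), c1])
    c4 = layers.Conv2D(16, 3, activation="relu", padding="same")(u2)
    # Per-pixel class probabilities (e.g. forest / non-forest)
    outputs = layers.Conv2D(n_classes, 1, activation="softmax")(c4)
    return Model(inputs, outputs)

tiny_unet().summary()
```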

Forest cover in Mato Grosso in 2002

This is computationally intensive, so a distributed computation engine built from open source components was used. Spark is a top-level Apache project that enables distributed processing for global-scale computation. RasterFrames, a LocationTech project built on GeoTrellis, is a free and open source toolkit that allows scientists, data scientists, and software developers to process and analyze geospatial-temporal raster data with the same flexibility and ease as any other data type in Spark DataFrames. Using this software, each year of imagery required 6 to 7 hours of computation on 48 cores.
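
To give a feel for what the distributed step looks like, here is a rough PySpark sketch of the monthly-mean and yearly-aggregate computation; the input path and the flattened (year, month, ndvi) column layout are assumptions for illustration, not astraea's actual RasterFrames pipeline.

```python
# Sketch of the aggregation step in PySpark. Assumes per-pixel NDVI values
# have already been flattened into a DataFrame with year/month/ndvi columns,
# a simplification of the RasterFrames tile-based layout.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ndvi-aggregates").getOrCreate()

# Hypothetical path; in practice the data would come from RasterFrames reads
ndvi_df = spark.read.parquet("s3://example-bucket/mato-grosso-ndvi/")

# Monthly mean NDVI across the scene
monthly = ndvi_df.groupBy("year", "month").agg(F.mean("ndvi").alias("monthly_ndvi"))

# Yearly aggregate computed from the monthly means
yearly = monthly.groupBy("year").agg(F.mean("monthly_ndvi").alias("yearly_ndvi"))

yearly.orderBy("year").show()
```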

The imagery used was captured by the MODIS instrument, flown on NASA's Terra and Aqua satellites, for the years 2001 through 2017. MODIS measures the reflectance of ground cover in several bands, including red, green, blue, shortwave infrared, and near infrared. Its sensors have a spatial resolution of 500 by 500 meters and a revisit rate of one to two days. From the red and near-infrared bands the normalized difference vegetation index (NDVI) can be calculated, and from that monthly means and yearly aggregates are derived.
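
NDVI itself is a simple band ratio, (NIR − Red) / (NIR + Red); here is a minimal NumPy version, assuming the bands arrive as floating-point reflectance arrays:

```python
# NDVI = (NIR - Red) / (NIR + Red); values run from -1 to 1, and healthy
# vegetation, which reflects strongly in the near infrared, trends high.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    denom = nir + red
    with np.errstate(divide="ignore", invalid="ignore"):
        index = (nir - red) / denom
    # Zero out water / no-data pixels where both reflectances are zero
    return np.where(denom == 0, 0.0, index)
```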

Forest cover in Mato Grosso in 2011

The training data came from the System for Terrestrial Ecosystem Parameterization (STEP), which has 2,000 manually labeled sites covering 17 different land cover types, including five forest types, scattered across all continents. The model was trained on 2012 MODIS data, with 80% of the labeled data used for training; after training was completed, the remaining 20% was used to test the model.
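
The 80/20 holdout described above is the standard way to keep evaluation honest; a minimal scikit-learn sketch, where the synthetic X and y merely stand in for per-site MODIS features and STEP land cover labels:

```python
# Illustrative 80/20 train/test split; the random arrays stand in for
# features and labels derived from the 2,000 STEP sites and 17 cover types.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 7))       # placeholder per-site band statistics
y = rng.integers(0, 17, size=2000)   # placeholder labels for 17 cover types

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y  # keep class balance
)
```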

Comparison of rate of forest loss with Global Forest Watch for 2001 to 2016

After training and testing, the first application was to Mato Grosso in central Brazil, a large state that has seen extensive deforestation. The rate of deforestation tracked the rate estimated independently by Global Forest Watch for the years 2001 to 2017. The major feature, a slowdown in the rate of deforestation in 2011, probably the result of increased enforcement by the state government, is very clearly discernible.

The successful application of the deep learning technology in Mato Grosso has encouraged astraea to apply this approach globally. They also intend to use higher-resolution satellite data and to handle seasonal differences better.

About STEP

The System for Terrestrial Ecosystem Parameterization (STEP) is a model for deriving vegetation and land surface parameters from remote sensing data for use in remote sensing-based classification of land cover, ecosystems, and vegetation types. The model defines parameters that relate to important ecological and biogeophysical properties and that can be reliably measured or inferred from remote sensing, collateral, and field plot data. STEP is maintained as a database of training polygons drawn on high spatial resolution imagery that can be extracted with GIS to produce a global land cover classification. STEP is periodically reviewed to filter out inconsistent sites and augmented to fill gaps in biogeographical coverage. The database was originally created to follow the International Geosphere-Biosphere Programme (IGBP) land cover legend, but it has since evolved to support any number of additional classifications.

September 27, 2017

The EU's 2020 climate and energy package set three goals:

Reduce greenhouse gas emissions by at least 20% from 1990 levels by 2020

Increase the share of renewable energy sources in energy consumption to 20%

Increase energy efficiency by 20%

The legislation driving the 2020 goals comprises the Effort Sharing Decision, the Renewable Energy Directive, and the Energy Efficiency Directive. The EU is on track to meet the 20% target for 2020: in 2015, EU emissions were already 22% below 1990 levels. According to national projections, emissions will decrease further to 2020, significantly overshooting the EU's 2020 goals.

As an example, the largest member state, Germany, has about 1.7 million non-residential buildings. Of these, office and administration buildings comprise the largest share (22%), followed by retail (14%), agricultural buildings (14%), and hotels, cafés, and restaurants (13%). Public buildings of the federal, state, and local governments account for about 20% of all non-residential buildings by floor area. Efforts to reduce energy consumption in buildings between 2000 and 2012 resulted in a decline in space heating consumption from 205 kWh/square meter/year in 2000 to 147 kWh/square meter/year in 2012. The central goal of the German Federal Government is to further reduce the heating requirements of new and existing buildings to achieve a nearly carbon-neutral building stock by 2050.

Under the Paris Agreement, the EU has agreed to reduce its greenhouse gas emissions by at least 40% by 2030. In addition, the EU has targets for renewable energy and energy savings of at least 27% by 2030. The Paris Agreement obliges the EU to review its legislation for achieving the 2030 targets before 2018, when world leaders will gather to assess progress on implementation. According to Member States' projections based on existing measures, total EU emissions in 2030 are estimated to be 26% below 1990 levels. Therefore, new mitigation policies are being put in place so that the EU target of at least a 40% domestic reduction in greenhouse gas emissions compared to 1990 is reached by 2030. A 2017 survey found that nearly 90% of EU citizens believe it is important for their national government to provide support for improving energy efficiency by 2030.

Energy-efficient buildings have been one of three EU priorities for meeting the 2020 emissions goals, and this focus will intensify to meet the 2030 goals. The Energy Performance of Buildings Directive (EPBD) and the Energy Efficiency Directive (EED) have been the legislative drivers for improving the energy efficiency of buildings. In Western Europe, intelligent building technologies and net-zero-energy buildings are key to meeting energy efficiency goals. A new Navigant Research report analyzes the market for energy-efficient building technologies in Western and Eastern Europe, with a focus on HVAC, lighting, building controls, water efficiency, water heating, and the building envelope. According to Navigant Research, energy-efficient building technology spending in Western and Eastern Europe reached $83.5 billion in 2017 and is projected to grow steadily to $111.9 billion by 2026.
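
Those two figures imply a compound annual growth rate of roughly 3.3% over the nine years; a quick back-of-envelope check:

```python
# Implied compound annual growth rate from the Navigant spending figures.
start, end, years = 83.5, 111.9, 2026 - 2017   # $ billions over 9 years
cagr = (end / start) ** (1 / years) - 1
print(f"implied growth: {cagr:.1%} per year")  # ~3.3%
```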

August 17, 2017

Last week at the Goldschmidt conference in Paris it was reported that ice cores recovered in Antarctica have been dated to 2.7 million years ago. Previously the oldest Antarctic ice cores, from the Dome C site, which have provided the best information we have about past temperature and atmospheric composition, were limited to the past 800,000 years.

July 12, 2017

The world's coal-fired power plants provide around 40% of global electricity. This fleet is the youngest it has been in decades, with more than 500 GW added since 2010, mostly in emerging economies. Simply shutting these new, and in many cases highly efficient, plants down to meet climate goals would be a political, social, and economic challenge. According to the IEA, retrofitting these plants with carbon capture and sequestration (CCS) will need to be a key strategy in many regions.

There are now 17 large-scale CCS projects operating around the world, with at least one more due to be commissioned this year. China has just announced construction of its first large-scale CCS project in the coal-chemicals sector, with seven further projects in early development. The Petra Nova project in Texas was delivered on time and on budget earlier this year. Petra Nova is the second, and much larger, project to retrofit post-combustion CCS technology to an existing coal-fired power station, with costs reportedly around 20% lower than the world-first effort at Boundary Dam in Canada. On the other hand, work on the most ambitious CCS plant in the world, at the Kemper County Energy Facility in Mississippi, has been suspended: the rapid scale-up of a new integrated gasification combined cycle (IGCC) technology from pilot to a 585 MW plant proved far more complex and challenging than anticipated.

July 10, 2017

Immersed concrete structures constructed in Roman harbours from about 55 BCE to 115 CE have remained intact for 2,000 years; in contrast, modern concrete structures begin to deteriorate after several decades. Caesarea was one of the first projects to use Roman underwater concrete on a large scale. The ingredients in Roman concrete were known to ancient writers. Vitruvius, a Roman architect and engineer, specified in De architectura a ratio of 1 part lime to 2 parts volcanic ash for underwater work. He described the chemical reaction when tuff (an aggregate), volcanic ash from the Campi Flegrei and Vesuvius volcanic districts, and lime “come into one mixture and suddenly take up water and cohere together”. Pliny the Elder wrote in Naturalis Historia that “as soon as volcanic ash comes into contact with the waves of the sea and is submerged it becomes a single stone mass, impregnable to the waves and every day stronger”.

A recent study appearing in American Mineralogist has revealed the minerals and chemical reactions responsible for the strength and durability of Roman concrete. Roman marine structures were analyzed with synchrotron-based X‑ray microdiffraction at Lawrence Berkeley National Laboratory to identify the chemical composition and crystalline structure. Raman spectroscopy was used to study the chemical bonding. The analysis revealed the presence of aluminum tobermorite, a rare mineral, in Roman concrete. The process that makes Roman concrete harder over the centuries is the crystallization of phillipsite and aluminum tobermorite in the presence of water and at low temperature. This process occurs slowly and begins after the initial hardening of the concrete.

Another benefit of Roman concrete is reduced carbon dioxide emissions. Cement manufacture is a major source of carbon dioxide. Modern Portland cement releases carbon dioxide both from burning fuel to heat limestone and clay to 1,450°C and from the chemical process that converts the heated limestone (calcium carbonate) into lime (calcium oxide). It has been found that Roman concrete required much less lime, and that the lime could be derived from limestone baked at 900°C or lower, requiring far less fuel than Portland cement.
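
The process emissions follow directly from the calcination reaction CaCO3 → CaO + CO2 and the molar masses involved; a quick stoichiometric check (this counts only the chemical CO2, not the fuel burned to heat the kiln):

```python
# Process CO2 from calcining limestone into lime: CaCO3 -> CaO + CO2.
# Molar masses in g/mol; fuel-related emissions are excluded.
M_CAO, M_CO2 = 56.08, 44.01

co2_per_kg_lime = M_CO2 / M_CAO
print(f"{co2_per_kg_lime:.2f} kg CO2 released per kg of lime produced")  # ~0.78
```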

March 28, 2017

Emissions Associated with Electric Power Production

Methane represents 9.9% of U.S. greenhouse gas emissions and is at least 20 times more potent than carbon dioxide in warming the atmosphere.

It has been asserted that natural gas power plants, which are responsible for 21% of power generation in the US, produce 50% less emissions than coal-fired plants. However, the amount of methane that is lost (vented or leaked) during production and distribution and within plants has not historically been included in the calculation. Now a new study suggests that methane emissions from power plants and refineries may be much larger than current estimates.

A recent EPA analysis doubled the agency's previous estimates of the amount of methane that leaks from pipes and is vented from gas wells, which significantly changes the emissions picture. Methane (CH4) levels from hydraulic fracturing of shale gas were found to be 9,000 times higher than previously reported. Based on the new numbers, the median gas-powered plant in the United States is estimated to be 40% cleaner than coal-fired plants, according to calculations made by ProPublica. In addition, about half of the 1,600 gas-fired power plants in the US operate relatively inefficiently. In the past these plants were estimated to be 32% cleaner than coal, but with the revised EPA estimates these ~800 inefficient plants are estimated to produce only 25% less emissions than coal.

But there is another issue. Methane is one of the more potent greenhouse gases, but it is not clear just how much more potent than CO2 it is. The EPA has estimated a factor of 21 relative to carbon dioxide, but Robert Howarth, an environmental biology professor at Cornell University, has suggested that it is actually 72 times as powerful in terms of its warming potential. This is critical because, if the climate effect of methane from natural gas is 72 rather than 21 times that of carbon dioxide from burning coal, natural gas may even turn out to be worse than coal in terms of global warming. Howarth has suggested that the type of shale gas drilling taking place in Texas, New York, and Pennsylvania generates particularly high emissions of methane and could be as dirty as coal.

New estimates of methane emissions from power plants

Now a new study reports on overflights of power plants and refineries and finds that methane emissions are much larger than current estimates. Power plants and oil refineries are large consumers of natural gas. The EPA has collected data contributed by operators and estimated the methane (CH4) emissions from these plants, but there is high uncertainty in these estimates. In this study an airborne chemistry lab was used to estimate the methane emissions from three gas-fired power plants and three oil refineries. The average methane emission rates were larger than the operator-reported estimates by 21 to 120 times for the power plants and by 11 to 90 times for the refineries. By comparing the pattern of methane emissions with that of carbon dioxide (CO2) and water vapour, the researchers were able to determine that the methane came primarily from non-combustion processes, suggesting leaks and venting as the sources. Scaling these results to the national level suggests that methane emissions from these types of facilities are 4.4 to 42 times larger than current estimates. The results indicate that gas-fired power plants and oil refineries could contribute significantly to U.S. methane emissions: the estimated contribution of 0.61 teragrams of methane per year (Tg CH4/yr) represents about 2% of total U.S. annual methane emissions of about 30 Tg CH4/yr.

A recently published study has assessed the spatial distribution of anthropogenic methane sources in the United States by combining comprehensive atmospheric methane observations, extensive spatial datasets, and a high-resolution atmospheric transport model. Based on the results of this analysis, the authors conclude that the EPA underestimates methane emissions nationally by a factor of about 1.5. In general, the study found that methane emissions from the animal husbandry and fossil fuel industries have larger greenhouse gas impacts than existing inventories indicate.

One of the motivations for switching from coal to natural gas in these types of facilities is that natural gas delivers the same amount of energy with significantly reduced emissions. But that does not take into account the leaks and other processes that release methane during production, distribution, and within plants. These latest results further reduce the advantage of gas-fired over coal-fired power production.

A study by Robert W. Howarth, Renee Santoro, and Anthony Ingraffea concluded that methane emissions from shale gas wells are between 30% and 100% higher than those from conventional natural gas wells. The study estimates that 3.6% to 7.9% of the methane from shale gas production escapes to the atmosphere in venting and leaks over the lifetime of a well.

As a result, the study found that the greenhouse gas (GHG) footprint of shale gas is greater than that of conventional gas or oil. Compared with coal, which is responsible for nearly 50% of electric power generation in the US, the GHG footprint of shale gas is estimated to be 20% to 100% greater over a 20-year period. Over 100 years, the study concludes, the GHG impact of shale gas is comparable to that of coal.
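
Combining those leak rates with the warming multipliers discussed above shows why the conclusion is so sensitive to both numbers. The sketch below uses only combustion stoichiometry (burning 1 kg of CH4 yields 44/16 ≈ 2.75 kg of CO2) plus the figures quoted in this post; it is an illustration of the sensitivity, not a reproduction of the Howarth analysis.

```python
# CO2-equivalent footprint per kg of natural gas burned, as a function of
# the upstream leak fraction and the CH4 global warming potential (GWP).
# Combustion: CH4 + 2 O2 -> CO2 + 2 H2O, so 1 kg CH4 -> 44/16 = 2.75 kg CO2.

def gas_co2e_per_kg_burned(leak_fraction: float, gwp: float) -> float:
    combustion_co2 = 44.0 / 16.0
    # Delivering 1 kg for burning leaks leak_fraction/(1 - leak_fraction) kg
    leaked_ch4 = leak_fraction / (1.0 - leak_fraction)
    return combustion_co2 + leaked_ch4 * gwp

for leak in (0.036, 0.079):      # the 3.6%-7.9% lifetime leak range above
    for gwp in (21.0, 72.0):     # EPA vs. Howarth multipliers
        co2e = gas_co2e_per_kg_burned(leak, gwp)
        print(f"leak={leak:.1%}, GWP={gwp:.0f}: {co2e:.1f} kg CO2e per kg burned")
```

In this simple model, at the high end of the leak range with the higher multiplier, the warming impact is more than triple that of combustion alone, which is the crux of the gas-versus-coal comparison.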

March 27, 2017

Around 250 million measurements taken by CryoSat between 2010 and 2016 were used to create the most comprehensive 3D picture of Antarctic ice currently available. The satellite's orbit takes it to latitudes within 200 km of the north and south poles, closer than other Earth observation satellites, which allows CryoSat's radar altimeter to detect tiny variations in the height of the ice across the entire continent, including on the steeper continental margins where the vast majority of ice losses occur. The mission is also used to map changes in the thickness of ice floating in the polar oceans, which is particularly important for the Arctic.

The satellite measurements allow scientists to distinguish between changes in topography and changes in ice motion when working with other satellite measurements, such as those used to calculate the balance between how much the ice sheet gains by accumulating snow and loses through melting and iceberg calving. The model will soon be freely available via the CPOM portal, which already provides information on sea-ice volume and thickness and ice velocity, and will shortly cover ice sheets. In the meantime, the model can be downloaded.

December 26, 2016

On December 21, 2016 China launched a 620 kg micro-satellite on a Long March-2D rocket to monitor its greenhouse gas emissions. The TanSat satellite is in a sun-synchronous orbit 700 km above the Earth and will monitor the concentration, distribution, and flow of carbon dioxide in the atmosphere. TanSat will take readings of global carbon dioxide every 16 days, accurate to at least 4 parts per million.

Japan

On January 23, 2009, Japan launched the Greenhouse Gases Observing Satellite (GOSAT), developed by the Japan Aerospace Exploration Agency (JAXA). It measures the densities of carbon dioxide and methane at 56,000 locations in the Earth's atmosphere. The Japanese Ministry of the Environment and the National Institute for Environmental Studies (NIES) use the data to track greenhouse gases.

U.S.

On July 2, 2014 NASA launched the OCO-2 satellite on a Delta II rocket to study carbon dioxide concentrations and distributions in the atmosphere. OCO-2 makes measurements in three different spectral bands over four to eight footprints of approximately 1.29 km × 2.25 km each. About 24 soundings are collected per second while in sunlight, of which about 10% are sufficiently cloud-free for further analysis. One spectral band is used for column measurements of oxygen (the A-band), and two are used for column measurements of carbon dioxide.
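
Taken at face value, those rates put OCO-2's daily yield at roughly a million soundings, a tenth of which survive cloud screening; a back-of-envelope check, with the sunlit fraction of each orbit assumed to be one half:

```python
# Back-of-envelope daily sounding count from the figures above.
soundings_per_sec = 24
sunlit_fraction = 0.5        # assumption: about half of each orbit is sunlit
cloud_free = 0.10            # from the text: ~10% usable

per_day = soundings_per_sec * 86_400 * sunlit_fraction
print(f"~{per_day:,.0f} soundings/day, ~{per_day * cloud_free:,.0f} usable")
```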

Canada

On June 21, 2016 Canada's GHGSat Inc. launched its Claire nanosatellite. Claire can measure emissions from any industrial site in the world. Greenhouse gases can be monitored with a precision of parts per million or parts per billion, depending on the gas, and with a spatial resolution of tens of meters. On December 9, 2016, Claire measured emissions from a cement plant in South Africa, its 500th measurement.

December 10, 2016

The Canadian government has announced a pan-Canadian policy on climate change, including a floor price on carbon beginning in 2018. The Paris Agreement commits Canada to a 30% reduction in greenhouse gases from 2005 levels by 2030. The floor price of carbon starts at a minimum of $10 per tonne of GHG emissions in 2018, rising by $10 each year to $50 per tonne by 2022. All of the money raised will be returned to the respective provinces, which can in turn distribute those funds to their citizens in the form of tax cuts. The federal program will be compatible with existing provincial carbon pricing programs: Ontario and Quebec already have cap-and-trade programs, British Columbia already has a carbon tax of $30 per tonne, and Alberta has announced a $20 per tonne carbon tax beginning in 2017, rising to $30 per tonne in 2018. Eight of ten provinces have signed on to the agreement; the federal government said it will impose a carbon price on the two non-signatory jurisdictions, Saskatchewan and Manitoba. The final agreement includes a list of already announced measures, including new building codes to improve building energy efficiency, more charging stations for electric cars, expanded clean electricity sources, and upgraded power grids.