USGIF GotGeoint Blog

USGIF promotes geospatial intelligence tradecraft and a stronger community of interest between government, industry, academia, professional organizations and individuals focused on the development and application of geospatial intelligence to address national security objectives.

October 30, 2018

A "GDAL Coordinate System Barn Raising" has raised $144,000 to fund improvements to the coordinate system libraries underlying the GDAL, PROJ, libgeotiff, PostGIS, and Spatialite open source toolchain. Support came from 25 players in the geospatial market, including SAFE Software, ESRI, Boundless, Azavea, LINZ, MapBox, Riegl, Radiant Solutions, and others.

The objective of this effort is to give GDAL, PROJ, and libgeotiff modern capabilities, including support for the updated OGC well-known text standard WKT2, replacing CSV tables with SQLite-based databases, no longer requiring projection transformations to pivot through WGS84, and preparing for the new North American datums NATRF2022 and NAPGD2022, which I blogged about previously.

The use of SQLite-based databases for coordinate system definitions will allow projects to add more capability, transition custom project coordinate systems to something more universally consumable, and promote interoperability between different software packages.
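To make the SQLite idea concrete, here is a minimal sketch of how software might look up a coordinate reference system in such a database. The table layout and the abridged WKT value are hypothetical, invented for illustration; the actual schema adopted by PROJ is not described in this post.

```python
import sqlite3

# Build an in-memory database with a hypothetical CRS table. The real
# PROJ database schema differs; this only illustrates the SQLite-backed
# approach that replaces the old CSV lookup tables.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE crs (
    auth_name TEXT, code TEXT, name TEXT, wkt TEXT,
    PRIMARY KEY (auth_name, code))""")
conn.execute(
    "INSERT INTO crs VALUES (?, ?, ?, ?)",
    ("EPSG", "4326", "WGS 84", 'GEOGCRS["WGS 84",...]'))

# Look up a CRS definition by authority and code, as software built on
# the toolchain might do instead of scanning CSV files.
row = conn.execute(
    "SELECT name, wkt FROM crs WHERE auth_name=? AND code=?",
    ("EPSG", "4326")).fetchone()
print(row[0])  # WGS 84
```

Because SQLite files are single, portable, indexed databases, a custom project CRS added this way is immediately consumable by any other tool that reads the same database, which is the interoperability point made above.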

Well-known text (WKT) is a text markup language for describing the spatial reference systems of spatial objects and the transformations between spatial reference systems. The updated standard, known as "WKT 2", was adopted by the Open Geospatial Consortium and ISO in 2015. OGC WKT2 fixes longstanding discrepancies in coordinate system definitions that hampered interoperability, and adds constructs for describing time-dependent coordinate reference systems. Several countries are updating their geodetic infrastructure to include time-dependent coordinate systems; for example, Australia and the United States are adopting time-dependent coordinate systems in 2020 and 2022, respectively. The familiar NAD83 and NAVD88 in North America are being replaced by NATRF2022 and NAPGD2022, and the industry will have to adapt to these changes.
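For readers who have not seen the newer syntax, a WKT2 description of a geographic CRS looks roughly like the abridged sketch below (consult the OGC/ISO standard for the authoritative grammar; the exact axis and unit wording here is illustrative):

```
GEOGCRS["WGS 84",
  DATUM["World Geodetic System 1984",
    ELLIPSOID["WGS 84", 6378137, 298.257223563,
      LENGTHUNIT["metre", 1]]],
  CS[ellipsoidal, 2],
    AXIS["geodetic latitude (Lat)", north,
      ANGLEUNIT["degree", 0.0174532925199433]],
    AXIS["geodetic longitude (Lon)", east,
      ANGLEUNIT["degree", 0.0174532925199433]],
  ID["EPSG", 4326]]
```

The explicit axis order, units, and datum parameters are what resolve the old interoperability ambiguities, and the standard also provides elements for marking a reference frame as dynamic with a frame epoch, which is what time-dependent systems like the coming NATRF2022 require.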

The new libraries provide for higher accuracy transformations between coordinate systems that avoid transforming everything through WGS84 and eliminate the extra steps that this entails.

Thanks to Dale Lutz of SAFE Software for pointing me to this during a chat at GeoAlberta 2018.

February 25, 2016

Keep a close watch on the Hawaiian power utilities (HECO), because Hawaii has committed to 100% renewables by 2045 and 65% by 2030. That was the recommendation of Sharon Allan, CEO of the Smart Grid Interoperability Panel (SGIP), at a DistribuTECH2016 MegaSession on distributed energy resources (DER), in response to an audience question about what utilities should do to get ready for GRID 3.0.

Hawaii already has 487 MW of solar PV capacity, 90% of which is residential rooftop panels. At DistribuTECH2016, Talin Sakugawa from HECO and Matthew Shawver from in2lytics gave an insightful presentation on distributed energy resources (DER), big data, and analytics at the Hawaiian power utility. Hawaii has seen an exponential rise in rooftop solar PV, and as a result the utility has been experiencing a reduction in net load and revenue since 2010. The most dramatic drop in load is during daytime hours; comparing the utility's load on clear and cloudy days makes the impact of residential solar PV readily apparent.

Challenges facing utilities

The Rocky Mountain Institute (RMI) has published an analysis of the economics of leaving the grid. The central thesis is that solar PV combined with electricity storage enables consumers to leave the grid completely. Solar PV has already reached grid parity in about 10% of the U.S., and declining PV prices suggest that this trend will continue. However, without the ability to store electric power, consumers with rooftop PV still require the grid for nights and cloudy days. The development of combined solar PV and battery products from Tesla and others promises to make solar + storage accessible to increasing numbers of consumers. These consumers could form their own microgrid, either by themselves or with their neighbours, and disconnect from the grid. Alternatively, they could become a source of dispatchable power and become an energy provider. Increasingly, they could find that either of these options is economically advantageous. This has serious implications for local utilities: if this trend develops, it will seriously erode the traditional utility revenue base. It could also lead to a completely decentralized grid composed of many microgrids, which from an operations perspective would be more complex to manage than the centralized model in use today.

The other challenge for a utility with a high penetration of intermittent, distributed generation is load balancing: ensuring that generation meets demand. The German power industry, the U.S. Department of Energy, Hawaiian power utilities, California power utilities, ERCOT, and many others are working to address this challenge. HECO's presentation at DistribuTECH gave an insightful overview of what they are doing to meet this technical challenge.

Data sources

HECO collects a large amount of data from diverse sources, comprising internal operational data plus data from external sources. It includes weather forecasts, data from customer-sited PV, consumer/public resource data, phasor (PMU) data, renewables power quality, feeder data, irradiance meters, generator and substation power quality, SCADA, and vendor data such as forecasts and data from Independent Power Producers (IPPs). The data has widely different time resolutions, ranging from real time, such as PMUs sampling 30 times per second and SCADA and calculated gross load reporting every 2 seconds, to PV inverter data reported every 5 minutes and aggregated monthly, weather forecasts reported every 15 minutes and aggregated daily, transformer data collected every 15 minutes and aggregated monthly, and smart meter data reported every 15 minutes and aggregated quarterly. Much of the data includes location. The data comes in different formats and has different levels of quality.
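Before feeds at such different native resolutions can be analyzed together, they have to be aligned to a common time base. The presentation did not describe HECO's actual pipeline, but a minimal sketch of the idea, using simulated 2-second SCADA samples averaged into 15-minute buckets, might look like this:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def bucket_15min(samples):
    """Average (timestamp, value) samples into 15-minute buckets."""
    buckets = defaultdict(list)
    for ts, value in samples:
        # Truncate the timestamp down to its 15-minute boundary.
        floored = ts.replace(minute=ts.minute - ts.minute % 15,
                             second=0, microsecond=0)
        buckets[floored].append(value)
    return {ts: sum(v) / len(v) for ts, v in sorted(buckets.items())}

# Simulated SCADA feed reporting a steady 50 MW every 2 seconds
# for half an hour (values are illustrative, not HECO data).
start = datetime(2016, 2, 25, 12, 0)
scada = [(start + timedelta(seconds=2 * i), 50.0) for i in range(900)]
aligned = bucket_15min(scada)
print(len(aligned))  # two 15-minute buckets
```

The same bucketing can be applied to the 5-minute inverter data or 15-minute smart meter data, after which the series share timestamps and can be joined for analysis.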

Fundamental to distributed energy are weather maps. Maps of irradiance and wind velocity can be used directly to estimate or predict solar and wind generation in different parts of the islands. Temperature maps are also important because the efficiency of solar panels depends on temperature. The more accurate the weather forecasts, the more predictable the generation. Knowing that it will be cloudy over the south of Oahu but sunny over the rest of the island, or cloudy over the whole island but with moderate to strong winds, allows operations to estimate how much backup generation is required and where.
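The link between irradiance, temperature, and PV output can be sketched with a standard textbook first-order model (this is not HECO's model; the temperature coefficient below is a typical value for silicon panels, and the scenario numbers are invented):

```python
def pv_output_kw(rated_kw, irradiance_w_m2, cell_temp_c,
                 temp_coeff=-0.004):
    """First-order PV output estimate: scale rated power by irradiance
    relative to standard test conditions (1000 W/m^2, 25 C), then
    derate for cell temperature. temp_coeff is a typical silicon
    value of about -0.4% output per degree C above 25 C."""
    power = rated_kw * (irradiance_w_m2 / 1000.0)
    power *= 1.0 + temp_coeff * (cell_temp_c - 25.0)
    return max(power, 0.0)

# Illustrative sunny vs. cloudy estimates for 487 MW of installed PV.
sunny = pv_output_kw(487_000, 950, 45)   # hot, clear afternoon
cloudy = pv_output_kw(487_000, 200, 30)  # overcast, cooler
print(round(sunny), round(cloudy))
```

Applying a model like this cell by cell over an irradiance and temperature map is what turns a weather forecast into a spatial generation forecast, and the gap between the two scenarios is roughly the backup generation operations would need to cover.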

EMS Integration

Data from new sources relating to distributed renewable generation are integrated with traditional sources in a distributed energy management system, which gives operations and planning visibility into how distributed generation is impacting the grid. It not only provides a view into the status of the grid (situational awareness) but also allows simulations to assess the impact of different future renewable generation scenarios on the grid. This helps determine where backup generation may be required because of high demand and low renewable generation, or where curtailment is needed when there is excess generation and insufficient load. Location is fundamental to the analytics because wind and irradiance vary widely over the islands.

Traditional input sources include SCADA and data from transmission, generators, protection, and so on. The new input data sources include GIS-based infrastructure models and data from the distributed generation sources. Also new is weather forecasting which enables predicting generation from renewable sources. It also includes historical and actual weather reports, and satellite and other weather data.

HECO uses an IT platform from in2lytics (a spinoff from Referentia) that integrates all of these time series data with EMS, SCADA, CIS, and GIS and makes them available to operational users, planners, modelers, and the public. in2lytics is a high-performance time series database specially designed to provide instant data accessibility for planning and operational decision making. It enables loading, querying, analyzing (including with MATLAB), and sharing the "big data" required to monitor and manage today's complex electric grid, and it has native high-performance interfaces for MATLAB in addition to other programming languages. It translates data from different sources into its internal time series database, and all data is archived in a spatially enabled time series database for future analysis. Matthew said that in2lytics is able to analyze large volumes of historical time series data very rapidly; for example, a longitudinal analysis of two years' worth of archived data can be run in an hour or two.

Applications

Accounting for renewables requires a lot of data. Areas that use this data include generation planning, load forecasting, distribution planning, and contract administration. One of the first applications is a customer-facing web site, Renewable Watch - Oahu, which reports total renewable generation, solar and wind, on Oahu in real time.

An example of an application that will be used for contract administration beginning next month estimates how much power is lost from Independent Power Producers (IPPs) in the case of curtailment. It uses wind speed estimates from weather information and readings from irradiance meters to estimate wind and solar generation potential.
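The underlying calculation is simple in principle: during curtailed intervals, lost energy is the gap between what the weather says the plant could have produced and what it was allowed to deliver. A hedged sketch (the function, interval format, and numbers below are invented for illustration, not the contract-administration application itself):

```python
def curtailed_energy_mwh(intervals):
    """Estimate energy lost to curtailment. Each interval is a tuple
    (hours, potential_mw, delivered_mw), where potential is derived
    from wind speed or irradiance measurements for that interval."""
    lost = 0.0
    for hours, potential_mw, delivered_mw in intervals:
        # Only count intervals where potential exceeded delivery.
        lost += max(potential_mw - delivered_mw, 0.0) * hours
    return lost

# A windy morning where a hypothetical IPP wind farm was curtailed
# for two of three hour-long intervals.
intervals = [(1.0, 30.0, 30.0),   # no curtailment
             (1.0, 28.0, 20.0),   # curtailed by 8 MW
             (1.0, 25.0, 18.0)]   # curtailed by 7 MW
print(curtailed_energy_mwh(intervals))  # 15.0
```

Summing this over a billing period gives the curtailed energy figure that a power purchase contract might compensate.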

Another analytical application was used to study how solar PV variability affects transformer tap changes. This application allows HECO to relate frequency of tap changes to solar PV variability and to determine which transformers are most affected by solar variability.
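Relating tap-change frequency to PV variability is, at its simplest, a correlation analysis across the two time series. A stdlib-only sketch with invented daily figures (not HECO's data or method):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily figures for one transformer: a PV variability
# index vs. the number of tap changes that day. Strong correlation
# would flag this transformer as sensitive to solar variability.
variability = [0.1, 0.4, 0.2, 0.8, 0.6, 0.9, 0.3]
tap_changes = [3, 9, 5, 18, 14, 21, 7]
r = pearson(variability, tap_changes)
print(round(r, 2))
```

Ranking transformers by this coefficient is one plausible way to identify which units are most affected by solar variability and are wearing out their tap changers fastest.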

HECO has a number of projects with the Department of Energy and other partners.

Department of Energy's Integrating System to Edge-of-Network Architecture and Management (SEAMS): a federally funded research initiative on high-penetration grids which aims to streamline grid planning and operations for utilities in regions with high concentrations of distributed generation (DG) resources.

Department of Energy's Distributed Resource Energy Analysis and Management System (DREAMS): a system to provide useful information about the distributed grid to grid operators.

Making sense of synchrophasor data for utilities: the Synchrophasor Visual Integration and Event Evaluation for Utilities (SynchroVIEEU) with High Penetrations of Renewables project aims to accelerate the integration of synchrophasor information into production-grade data visualization and analysis platforms and models, to leverage PMU capability at many substations to provide real-time visibility and real-time data, and to make synchrophasor data accessible for efficient and reliable operation of a modern grid with high penetrations of renewable resources.

Partners: SEL, DNV GL, Referentia

Monitoring, planning, and modeling a high-PV-penetration microgrid: decision support for microgrid design and operation at Marine Corps Base Hawaii. The Office of Naval Research (ONR) has launched an ambitious program to demonstrate and evaluate energy technologies using Navy and Marine Corps facilities as test beds, known as the Energy Systems Technology and Evaluation Program (ESTEP). Program management is being handled by the Space and Naval Warfare Systems Command (SPAWAR) Systems Center Pacific (SSC Pacific). ESTEP was established in 2013. It brings together the Department of the Navy, academia, and private industry to investigate and test emerging energy technologies at Navy and Marine Corps installations. At present, ESTEP conducts over 20 in-house government energy projects, ranging from energy management to alternative energy and storage technologies.

Other projects include

Remote light sensor technology at the Kahuku wind farm on Oahu to measure and forecast wind speed and direction to support reliability.

The ultimate objective for HECO is to become a company offering diversified services that provide value to engaged customers, while keeping regulators satisfied and costs and margins sustainable.

Big data and spatial-temporal analytics

It was clear from this presentation that weather maps (forecast, actual, and historical) are essential to HECO, or to any utility with a high penetration of solar or wind, because they enable the utility to project generation. They also allow utilities to analyze the historical record in conjunction with other data to discover trends that may help improve forecasting. Collecting and analyzing temporal geospatial data is fundamental to distributed energy management, in addition to the expanded application of geospatial technology for utility operations and planning in the smart grid era.

December 23, 2014

Urbanization is a worldwide phenomenon. Smart city technology is becoming an essential element in the development of the world's megacities. For example, the new Indian government's just-released budget includes an allocation for initiating the development of 100 smart cities. Songdo IBD in Korea and Fujisawa in Japan are two smart cities already under development. China has 36 smart cities in development and a low carbon model city in Tianjin. Singapore plans to become a smart nation by 2015. Iskandar is Malaysia's first smart city. The Delhi-Mumbai Industrial Corridor (DMIC) incorporates smart city concepts.

According to the report "Smart Cities Market - Worldwide Market Forecasts and Analysis (2014 - 2019)", published by MarketsandMarkets, the global smart cities market is forecast to grow from $410 billion in 2014 to $1.1 trillion by 2019 at a compound annual growth rate (CAGR) of 22.5%. This includes smart homes, intelligent building automation, energy management, smart health, smart education, smart water, smart transportation, smart security, and related services. Most of this activity is expected to occur in Asia and the Middle East.
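The report's headline figures are internally consistent, as a quick compound-growth check shows:

```python
# Sanity-check the MarketsandMarkets forecast: $410 billion in 2014
# compounding at 22.5% per year for the five years to 2019.
start_billion = 410.0
cagr = 0.225
years = 5
projected = start_billion * (1 + cagr) ** years
print(round(projected))  # about 1131, i.e. roughly $1.1 trillion
```

So the stated CAGR does indeed carry $410 billion to approximately $1.1 trillion over the 2014-2019 window.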

Navigant Research forecasts that cumulative global investment in smart city technologies, including smart grid, advanced water monitoring systems, transportation management systems, and energy efficient buildings, could total $174.4 billion from 2014 to 2023, growing from $8.8 billion annually in 2014 to $27.5 billion in 2023. Navigant forecasts that annual smart city technology investment in Asia Pacific will increase from $3.0 billion in 2014 to reach $11.3 billion in 2023.

What are cities already doing ?

In a recent study by Arup and University College London, Delivering the Smart City, the researchers analysed the spending patterns of eight U.K. cities: Leicester City Council, Manchester City Council, Leeds City Council, Portsmouth City Council, Sheffield City Council, Liverpool City Council, Bristol City Council, and Coventry City Council. These are medium-sized cities with populations between 200,000 and 760,000.

This type of analysis has become much easier because of the open data movement in the U.K. which has generated significant amounts of accessible data about government spending and procurement. The analysis showed that the eight cities were spending on average 6% of their expenditure on information technology (IT). That's an average of £23 million a year on IT. Four of the eight spent between 8 and 10% of their budgets on IT. To put this in context the proportion that these city governments are spending on IT is more than many industries and is comparable to the banking and financial services industry which spends on average 8% of operating expenditure on IT.

This research shows that cities are already investing a significant amount on something which is a foundation for smart city technology. IT already underpins many services offered by city governments today including public security and health, transportation, public works, natural resource management, and permitting and licensing.

Other research also shows that city governments globally are spending a significant proportion of their budgets on IT. Gartner analysed the IT spending patterns of 99 local governments across 80 countries and found that IT accounted for 3.8% of their total operating expenses. In the U.S., local governments, including 3,200 counties and 19,000 cities, spent approximately $34 billion on traditional IT goods and services in 2013.

According to Gartner, national and regional governments in the Middle East and North Africa will spend US $12.2 billion on IT products and services in 2014, up 1.3% from 2013. This includes internal services, software, IT services, data center, devices and telecom services. Saudi Arabia is investing in various digital government initiatives including King Abdullah Economic City (KAEC). The United Arab Emirates (UAE) is planning multiple smart cities.

City governments are not the only organizations providing IT services to support city operations. For example, installation of 5,000 smart meters in homes and businesses across London involved investment from private companies including EDF Energy, Siemens, Logica, as well as the electricity transmission and network operators, National Grid and U.K. Power Networks, the transport operator Transport for London, and the city government Greater London Authority.

The U.S. spending analysis also showed that local governments, in addition to more traditional IT products and services, are procuring cloud-based services to modernize citizen services and reduce operational costs. An example is online portals for tax collection and business licensing. These IT projects are usually part of wider modernization projects being carried out within departments rather than standalone technology initiatives at the city level.

UK city governments buy IT products and services from large vendors

City governments in the U.K. tend to purchase their IT products and services from large vendors. According to the U.K. analysis of eight cities, large and middle-sized vendors accounted for the overwhelming majority of IT spending (98%), with small businesses accounting for only 2%.

January 27, 2014

On January 6, KamerMaker XL, a 3D printer for printing rooms, was moved to the construction site of the 3D Print Canal House in Amsterdam. The printer uses an 80% bio-based hotmelt plastic developed by a German chemical company.

According to DUS Architects, who developed the printer: "Each room is printed separately on site before being assembled into one house. This way the rooms can be carefully tested in a safe and easy accessible manner. Each room is different and consists of complex and tailormade architecture and unique design features. The structure is scripted and this creates its proper strength but also generates ornament, and allows for new types of smart features, such as angled shading scripted to the exact solar angle. Each printed room consists of several parts, which are joined together as large Lego-like blocks. Both the outside façade as the interior are printed at once, in one element. Within the 3D printed walls are spares for connecting construction, cables, pipes, communication technique, wiring etc."

Each room is structurally independent of the others. After the rooms are printed, they are assembled into connected floors, and then the floors are stacked to form the entire house. The printing of the home and its assembly is projected to take about three years. The grand opening of the construction site will be on Sunday March 2, 2014 in Amsterdam.