USGIF GotGeoint Blog

USGIF promotes geospatial intelligence tradecraft and a stronger community of interest between government, industry, academia, professional organizations, and individuals focused on the development and application of geospatial intelligence to address national security objectives.

October 02, 2018

Waze, now part of Google, is a crowdsourced navigation app used by 100 million drivers worldwide. It is supported by 500,000 volunteer map editors and 600 partners. Its connected partners program includes 40% of top cities and 60% of state departments of transportation (DOTs). Notably, Waze has developed a way for governments to report planned and actual road construction, which lets drivers plan routes ahead of time. This offers a single database that could help avoid the problem of the same stretch of road being dug up multiple times by different utilities, sometimes within a few weeks of each other.

Several years ago I blogged about Koordination im öffentlichen Raum (spatial, temporal, and financial coordination of city planning, traffic changes, and construction projects on public land), the online web-based system of the City of Bern, Switzerland. It is a system for coordinating street works that achieves some of the same goals as the U.K.'s Traffic Management Act 2004. It is location aware and enables planners to avoid serial excavations of the same stretch of street or road by coordinating all construction from multiple agencies (Federal, Kantonal, and city street and highway agencies; water/wastewater, gas, and electricity utilities; and telecommunications companies) so that a section of street or road is not dug up more than once every five years. In its first year of operation, the coordination system was estimated to have saved SFr 7 million.

At GIS in the Rockies, Saila Hanninen of Waze described some of the remarkable ways that Waze's data and network are being used for purposes beyond what they were initially intended for.

During and after Hurricane Sandy in October 2012, the White House called Waze for help determining which gas stations were open in the areas hit by the storm.

For the Rio de Janeiro Olympics, Waze helped determine where to place three rapid transit lanes, which resulted in a 27% reduction in congestion.

Waze helped Ghent redesign its city centre to reduce traffic by 40%, which also resulted in fewer accidents and increased bicycle and transit usage.

Boston asked Waze to help with signal timing; changing the timing alone resulted in an 18% reduction in congestion.

During the Pope's visit to Philadelphia, Waze was responsible for a 20% reduction in congestion.

40% of accidents are reported faster on Waze, which results in an average response time five minutes faster.

Waze intends to become more proactive. It now has a crisis response team that can push messages to Waze users in areas affected by a disaster to collect information about road closures, the state of the roads, which gas stations are open, shelter availability, and safe routes to shelter.

Waze's vision is to eliminate traffic altogether. Doing so requires not only information about the current state of congestion on roads, but also about anything that could potentially affect traffic in the future. Knowing when road excavations, and major excavations near roads, are planned is essential. If the Waze partner program included everyone planning excavations on or near roads, it would amount to a database with all the information required to coordinate excavations by utilities, telecoms, and governments. As in Bern, this could mean that roads would not be dug up more often than once in five years, which would reduce traffic congestion and save money.
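The core of such a coordination database is a simple check: flag any two planned excavations that would dig up the same road segment within the coordination window. A minimal sketch, using hypothetical field names rather than any actual Waze or Bern schema:

```python
from datetime import date

# Hypothetical records of planned excavations; segment_id, agency,
# and start are illustrative field names, not a real schema.
planned = [
    {"segment_id": "A-104", "agency": "water",   "start": date(2019, 3, 1)},
    {"segment_id": "A-104", "agency": "telecom", "start": date(2019, 9, 15)},
    {"segment_id": "B-207", "agency": "gas",     "start": date(2020, 5, 1)},
]

def find_conflicts(records, window_years=5):
    """Flag pairs of excavations that would dig up the same road
    segment more than once within the coordination window."""
    conflicts = []
    by_segment = {}
    for r in records:
        by_segment.setdefault(r["segment_id"], []).append(r)
    for seg, recs in by_segment.items():
        recs.sort(key=lambda r: r["start"])
        # Compare each excavation with the next one on the same segment.
        for a, b in zip(recs, recs[1:]):
            gap_days = (b["start"] - a["start"]).days
            if gap_days < window_years * 365:
                conflicts.append((seg, a["agency"], b["agency"], gap_days))
    return conflicts

print(find_conflicts(planned))
# Flags the two A-104 excavations, scheduled only 198 days apart.
```

A real system would also need segment geometry to catch excavations on overlapping but differently identified stretches of road; this sketch assumes shared segment identifiers.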

March 05, 2017

Nearly half of the world’s population is at risk of malaria. In 2015, there were roughly 212 million malaria cases and an estimated 429,000 malaria deaths.

On Thursday, March 3rd 2016, a world-record humanitarian mapathon took place at Politecnico di Milano in northern Italy. Two hundred and twelve 10-year-old children from nine classes at six elementary schools in Milan province had the unique opportunity of meeting researchers from GEOlab (Geomatics and Earth Observation laboratory) and HOClab (Hypermedia Open Center laboratory) of Politecnico di Milano, who introduced them to humanitarian mapping. In cooperation with the Humanitarian OpenStreetMap Team (HOT) and Missing Maps, the team led by Prof. Maria Antonia Brovelli, Dr. Marco Minghini, Dr. Monia Molinari and Dr. Aldo Torrebruno showed the children how to map buildings in the northernmost part of Swaziland in a project for malaria elimination. The results of the mapathon were remarkable: more than 40,000 edits to the map and more than 5,000 buildings mapped in only a few hours of work. Knowing that their maps would be used to help local people in Swaziland, the children performed the mapping task with unique passion and enthusiasm. Some of the young mappers did not want to return to class because they still had buildings to map in their task areas.

The Humanitarian OpenStreetMap Team is calling on all mappers to help HOT and its partners eliminate malaria with its biggest project to date. HOT is working on two projects, mapping over 500,000 square kilometers in seven countries. This collective effort is part of the Missing Maps initiative in conjunction with DigitalGlobe, the Clinton Health Access Initiative, the Gates Foundation, PATH, and Mapbox.

February 01, 2017

Every time I attend DistribuTECH I come across something that simply knocks my socks off. Alan McMorran of Open Grid Systems has been working with Scottish and Southern Electricity Networks (SSEN) and Electricity North West (ENW) on a "Grid Reporter" application that crowdsources information about outages from users with smartphones.

When someone encounters a downed line, a power outage, or damaged equipment, they can report the problem from a smartphone with a minimum of hassle: no logging in or entering a location or address. The application uses the phone's GPS to determine location. It allows the person reporting the problem to take a picture, and it records the direction (heading) the phone was pointed when the picture was taken.

On the server side, the application takes the location and heading and does a geospatial lookup to identify nearby candidates for the specific piece or pieces of equipment the user has identified. If the user's phone is on a wireless network, it will even suggest which facilities may be the problem and request clarification from the user. Based on the information provided by the user and the user's smartphone, the application generates an outage report with details, including the specific CIM data elements needed by the utility's outage staff to respond. The server side requires less than 100 milliseconds of processing to respond to the user.
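The location-plus-heading lookup can be sketched simply: compute the bearing from the user to each nearby asset and keep those that fall within a tolerance cone around the phone's compass heading. This is a minimal illustration of the idea, not Open Grid Systems' implementation; the asset records, identifiers, and 30-degree tolerance are all assumptions.

```python
import math

# Illustrative asset records (not real CIM data): identifier, lat, lon.
assets = [
    {"mrid": "xfmr-001", "lat": 53.4810, "lon": -2.2420},
    {"mrid": "pole-017", "lat": 53.4831, "lon": -2.2450},
    {"mrid": "line-042", "lat": 53.4900, "lon": -2.2300},
]

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def candidates(user_lat, user_lon, heading, tolerance=30):
    """Assets whose bearing from the user is within +/- tolerance
    degrees of the phone's compass heading."""
    hits = []
    for a in assets:
        b = bearing_deg(user_lat, user_lon, a["lat"], a["lon"])
        # Smallest angular difference between bearing and heading.
        diff = abs((b - heading + 180) % 360 - 180)
        if diff <= tolerance:
            hits.append(a["mrid"])
    return hits
```

A production lookup would first filter by distance (e.g. with a spatial index) before applying the bearing test; this sketch checks every asset.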

The server-side application resides in the cloud and is scalable, so that in a storm situation, when many reports are being fielded, it can rapidly scale up the necessary computing resources.

Another feature is the ability to push outage status messages to the users reporting problems, so that they are kept aware of any danger and of progress in resolving the problem.

This strikes me as a way to dramatically improve the collection of outage information by broadening the base of potential reporters at very low cost. It also enhances the customer-utility interaction, and because it provides feedback, it encourages customers and other users to report problems. In the UK there is another important driver stemming from how the electric power industry is organized: customers often do not know whom to call when they encounter a problem, the network operator or the retailer they buy power from. With this application they don't need to know, because the application automatically connects them to the right people.

From an implementation point of view, the server-side application assumes access to a network model that uses the Common Information Model (CIM) electric power distribution standard. Any utility with a CIM-compliant network model should be able to integrate the application without a great deal of difficulty.

December 30, 2016

Earlier this year, on a Sunday, renewable power supplied almost all of Germany’s power demand, a major milestone in Germany's “Energiewende” policy. The transformation of the grid (called by some a revolution) is being driven by distributed energy resources (DER) and smart devices. It requires profound changes to how we monitor and manage the power grid, and understanding the impact of integrating more distributed, intermittent sources and new consumer, industrial, and commercial electronics requires data and tools that exist only partially today.

If you are familiar with OpenStreetMap (OSM), the OpenGridMap project is not that different, except that it focuses on mapping electric power infrastructure instead of transportation. I had heard rumours about an OSM-like project in Germany to map electric power infrastructure several years ago, but was unable to find more information about it. OpenGridMap is a new open community that crowdsources power grid locations and related data for research purposes. The goal is to collect reliable grid data to enable researchers to conduct simulation studies. OpenGridMap is intended to support the entire process from data collection to making grid data available for various applications. An OpenGridMap app for Android, for people interested in helping to collect grid infrastructure data, is available on the Play Store.

Analytics developed at the Technical University of Munich (TUM) use OpenGridMap data to generate a power grid model in the form of a Common Information Model (CIM) description file, which provides the basis for a power grid simulation model. Experiments with OpenGridMap data have demonstrated the effectiveness of this approach for grid modeling and simulation. It is expected that as more data is collected, it will become possible to generate power grid simulations of larger size, greater variety, and higher accuracy than the currently available state-of-the-art test power grids.
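To give a flavour of what "generating a CIM description file" from crowdsourced points might involve, here is a heavily simplified sketch. The input schema, element names, and location encoding are illustrative assumptions; real CIM profiles (e.g. CIM16 RDF/XML) are far richer, and this is not the TUM pipeline.

```python
import xml.etree.ElementTree as ET

# Crowdsourced grid points, as a project like OpenGridMap might collect
# them; these field names are illustrative, not the project's schema.
points = [
    {"id": "sub-01", "type": "Substation",       "lat": 48.1374, "lon": 11.5755},
    {"id": "tx-07",  "type": "PowerTransformer", "lat": 48.1380, "lon": 11.5761},
]

def to_cim_like_xml(points):
    """Serialize crowdsourced points as a minimal CIM-flavoured
    RDF/XML fragment (a toy subset, not a valid CIM profile)."""
    NS = {"rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
          "cim": "http://iec.ch/TC57/2013/CIM-schema-cim16#"}
    for prefix, uri in NS.items():
        ET.register_namespace(prefix, uri)
    root = ET.Element("{%s}RDF" % NS["rdf"])
    for p in points:
        el = ET.SubElement(root, "{%s}%s" % (NS["cim"], p["type"]))
        el.set("{%s}ID" % NS["rdf"], p["id"])
        name = ET.SubElement(el, "{%s}IdentifiedObject.name" % NS["cim"])
        name.text = p["id"]
        # Real CIM models locations via Location/PositionPoint objects;
        # a bare "lon lat" string stands in for that here.
        loc = ET.SubElement(el, "{%s}Location" % NS["cim"])
        loc.text = "%f %f" % (p["lon"], p["lat"])
    return ET.tostring(root, encoding="unicode")

xml_text = to_cim_like_xml(points)
```

The value of targeting CIM is that the resulting file can feed any CIM-aware simulation or network analysis tool without a custom importer.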

The project was initiated by Professor Hans-Arno Jacobsen, Chair of Business Information Systems, and José Rivera at the Technical University of Munich. It is supported by a German Federal Ministry of Education and Research grant and the Alexander von Humboldt Foundation. Collaborators include Siemens AG, the World Bank's ESMAP, the Technical University of Berlin's VEREDELE project, and the KOSiNeK project at Universität Erlangen-Nürnberg (FAU).

April 24, 2015

A new open access journal, Open Geospatial Data, Software and Standards, provides an advanced forum for the science and technology of open data, crowdsourced information, and the sensor web through the publication of reviews and regular research papers. The journal publishes articles that address issues related, but not limited, to the analysis and processing of open geo-data, the standardization and interoperability of open geo-data and services, and applications based on open geo-data. It is also meant to be a space for theories, methods, and applications related to crowdsourcing, volunteered geographic information, the sensor web, and related topics.

March 11, 2015

INSPIRE-Geospatial World Forum 2015, a joint conference organized by the European Commission and Geospatial Media and Communications, has made its full conference program available. Almost 500 presentations are scheduled on topics including building information modelling (BIM), open data, big data analytics, open standards, linked data, cloud computing, crowdsourcing, Earth observation, indoor positioning, land information systems for smart cities, urban resilience and sustainability, health, agriculture, and others. Some 2,000 delegates are expected to attend from more than 80 countries. Top sponsors include Trimble, Topcon, Esri, DigitalGlobe, Oracle, and Bentley.

The theme of the conference is CONVERGENCE: Policies + Practices + Processes via PPP, with a focus on improving coordination among policy-makers, technology providers, and users. Including geospatial data and technology in construction, agriculture, health, and other industry workflows enables more successful public-private partnerships (PPPs) by facilitating more informed decision making among the stakeholders.

January 19, 2015

For the first time in a hundred years, the electric power utility industry is undergoing momentous change. Distributed renewable power generation, especially solar photovoltaics (PV), is introducing competition into an industry that has been managed as regulated monopolies. Consumers with solar PV panels on their roofs are fundamentally changing the traditional utility business model. A recent report from the Edison Electric Institute (EEI) refers to disruptive challenges that threaten to force electric power utilities to change or adapt the business model that has been in place since the first half of the 20th century.

Most utilities are in the midst of deploying smart grids, which basically amounts to applying the Internet to the electric power grid: linking intelligent electronic devices, sensors, and grid control applications to enable data-driven decision making. One of the most important changes driven by the implementation of the smart grid is the much greater importance of location. Geospatial technology (location, geospatial data management, and spatial analytics) is seen as a foundational technology for the smart grid.

The other major global change in energy is the shift in energy demand from the world's advanced economies to emerging economies. Energy demand from OECD countries has hit a plateau. Currently China is driving world energy demand. The International Energy Agency's (IEA) World Energy Outlook 2014 projects that in the future, as demand from China slows, world energy demand will be driven by India, the Middle East, Africa, and Latin America.

Recently IDC Energy Insights released a report, IDC FutureScape: Worldwide Utilities 2015 Predictions, with predictions for the future of the utility business. Some of these are startling, suggesting that the utility industry will experience fundamental changes in how it does business over the next few years.

New business models

IDC predicts that utilities will be looking less at generation as a source of revenue, and that by 2018, 45% of new data traffic in utilities' control systems will originate from distributed energy resources not owned by the utility.

To make up for this loss of generation revenue, IDC predicts that utilities will look for new business opportunities such as services; specifically, that utilities will derive at least 40% of their earnings from new business models by 2017.

Technology

Cloud - By 2018 cloud services will make up half of the IT portfolio for over 60% of utilities.

Integration - In 2015, utilities will spend over a quarter of their IT budgets on integrating new technologies with legacy enterprise systems.

Analytics - By 2017 45% of utilities' new investment in analytics will be used in operations and maintenance of plant and network infrastructure.

Mobility - 60% of utilities will focus on transitioning enterprise mobility to capitalize on the consumer mobility wave.

Important drivers for these trends include: the global redistribution of energy demand from the world's advanced economies to the emerging economies; the rapid emergence of cloud-based provisioning and services; increasing regulatory pressure, responding to customer demand, to improve energy market transparency and competitiveness; cross-industry competition for technical (especially IT) skills; smart analytics; and the beginning of virtual and augmented reality being applied in business.

Top 10 technology trends

In March 2014, Gartner, Inc. identified the top ten technology trends it saw impacting the global energy and utility markets. There is considerable overlap between IDC's business predictions and the technology trends identified by Gartner.

Social Media

Social media are beginning to be used as a customer acquisition and retention medium, as a consumer engagement channel to drive customer participation in energy efficiency programs, as a source of information about outages, and, in an emerging area, for crowdsourcing the coordination of distributed energy resources. Utilities are also using social media to communicate outage information to customers.

Big Data

Smart grid will increase the quantity of data that utilities have to manage by a factor of about 10,000, according to a recent estimate. This trend is driven by intelligent devices, sensors, social networks, and new IT and OT applications such as advanced metering infrastructure (AMI), synchrophasors, smart appliances, microgrids, advanced distribution management, remote asset monitoring, and self-healing networks. The type of data that utilities need to manage will also change: for example, real-time data from sensors and intelligent devices, including smartphones, and unstructured data from social networks will play a much greater role for utilities in the future.

Mobile and Location-Aware Technology

Mobile and location-aware technology, which includes hardware (laptops and smartphones), communication products (GPS-based navigation, routing, and tracking technologies), social networks (Twitter, Facebook, and others), and services (WiFi, satellites, and packet-switched networks), is transforming all industries. Utilities for the most part have been slow to adopt consumer mobile technology, but this is changing.

Cloud Computing

According to Gartner, areas such as smart metering, big data analytics, demand response coordination, and GIS are driving utilities to adopt cloud-based solutions. Early adopters of cloud technologies include small utilities with limited in-house IT skills and budgets; organizations that provide application and data services to multiple utilities, such as cooperative associations and transmission system operators; and investor-owned utilities (IOUs) conducting short-term smart grid pilots.

Sensor Technology

Sensors, which are being applied extensively throughout the entire supply, transmission and distribution domains of utilities, provide a stream of real-time information from which a real-time state of the grid can be derived.

IT and OT Convergence

Virtually all new technology projects in utilities, such as AMI or advanced distribution management systems (ADMSs), will require a combination of IT and OT investment and planning. This will be a challenge for many utilities, especially smaller ones that don't have in-house IT skills.

Internet of Things

Sensors and actuators embedded in physical objects are linked through wired and wireless networks, using the same Internet Protocol (IP) that connects the Internet. When intelligent objects can both sense the environment and communicate, they become tools for understanding utility grids and responding to changes in near real time. Following McKinsey, there are two key benefits arising from the Internet of Things for utilities.

Asset Performance Management

Traditional asset management approaches are too limiting for today’s performance-based, data-driven utility environment. Asset performance management solutions need to deliver real-time equipment performance, reliability, maintenance, and decision support for effective resource management, so that operations and maintenance teams are empowered with real-time decision support information: the right information to the right people at the right time and in the right context. The result is improved operational performance and better asset availability and utilization.

Business Intelligence and Advanced Analytics

Analytics will become essential as the volume of data generated by intelligent devices and sensors, mobile devices (the Internet of Things) and social media increases and huge pools of structured and unstructured data need to be analyzed to extract actionable information. Analytics will become embedded everywhere, often invisibly.

May 01, 2014

Since Google Earth's release in June 2005, national mapping agencies (NMCAs) and other government organizations involved with geospatial data and technology have been reassessing their role. At the Geospatial World Forum (GWF) in Amsterdam two years ago, Paul Cheung of the United Nations Initiative on Global Geospatial Information Management (GGIM) voiced a concern that many national mapping agencies and other government organizations with responsibility for geospatial information have had since the advent of Google Earth, private data companies like DigitalGlobe, crowdsourced geospatial data like OpenStreetMap, and the open data policies adopted by many governments around the world.

The challenge for NMCAs is that in parts of the world, Google, Bing, OpenStreetMap, or a local crowdsourced alternative like Malsingmaps provides more comprehensive, accurate, current, and accessible data with less restrictive licensing and at a lower cost than the NMCA. A report from the GGIM argues that "governments have a key role to play in bringing all actors together to ensure that our future society is a sustainable, location-enabled one, underpinned by the sustainable provision and effective management of reliable and trusted geospatial information." In other words, governments and NMCAs have a unique, critical role in developing and maintaining authoritative national databases. To improve the quality and timeliness of these authoritative databases, many have argued that NMCAs need to find ways to incorporate volunteered or crowdsourced geospatial data.

In the United States the National Map is a significant contribution to the National Spatial Data Infrastructure (NSDI). It is a collaborative effort among the US Geological Survey (USGS) and other Federal, State, and local partners to deliver topographic information for purposes that range from recreation to scientific analysis to emergency response. The geographic information available from The National Map includes orthoimagery, elevation, geographic names, hydrography, boundaries, transportation, structures, and land cover.

In addition to being an important contribution to the NSDI, the National Map is a foundation for the Department of the Interior (DOI) Geospatial Modernization Blueprint and its mission to protect America's treasures for future generations, provide access to America's natural and cultural heritage, offer recreation opportunities, honor trust responsibilities, conduct scientific research, provide wise stewardship of energy and mineral resources, foster sound use of land and water resources, and conserve and protect fish and wildlife.

The USGS's National Map Corps (TNMCorps) was initiated a year ago as a nationwide program with the goal of using volunteered or crowdsourced data for The National Map. TNMCorps' volunteered geographic information (VGI) project engages citizen scientists to collect man-made structures data, including schools, hospitals, post offices, police stations, and other important public buildings. In the past year, volunteers in all 50 U.S. states have begun providing accurate geospatial data to The National Map.

October 02, 2013

One of the key enabling technologies introduced in the early days of AM/FM/GIS systems was long transactions, also known as data versioning. Long transactions make it possible to store spatial design data in the same database as as-built data, enabling design engineers to work on new designs at the same time as operations staff are relying on as-built information to operate and maintain active utility networks.

The first commercial systems implementing this technology in the early 1990s were IBM's GFIS, which ran on DB2; Intergraph FRAMME; and VISION*, which managed versioned spatial data in an Oracle RDBMS (long before Oracle Spatial and Workspace Manager). As I remember, San Diego Gas & Electric and BC Hydro were GFIS sites. The largest FRAMME site was Ameritech, now part of AT&T. The largest VISION* sites, which are still running and supporting about a thousand concurrent designers, are US West, now CenturyLink, and Telesp, now Telefônica São Paulo.

Smallworld implemented long transactions in 1996. Since then both Oracle (Workspace Manager) and ArcSDE (now ArcGIS) have implemented support for long transactions.

Each of these early products implemented long transactions in quite different ways. For example, in some the number of versions at any given time is assumed to be small, but the number of objects in each version is large. In others, the number of versions can be large, and the number of objects per version is small.

Recently there has been renewed interest in spatial long transactions, with a focus on managing them in a distributed data environment. It has been proposed that the underlying version management platform be based on the same principles as Git, which has become the de facto standard in the open source community. Git was originally developed by Linus Torvalds for Linux source code management; Google, Facebook, Microsoft, Twitter, LinkedIn, Qt, GNOME, and Android are some of the organizations and projects using it, and millions of open source projects are hosted on GitHub. The GeoGit project is an attempt to apply a Git-inspired version management system to spatial data in a distributed environment. Currently, users can import raw geospatial data (Shapefiles, PostGIS, or SpatiaLite) into a repository where any changes to the data are tracked. These changes can be viewed in a history, reverted to older versions, branched into sandboxed areas, merged back in, and pushed to remote repositories. GeoGit, still in early development (alpha), is written in Java and is available under the BSD License.
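The Git model described above, content-addressed snapshots linked into a commit history, with branches as movable pointers, is worth seeing in miniature. The toy repository below applies that model to a feature set; it mimics the idea behind GeoGit but is in no way its API, and the feature keys and values are made up.

```python
import hashlib
import json

class FeatureRepo:
    """A toy, in-memory sketch of Git-style versioning for features:
    content-addressed snapshots, parent links, and named branches."""

    def __init__(self):
        self.objects = {}                 # object store: hash -> snapshot
        self.branches = {"master": None}  # branch name -> tip commit hash
        self.head = "master"

    def _store(self, snapshot, parent):
        # Hash the snapshot plus its parent, as Git hashes commit content.
        payload = json.dumps({"features": snapshot, "parent": parent},
                             sort_keys=True)
        oid = hashlib.sha1(payload.encode()).hexdigest()
        self.objects[oid] = {"features": snapshot, "parent": parent}
        return oid

    def commit(self, features):
        """Record a full snapshot of the feature set on the current branch."""
        oid = self._store(dict(features), self.branches[self.head])
        self.branches[self.head] = oid
        return oid

    def branch(self, name):
        """Open a sandboxed line of edits from the current branch tip."""
        self.branches[name] = self.branches[self.head]
        self.head = name

    def checkout(self, name):
        self.head = name

    def features(self):
        oid = self.branches[self.head]
        return {} if oid is None else self.objects[oid]["features"]

repo = FeatureRepo()
repo.commit({"road/1": "Main St"})
repo.branch("edits")                  # sandbox for crowdsourced changes
repo.commit({"road/1": "Main St", "road/2": "Oak Ave"})
repo.checkout("master")               # authoritative data is untouched
```

Real systems store feature trees rather than whole snapshots and add merge logic, but the branch-and-sandbox behaviour, the part that matters for the crowdsourced/authoritative workflow, falls out of this structure.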

One of the use cases GeoGit is intended to support is a hybrid data management process in which an authoritative database can be updated with crowdsourced data. It has been proposed that with distributed versioning, it is possible to have the best of the crowdsourced (e.g., OpenStreetMap) and authoritative (e.g., TIGER) worlds. "Data stewards could maintain their own data repositories for their particular areas of interest. They could also apply the same quality assurance checks as the central authority, which would enable their data to be easily pulled in to the central database. However, the central authority need not be involved in order for partnering stewards to exchange updates with one another. These inter-steward exchanges would also facilitate quicker updating to the central authority, because there would be fewer conflicts during the merging of the updates the various stewards submit."

September 18, 2013

At this year’s AGI GeoCommunity '13 conference in Nottingham, Iain Langlands, GIS Manager at Glasgow City Council, gave an overview of the £24 million Future City / Glasgow program, which will demonstrate how technology can make life in the city smarter, safer, and more sustainable.

Glasgow competed with 29 other cities to win the Technology Strategy Board's ‘Future Cities Demonstrator'. "The city will demonstrate how providing new integrated services across health, transport, energy and public safety can improve the local economy and increase the quality of life of Glasgow's citizens, and will allow UK businesses to test new solutions that can be exported around the globe." The Technology Strategy Board is the innovation agency of the UK national government.

The city has started a program that will build a new operations centre, establish a city technology platform, and develop innovative demonstrator projects across four themes: public safety, transport, health, and energy.

For example, on the theme of travel/transport, there are initiatives aimed at reducing congestion through route optimization and other technologies. On the energy theme, there is an initiative focussed on intelligent buildings: smart buildings that can adjust their own lighting and heating based on analytics of a variety of sensor data. In the area of health, there is a program to improve social mobility; 70% of Glasgow's neighbourhoods are below the national norm for social deprivation, and this program is designed to help address that issue. In the area of public safety, there are initiatives involving closed-circuit TV for monitoring streets and public areas, and intelligent street lighting.

One of the founding principles of the Glasgow smart city initiative is that data collected by the city will be open. This will enable intelligent operations and real-time analysis, not only by the city but also by developers and entrepreneurs, and is intended to promote research and the development of new businesses. The data will be available through a public portal.

Citizens are viewed as mobile sensors, so a lot of the data is crowdsourced. OpenStreetMap is the source for much of the street data. An example of an app that will be important in empowering citizens is the MyGlasgow app, which allows citizens to report problems such as potholes. A unique aspect of Glasgow's app is that it asks the citizen reporting the pothole to assess the risk the pothole represents. Rather surprisingly to some, based on the feedback received so far, the man in the street's assessment of risk is not very different from a professional road inspector's.
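The claim that citizen and inspector risk ratings broadly agree is easy to quantify once both rate the same potholes on the same scale. The paired ratings below are entirely made up for illustration; only the comparison method is the point.

```python
# Hypothetical paired risk ratings (1 = low, 5 = severe) for the same
# potholes: one from the reporting citizen, one from a road inspector.
citizen   = [3, 4, 2, 5, 1, 4, 3, 2]
inspector = [3, 4, 3, 5, 1, 4, 2, 2]

def agreement_within(citizen, inspector, tolerance=1):
    """Fraction of reports where the citizen's rating falls within
    `tolerance` points of the professional's rating."""
    close = sum(abs(c - i) <= tolerance for c, i in zip(citizen, inspector))
    return close / len(citizen)

print(agreement_within(citizen, inspector))               # within one point
print(agreement_within(citizen, inspector, tolerance=0))  # exact matches
```

A city could use exactly this kind of statistic to decide how much triage weight to give citizen-assessed risk before an inspector visits.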

Empowering citizens, putting all residents at the forefront of technology integration and application, is key to the Future City Demonstrator, as is collaboration between stakeholders. A small team of specialists is working on projects designed to show how technology can improve life in the city, helping to inform and connect its citizens and to move towards a more self-reliant and sustainable society. Geospatial technology will be central to the success of the program. The geospatial platform for the city's technology appears to be primarily open source and includes Mapbox, TileMill, and Quantum GIS.