USGIF GotGeoint Blog

USGIF promotes geospatial intelligence tradecraft and a stronger community of interest between government, industry, academia, professional organizations and individuals focused on the development and application of geospatial intelligence to address national security objectives.

October 02, 2018

Waze, now part of Google, is a crowd-sourced navigation app used by 100 million drivers worldwide. It is supported by 500,000 volunteer map editors and 600 partners. Its connected partners program includes 40% of top cities and 60% of state departments of transportation (DOTs). Notably, Waze has developed a way for governments to report planned and actual road construction, which drivers can use to plan routes ahead of time. This offers a single database that could help avoid the problem of the same stretch of road being dug up multiple times by different utilities, sometimes within a few weeks of each other.

Several years ago I blogged about Koordination im öffentlichen Raum (coordination in public space: the spatial, temporal, and financial coordination of city planning, traffic changes, and construction projects on public land), an online web-based system of the City of Bern, Switzerland. It is a system for coordinating street works that achieves some of the same goals as the 2004 Traffic Management Act in the U.K. It is location-aware and enables planners to avoid serial excavations of the same stretch of street or road by coordinating construction across multiple agencies, including federal, cantonal, and city street and highway agencies; water/wastewater, gas, and electricity utilities; and telecommunications companies, so that a section of street or road is not dug up more than once every five years. In its first year of operation the coordination system was estimated to have saved SFR 7 million.

At GIS in the Rockies, Saila Hanninen of Waze described some of the remarkable ways that Waze's data and network are being used for purposes beyond those initially intended.

During and after Hurricane Sandy in October 2012, the White House called Waze for help in determining which gas stations were open in the areas hit by the storm.

For the Rio de Janeiro Olympics, Waze helped determine where to place three rapid transit lanes, which resulted in a 27% reduction in congestion.

Waze helped Ghent redesign its city centre to reduce traffic by 40%, which also resulted in fewer accidents and increased bicycle and transit usage.

Boston asked Waze to help with signal timing; changing the timing alone resulted in an 18% reduction in congestion.

During the Pope's visit to Philadelphia, Waze was responsible for a 20% reduction in congestion.

40% of accidents are reported faster on Waze, which results in an average of five minutes' faster response time.

Waze intends to get more proactive. It now has a crisis response team that can push messages to Waze users in areas affected by a disaster to collect information about road closures, the state of the roads, which gas stations are open, shelter availability, and safe routes to shelter.

Waze's vision is to eliminate traffic altogether. Doing that requires not only information about the current state of congestion on roads, but also anything that could affect traffic in the future. Knowing when road works and major excavations near roads are planned is essential. If the Waze partner program included everyone planning excavations on or near roads, it would amount to a database providing all the information required to coordinate excavations by utilities, telecoms, and governments. As in Bern, this could mean that roads would not be dug up more often than once in five years, which would reduce traffic congestion and save money.

June 12, 2017

At GeoBusiness 2017 in London, Hugh Boyes, a Cyber Security specialist outlined some of the security risks of digital engineering and the importance of cybersecurity for engineering projects. BIM models, floorplans and 3D LiDAR scans of sensitive buildings and infrastructure can often be found on the web. He mentioned the floorplan of a major police station, a BIM model of a young offenders facility, and a BIM model of the Victoria Underground station. All of these could potentially be very useful to anyone interested in disrupting infrastructure. It is well-known that mapping underground infrastructure reduces risk during construction and also during disasters, natural and man-made. However, it can also provide information about sensitive infrastructure. Hugh mentioned substations as particularly sensitive elements of the energy grid. With traditional centralized generation, substations are critical elements of the grid and are particularly vulnerable. In many countries disrupting a small number of large substations can leave significant parts of the country without power. (An advantage of a decentralized grid comprised of many microgrids with their own generating and storage facilities is that it is much less vulnerable.)

For people who are designing and building sensitive infrastructure using BIM, he recommended following the PAS 1192-5 guidelines and incorporating security into the design from the start. PAS 1192-5 specifies processes which will assist organizations in identifying and implementing measures to reduce the risk of loss or disclosure of information which could impact the safety and security of personnel and users of infrastructure or the built infrastructure itself. PAS 1192-5 is applicable to any built asset or portfolio of assets which is deemed sensitive. It is for use by asset owners and organizations involved in the design, construction, maintenance and management of built assets, especially those who wish to protect their commercial information and/or intellectual property. Hugh is the joint technical author of PAS 1192-5, Specification for security-minded building information modelling, digital built environments and smart asset management.

January 31, 2014

Utilities have been using geospatial data and technology for many years for mapping and asset management. But GIS is now being used in many other areas, from network design to marketing to outage management to energy conservation. With the rise of the smart grid, utilities are combining GIS and AMI to monitor transformer loading, identify and reduce theft, and manage disasters. Many people in the industry expect that under the impetus of the smart grid the expanding role of geospatial technology will accelerate and that geospatial will become a foundational technology for the electric power industry. At Distributech, Danny Petrecca, Director of Geospatial Product Management at Schneider Electric, made the case that utility GIS now touches every aspect of a utility's business, affecting customers, operations, and management.

Disaster management

As an example Danny gave an overview of how GIS, weather data, and the integration of weather data and GIS are helping utilities prepare for and deal with extreme weather events. The frequency and severity of extreme weather events is increasing so disaster management has become a priority with most utilities. With GIS and weather services utilities are able to plan crew resources and network assets, reduce restoration time and costs, ensure crew and public safety, optimize damage assessment, and keep the public informed during storms.

Hurricane Sandy severely impacted New Jersey Natural Gas (NJNG) infrastructure: 30,000 homes and over 275 miles of main were damaged. NJNG found that the biggest reason they were able to restore service so quickly was their ability to analyze and manage the restoration process using their utility GIS. Having used GIS for a decade prior to Sandy meant that they were better prepared for such an extreme weather event than they had expected. The GIS enabled engineers to view the network in the context of the damage caused by the storm, and it enabled them to identify critical valves and shut off gas service in a matter of hours. It helped them restore over 255 miles of pipeline in under two months. It helped them resectionalize their network into smaller sections, which they were able to repressurize in one day. A very important benefit of the GIS was that NJNG was able to provide timely and accurate information to the public on the status of the network, including estimated time of service restoration.

Weather services

Weather services can provide historical, current, and forecast information about lightning strikes, wind, temperature and humidity, ice and snow, and storms. They can predict precisely when and where a storm will hit a utility's service territory. This enables a utility to schedule on-call, holdover, and dispatch crews and to plan the type of equipment that will be required.

Before a storm, weather services can provide high-quality forecasts, expert meteorological consultation, and real-time alerting. When a storm hits, they provide real-time lightning strike information (where and when), nowcasting tools, and current situational awareness. After the storm, they can help with post-storm inspection and assessment. Weather data is available in digital form, and some providers offer it as web services compliant with the Open Geospatial Consortium's (OGC) standards such as WMS.
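As a rough illustration, a WMS GetMap request is just a parameterized URL. The endpoint and layer name below are invented; a real weather-data provider would publish its own.

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint -- a real weather-service provider would
# publish its own URL and a catalog of layer names.
WMS_BASE = "https://weather.example.com/wms"

def wms_getmap_url(layer, bbox, width=800, height=600):
    """Build an OGC WMS 1.3.0 GetMap request URL for a weather layer
    (e.g. radar reflectivity) over a bounding box in EPSG:4326."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # WMS 1.3.0 with EPSG:4326 uses lat/lon axis order
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
    }
    return WMS_BASE + "?" + urlencode(params)

url = wms_getmap_url("radar_reflectivity", (39.0, -106.0, 41.0, -104.0))
```

Fetching that URL with any HTTP client would return a map tile that can be overlaid on the utility's network map.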

Integrated utility GIS and weather services

Integrating weather services with utility GIS allows you to do even more interesting things, like quantitatively monitoring storm impact on infrastructure such as transformers in real time, using geofences to set up alerts for lightning strikes, and correlating storm and outage events to predict where issues can be expected as a storm moves across the utility's service territory.
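A geofenced lightning alert can be as simple as a distance test between a strike location and each monitored asset. The transformer locations and the 5 km radius below are made up for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical asset locations (id -> (lat, lon)) and geofence radius.
transformers = {"T-101": (40.01, -105.27), "T-102": (40.50, -105.00)}
GEOFENCE_KM = 5.0

def strike_alerts(strike_lat, strike_lon):
    """Return ids of assets whose geofence contains the lightning strike."""
    return [tid for tid, (lat, lon) in transformers.items()
            if haversine_km(strike_lat, strike_lon, lat, lon) <= GEOFENCE_KM]

alerts = strike_alerts(40.02, -105.26)  # a strike close to T-101
```

In practice the strike feed would come from the weather service and the asset locations from the utility GIS; the geofence test is the glue between them.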

December 27, 2013

One of the industries where geospatial data and technology is having a major impact is insurance.

I blogged previously about a groundbreaking risk assessment study of sea level rise in North Carolina. The study modeled the financial impact on coastal communities of the rise in sea level resulting from global warming. It makes it possible to estimate, at the community level, the financial losses associated with both long-term sea level rise and episodic events such as storm surges.

More recently AECOM has been sponsored by the Federal Emergency Management Agency (FEMA) to analyze the potential long-term implications of climate change on the National Flood Insurance Program (NFIP) in the U.S. The motivation for this study was a bill passed by Congress, the Flood Insurance Reform Act of 2012, which mandates a number of changes to the NFIP. Key provisions of the legislation require the NFIP to raise rates to reflect true flood risk and to make the program more financially stable.

If a landowner's property is included in a Special Flood Hazard Area (SFHA) as defined by NFIP maps, flood insurance is mandatory for the landowner. The SFHA is the area that would be flooded by a "100-year flood", in other words the flood having a one percent chance of being equaled or exceeded in any given year. NFIP maps are available online and as paper and digital documents.
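It is worth spelling out the arithmetic behind the "100-year flood" label, since it is often misread as a once-per-century event:

```python
# A "100-year flood" has a 1% chance of being equaled or exceeded in
# any given year. Over a 30-year mortgage, the chance of at least one
# such flood is 1 - (0.99)**30, roughly one in four.
annual_p = 0.01
years = 30
p_at_least_once = 1 - (1 - annual_p) ** years
print(round(p_at_least_once, 3))  # 0.26
```

So a property at the edge of the SFHA faces about a 26% chance of flooding at least once over a typical mortgage term, which is why inclusion in the SFHA triggers the insurance mandate.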

Projected impact of climate change

AECOM found that near rivers the SFHA is projected on average to increase about 45% nationally by the year 2100, with very wide regional variability.

In coastal environments AECOM projected changes under two alternative assumptions. In the fixed shoreline case, the shoreline is assumed not to change as sea level rises. Under the alternative, receding shoreline assumption, the shoreline recedes as sea level rises, causing land to be lost to the sea. AECOM found that:

assuming a fixed shoreline, the typical increase in coastal SFHA is projected to be about 55% by the year 2100, also with very wide regional variability;

with the receding shoreline assumption, negligible change in coastal SFHA is projected, because the new coastal SFHA resulting from rising sea levels will be offset by the land area lost to the sea.

AECOM estimated that the national average increase in SFHA near rivers and coastal areas by the year 2100 is 40% if shoreline recession is assumed, and 45% if a fixed shoreline is assumed.

Impact with the receding coastline assumption

The estimated economic impact by 2100 is that the total number of landowners required to have NFIP insurance policies will increase by approximately 80%. This breaks down into a 100% increase in landowners near rivers required to have NFIP policies and approximately a 60% increase for landowners in coastal environments.

AECOM also estimated that the average loss cost per policy will increase approximately 50% by the year 2100.

Impact with the fixed shoreline assumption

The estimated economic impact by 2100 is that the total number of landowners required to have NFIP insurance policies is projected to increase by approximately 100%. This breaks down into an 80% increase in landowners near rivers required to have NFIP policies and approximately a 130% increase for landowners in coastal environments.

The estimated average loss cost per policy will increase by approximately 90% by the year 2100.

A probabilistic approach was used in which a range of climate changes was considered. The key climate factors include both the frequency and the intensity of storms that influence flooding. Near rivers, intensity is determined by the amount of rainfall during storms. In coastal areas, storm intensity is determined by wind velocity and pressures that produce storm surges.

The engineering analyses were based on IPCC AR4's three greenhouse gas emissions scenarios. These scenarios represent a balanced range between low and high climate change impact.

The A1B scenario assumes balanced energy use between fossil fuel and non-fossil fuel energy sources, high economic growth and low population growth.

The A2 scenario assumes a smaller amount of economic growth, but more rapid population change.

The B1 scenario assumes low population growth and a more environmentally sustainable approach to economic growth.

River flooding is driven by changes in precipitation frequency and quantity as storms become more or less frequent and more or less intense. River floods also depend on the rate of runoff from a watershed; urbanization is an important factor because it increases the proportion of impervious surface area.

In coastal regions the important factors are gradual sea level rise (SLR) and the effects of storms. Rising sea level increases the likelihood of greater long-term shoreline erosion and recession. This is important because the projected effect is to move the coastal SFHA substantially inland by 2100. Averaged over the three emissions scenarios considered, the global rise in sea level by 2100 is projected to be approximately 1.2 meters.

The engineering analysis was followed by a demographic analysis to determine the projected population, number of insurance policies, and related factors within flood hazard areas through 2100.

The Special Achievement in GIS award went to the Dartmouth Atlas for showing geographically how the outcomes and cost of medical procedures vary with location across the U.S.

The Hong Kong Lands Department received the Enterprise GIS award for their land management system, which they started in the 1990s and is now in its second generation.

The President's Award went to Direct Relief, which distributes different types of equipment to people in need around the globe with 99% efficiency, according to Forbes.

Themes

The theme of this UC was GIS transforming our world: not only changing the physical world as we know it, but also changing our perception of the world, and how GIS has helped and will fundamentally help us collectively build a better future. A classic example that has impacted everyone on the planet is GPS, or more generally GNSS. As a number of speakers including Dangermond said, we are never lost any more, and that is a fundamental change in how we perceive the world.

Some of the major themes that emerged at this UC are common to many software and service providers in the IT sector, but some are new or unique to ESRI.

Climate change

We are facing serious global challenges, in particular climate change, which both at the Geodesign Summit and at this UC appears to be a personal challenge for Dangermond, who sees GIS as contributing in a major way to creating a more sustainable future. He sees GIS helping to change how we think and act, and how we do basic things such as design.

GIS across the organization

A theme that ran throughout the conference, including the utility sessions, was the need to make GIS pervasive across the organization, because 90% of the data that organizations collect and manage has location. In the utility sessions I attended, the thrust is to expand the use of GIS beyond the traditional area of records management into other departments.

Vertical industry "templates"

In the utility and telecommunications sectors there is a major effort underway to create preconfigured "templates" for specific vertical industries. In the utility sector one was announced yesterday for the electric power industry, and others are underway for gas and telecommunications. These templates are collections of data and tools for vertical industries and are available as open source on GitHub.

Web

The web was pervasive. It was hard to find a session where the theme of desktop GIS being transformed into web GIS did not come up. That doesn't mean the desktop is going away in the near future, but sharing data, applications, and apps is recognized as a common goal, and web GIS makes this very easy to do. This includes web portals for accessing enterprise data over the web.

Cloud

A lot of traditional desktop GIS capabilities are now being made available in the cloud, so that all you need is an iOS or Android device or a web browser. This is also available to large enterprises that need to keep their data on their own servers for security reasons: Cindy Salas from Centerpoint Energy demonstrated ESRI's cloud capability running "on premise", completely within Centerpoint's firewall.

Integration

Being able to integrate location and the spatial dimension into enterprise applications such as ERP (SAP), business intelligence (IBM Cognos), Salesforce, MS Dynamics, and Excel spreadsheets is a new focus for ESRI. It has already created plugins for several well-known applications.

Mobile devices

In the past when you talked about mobile in the context of utilities, you were normally dealing with Panasonic Toughbooks, which could cost up to $6,000 each. At this conference just about every application was able to run on a variety of low-cost mobile devices, including iOS and Android. One of the sessions I attended was about the National Broadband rollout in Australia, where the 500 field staff doing 300,000 pit inspections were all equipped with iPads, which you might have thought too fragile for pretty rough field work. The project has been underway for something like 18 months, and to date only two of the iPads have been damaged.

Big data

Spatial data volumes, whether from sensors at utilities, LiDAR scanning of transmission lines, or earth observation satellites, are now reckoned in terabytes. There were sessions on spatially enabling Hadoop, SAP HANA, and other big data management systems for handling these volumes.

Real-time

In the utility track, managing real-time data from sensors on transformers, smart meters, phasors, and other devices and enabling near real-time decision-making was a common theme. A general event processing capability that can be triggered by rules has been added to the quiver to make this possible for any data stream.

In the area of imagery, Digital Globe imagery can be accessed within 4-5 hours of acquisition.

Analytics

Big data and real-time mean that we need to understand what these huge volumes of data are telling us and then enable decision-making, automated or human-mediated, as rapidly as possible. If a transformer is running hot, we can't wait a week until the next service check to reconfigure the network, because by then we may have reduced its projected lifetime by 10%. There were "location analytics" sessions in most of the vertical industry tracks, including utilities.

Content

Software companies like ESRI and Hexagon are now in the content business. And with the data being made available in the cloud, you can access it through web GIS or desktop GIS. The data includes traditional topographic maps, digital terrain models, imagery from earth observation satellites from Digital Globe and others, some of it near real-time (for example, from Digital Globe within 4-5 hours of acquisition), demographics, and more.

Dangermond's term for this is the "living atlas", because this body of content is intended to be dynamic, real-time, and comprehensive. The goal seems to be all the data you would need to manage the planet.

Some of the content explicitly mentioned includes

basemaps

community content

imagery

30 cm coverage of the US

60 cm coverage of Western Europe

imagery for the Middle East and Asia, in total covering 2/3 of the world

This data is accessible to web and desktop applications and makes it very easy to do standard spatial analytical things like multi-criteria suitability or site selection mapping. Users can also publish their own data to the cloud.
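The multi-criteria suitability mapping mentioned above boils down to a weighted overlay of criterion grids. A toy sketch follows; the criteria, scores, and weights are invented:

```python
# Each criterion is a grid of scores in [0, 1] (e.g. slope, proximity
# to roads), combined with weights that sum to 1. Cells with the
# highest combined score are the most suitable sites.
slope = [[0.9, 0.4], [0.7, 0.2]]   # flatter terrain scores higher
roads = [[0.5, 0.8], [0.3, 0.9]]   # closer to roads scores higher
weights = {"slope": 0.6, "roads": 0.4}

def suitability(r, c):
    """Weighted sum of criterion scores for one grid cell."""
    return weights["slope"] * slope[r][c] + weights["roads"] * roads[r][c]

grid = [[suitability(r, c) for c in range(2)] for r in range(2)]
best = max((grid[r][c], (r, c)) for r in range(2) for c in range(2))
```

Real suitability analyses use raster layers with millions of cells, but the overlay arithmetic is exactly this per-cell weighted sum.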

Education

This has always been a strong focus area for ESRI, and from Dangermond's perspective, with the challenges the world is facing, GIS professionals will be even more critical in the future. There is a focus on advancing spatial literacy to a broader audience than in the past.

LiDAR

Laser scanning is a major source of important data for the construction industry (right of way determination and construction progress monitoring), utilities (transmission vegetation management), and other industries. ESRI has made this a focus area and there were a number of sessions devoted to managing and analyzing point cloud data.

3D

Dangermond specifically singled out 3D as a major capability area for ESRI. There are specialized 3D products, but 3D is also being built into ESRI's mainstream GIS and spatial analytical products.

3D city models

At the plenary there was a demonstration of how to quickly build a 3D city model from 2D data, including extruding buildings from a 2D building footprint and a height attribute, adding textures depending on building type, and then tools for analysis, such as zoning and shadow analysis, and visualization, including flythroughs. These models are sharable in the cloud as web scenes requiring only a web browser (with no plugins) or an iOS mobile device.
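Extruding a building from a 2D footprint and a height attribute is conceptually simple; a minimal sketch (coordinates invented):

```python
# Each 2D footprint vertex becomes a base vertex at z = 0 and a roof
# vertex at z = height; walls connect corresponding vertex pairs.
def extrude(footprint, height):
    """footprint: list of (x, y) vertices; returns (base, roof) rings."""
    base = [(x, y, 0.0) for x, y in footprint]
    roof = [(x, y, float(height)) for x, y in footprint]
    return base, roof

footprint = [(0, 0), (10, 0), (10, 6), (0, 6)]  # a 10 m x 6 m footprint
base, roof = extrude(footprint, 12.0)           # a 12 m tall block model
```

Production tools add wall faces, textures, and roof forms on top of this, but block models like these are enough for shadow and zoning analysis.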

Geodesign

At the UC, ESRI demonstrated a prototype of a product designed specifically for geodesign.

Open data

Dangermond specifically mentioned the newest national open data initiative in Peru.

Crowdsourcing/volunteered data

There are sessions specifically on editing and manipulating OpenStreetMap data.

March 25, 2013

At the Geospatial World Forum (GWF) in Amsterdam last year, Paul Cheung of the United Nations Initiative on Global Geospatial Information Management (GGIM) voiced the concern about their role that many national mapping agencies and other government organizations responsible for geospatial information have had since the advent of Google Maps/Earth; private data companies like Digital Globe, GeoEye, TeleAtlas, and Navteq; crowd-sourced geospatial data like OpenStreetMap; and the open data policies adopted by many governments around the world.

The objective of the UN GGIM initiative is to help government fit into the new world of Google Maps/Earth and the rest of the private geospatial industry. Based on the assumption that neither the private sector nor government can do it all, the GGIM has a process in place to define the role of government in managing national geospatial information. It has asked geospatial practitioners around the world to provide guidance on where they see the geospatial industry headed and their views on the roles of government and the private sector.

The Forum brought together 350 participants from 60 Member States, international organizations, the private sector, and UN entities. The discussions centered on building national geospatial information systems; the technological trends impacting the future of geospatial information management; the challenges experienced and solutions used to develop geospatial reference data sets; the need to use geospatial information to address sustainable development issues; and UN-GGIM's leadership role in ensuring global geodetic frameworks. Among the outcomes called for were:

a sustained operational global geodetic reference frame and infrastructure to support the increasing demand for positioning and monitoring applications

the greater use of geospatial information in sustainable development by supporting the Global Map for Sustainable Development (GM4SD), with an initial focus on managing risks of natural disasters to urban populations

an agreed set of authoritative core global reference datasets to support global sustainable development activities

a stable, credible, and reliable national geospatial information infrastructure in each country, built on internationally recognized standards

more training programs related to geospatial information management at all levels

regional collaboration in the promotion and development of geospatial information management

In the past the focus for the Department of Energy and for utilities has tended to be on individual technologies such as smart meters and AMI, self-healing networks, synchrophasors, substation automation, and so on. Ms Hoffman's message is that we need to start focusing on the smart grid as a whole, and in particular on grid resiliency. Things will happen: floods, tornadoes, hurricanes, derechos, and other natural and man-made problems. We need to ensure that the grid is resilient so that we can recover from disasters quickly. The example she gave was one that I have blogged about: Southern Co.'s use of smart meters to track outages caused by tornadoes in April 2011 and then help with restoration.

Ms Hoffman then discussed specific areas where there are opportunities to make the grid more resilient.

Transmission operations

There is a lot of data becoming available, primarily because about 1,000 synchrophasors have been deployed across the national transmission grids. What is needed is improved interoperability, analytics, and visualization to better manage the transmission grid. For example, it took almost a year to analyze the data and determine the cause of the 2003 Northeast blackout. The 2011 San Diego blackout required about 3-4 months, so fault analysis is getting better, but we need to be able to use the available data to predict the state of the transmission grid in real time.

Distribution operations

In the area of electricity distribution, Ms Hoffman singled out peak load reduction as the area with the greatest economic and environmental benefits. Reducing peak load means that the construction of new power plants (peakers) that may be used only a hundred hours a year can be avoided. The other areas she identified include distribution automation, especially automated self-healing networks using devices such as IntelliRupters; improved outage management; using predictive analytics to replace equipment before it fails; and microgrids.

Customer empowerment

Areas that directly impact the customer include residential diagnostics to help the customer understand how power is being used in the home and how it can be used more effectively, at lower cost, and shifted to off-peak as much as possible. Photovoltaic cells and plug-in electric vehicles are becoming much more prevalent, and utilities need to be prepared for them. Improving reliability and resiliency directly impacts customers by reducing the number and duration of outages. An area that is getting a lot of attention is rate structures, for example, pricing electric power to motivate shifting load from peak to off-peak. A new technology being investigated by the DoE is transactive control signals, a smart grid signaling approach that communicates prices and other information, such as expected future prices, to encourage customers to change their electric power usage.

Energy infrastructure cyber protection

Ms Hoffman's message on cyber protection is that it needs to be dynamic and multi-dimensional. Infrastructure is complex and changing, and new vulnerabilities are detected every day. In addition, multiple organizations are responsible for the grid. Effective cyber security management has to deal with a dynamic landscape of control and information systems. There is a broad range of threats and motivations, including unintentional acts, natural phenomena, and malicious intent such as theft, revenge, and threats to national security. It is also important to recognize the interdependencies among critical infrastructure. A gas explosion can affect the telecommunications and electric power networks. Loss of electric power can affect many other networks, including water and telecommunications. Disruption of communications, for example, severing a fiber optic cable, can directly affect the electric power grid.

To address these challenges, cybersecurity solutions need to be multi-dimensional, including cyber-physical measures, wide-area monitoring and protection, cryptography, anti-tamper devices, and advanced architectures.

One of the most serious challenges that Ms Hoffman singled out specifically is developing the cybersecurity workforce that is going to be responsible for implementing these measures.

Where are we headed?

The next major challenge is turning the smart grid into a driver of economic development. The example Ms. Hoffman used was Chattanooga. The Electric Power Board (EPB) saw the smart grid as not only benefiting the utility and improving the quality of life of its customers, but also as a driver of economic development. The city's decision to deploy one of the country's highest capacity fiber-optic networks not only improved power quality, reliability, customer service, and energy efficiency but also made Chattanooga more attractive to business. For instance, because bandwidth is virtually unlimited, EPB is able to offer its customers Internet upload and download speeds of up to one gigabit/sec, making Chattanooga very attractive for businesses that rely on the internet for critical aspects of their operations.

November 22, 2012

In September 2011, I blogged about one of the most amazing applications of smart meters and AMI that I had come across.

Alabama Power had deployed about 1.4 million smart meters in its operating territory. One of the initiatives undertaken to take advantage of the smart meters was to integrate their automated meter infrastructure (AMI) system with their outage management system (OMS). The smart meters give them immediate information about an outage, helping identify, without sending a crew out, whether it was a network or customer problem. The integrated AMI/OMS system helps reduce unnecessary truck rolls.

Disaster management

But the smart meters and AMI also helped in another, perhaps (from a business planning perspective) unforeseen way. In April 2011, thirty tornadoes hit Alabama Power's operating area, completely destroying two substations, flattening transmission pylons, breaking 7,500 poles, and leaving 400,000 customers without power.

By looking for smart meters that could be read the day before and comparing them with the meters that could not be read after the tornados, Alabama Power was able to put together a detailed picture, using Google Maps, of where power had been lost - all without making telephone calls. The application could also provide emergency response officials with information about whether the power was on or off in specific buildings - critical information that first responders require before entering a damaged building. In addition, Alabama Power was able to track power restoration trends as customers started coming back on-line.
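The core of this before/after comparison is a simple set difference. Here is a minimal sketch, with meter IDs, locations, and field names entirely invented for illustration (Alabama Power's actual system is of course far more sophisticated):

```python
# Hypothetical sketch of the before/after meter-read comparison used to map
# outages; meter IDs and coordinates below are invented for illustration.

# Meters that responded to a read request the day before the storm,
# keyed by meter ID with a (lat, lon) location from the utility's GIS.
reads_before = {
    "M1001": (33.52, -86.80),
    "M1002": (33.53, -86.81),
    "M1003": (33.51, -86.79),
}

# Meters that still respond after the storm.
reads_after = {"M1002"}

# Any meter readable before but silent after is presumed without power.
outage_points = {
    meter_id: location
    for meter_id, location in reads_before.items()
    if meter_id not in reads_after
}

for meter_id, (lat, lon) in sorted(outage_points.items()):
    print(f"{meter_id}: power presumed lost at ({lat}, {lon})")
```

The resulting `outage_points` dictionary is exactly the kind of geocoded point set that can be dropped onto a Google Maps overlay, which is what makes the no-phone-calls outage picture possible.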

Integration of AMI with geospatial and other enterprise systems

There is an interesting recent interview with Arshad Mansoor of EPRI, who makes the case that to gain the full advantage of smart grid-related systems such as AMI, geographic information systems (GIS), outage management systems (OMS), data analytics and workforce management systems, they must all be well integrated. The Alabama Power case is a very good example of the benefits of integrating AMI and geospatial technology specifically. Mansoor sees integrating AMI with a geographic information system (GIS) as the first step in integrating AMI with enterprise systems.

He says that one of the most important benefits of integrating AMI and geospatial technology is the ability to link the utility's service point and metered account with a customer's physical address. Mansoor says that getting 100 percent of customers correctly linked without integrating AMI with GIS is almost impossible. He estimates that most utilities may have 5 percent of customers incorrectly linked, and in some cases this could be as high as 10 percent.
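Reconciling the billing/AMI view of the world with the GIS view amounts to a join on meter ID, with any account whose meter has no matching GIS service point flagged as mislinked. A minimal sketch, with all records and field names invented for the example:

```python
# Illustrative reconciliation of billing accounts against GIS service points;
# every record here is invented, not real utility data.

billing_accounts = {        # account -> meter ID from the billing/AMI side
    "ACCT-1": "M1001",
    "ACCT-2": "M1002",
    "ACCT-3": "M9999",      # stale meter ID with no matching GIS service point
}

gis_service_points = {      # meter ID -> physical address from the GIS
    "M1001": "12 Oak St",
    "M1002": "14 Oak St",
    "M1003": "16 Oak St",
}

# Accounts whose meter ID cannot be found among the GIS service points.
mislinked = [
    acct for acct, meter in billing_accounts.items()
    if meter not in gis_service_points
]

pct = 100 * len(mislinked) / len(billing_accounts)
print(f"{len(mislinked)} of {len(billing_accounts)} accounts mislinked ({pct:.0f}%)")
```

Running this kind of audit is how a utility would actually measure the 5 to 10 percent mislink rate Mansoor describes, and the flagged accounts become the work list for cleanup.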

Once you can link service points to customers' physical addresses and geolocations, a whole range of other systems can be integrated with the AMI and GIS, including the outage management system (OMS), data analytics and the work management system. Together these will all contribute to reducing the restoration time after an outage.

For example, if the outage management system is integrated with the AMI and GIS, then an upstream trace from multiple failed service points to identify the common point of failure is feasible. The reverse is also feasible, identifying customers to contact when a piece of equipment or a substation fails.
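The upstream trace can be pictured as a walk up the distribution network's connectivity model: from each failed service point toward the substation, stopping at the first device every failed path has in common. A sketch under that assumption, with a toy network model invented for the example:

```python
# Sketch of an upstream trace to a common point of failure. The network
# model (meters, transformers, feeder, substation) is invented for
# illustration; a real OMS would trace the utility's GIS connectivity model.

upstream = {                      # child -> parent in the distribution network
    "meter_A": "xfmr_1",
    "meter_B": "xfmr_1",
    "meter_C": "xfmr_2",
    "xfmr_1": "feeder_1",
    "xfmr_2": "feeder_1",
    "feeder_1": "substation_1",
}

def path_to_source(node):
    """Ordered list of devices from a node up to the source."""
    path = []
    while node in upstream:
        node = upstream[node]
        path.append(node)
    return path

def common_failure_point(failed_meters):
    """Nearest upstream device shared by every failed meter's path."""
    paths = [path_to_source(m) for m in failed_meters]
    shared = set(paths[0]).intersection(*map(set, paths[1:]))
    # Walk the first path in order so we return the closest shared device.
    for device in paths[0]:
        if device in shared:
            return device
    return None

print(common_failure_point(["meter_A", "meter_B"]))  # xfmr_1
print(common_failure_point(["meter_A", "meter_C"]))  # feeder_1
```

The reverse trace mentioned above is just the same structure walked the other way: start at the failed device and collect every meter whose upstream path passes through it, which yields the list of customers to contact.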

With the workforce management system also integrated with AMI, GIS and other systems, it is possible to send crews to the exact point of failure.

But there is a data quality issue here, about which I have blogged on multiple occasions, especially relating to the quality of location information, which is a major challenge for many utilities. Poor quality location information about outdoor facilities can leave crews struggling to find failed equipment, especially at night or in snow, and can lead to severe delays in service restoration.

August 01, 2012

On Monday of this week about 370 million people in India experienced a power outage, though because outages are common in many parts of India, critical facilities like the Delhi Indira Gandhi airport, hospitals and police stations, and large commercial and industrial users had backup generators.

Then, on Tuesday, there was a 20-state blackout that cut power for between 620 and 680 million people, by far the largest blackout in history. Only about 40 percent of power was back up by mid-afternoon on Tuesday. At this point it is not clear what the cause of the outage was, but power grids are complex and determining the cause will take time. I remember that it took months to determine what caused the Northeast power outage of 2003.

In some countries the grid is becoming much more flexible as a result of smart grid and distributed power generation technologies. New technologies make it possible to restore power faster, track outages and restoration in real-time, and isolate sub-grids ("intentional islanding"), even at the home level, which can be kept going with local power sources when the central grid goes down. Home networks that incorporate renewable energy sources and power storage batteries enable homes to function off the grid. In countries like Bangladesh with a limited central power grid, a significant amount of electric power is completely local and not connected to a central grid, which means that these power users are unaffected by a nationwide outage.

In April of last year thirty tornados hit Alabama Power's operating area, completely destroying two substations, flattening transmission pylons, breaking 7,500 poles, and leaving 400,000 customers without power. Fortunately, Alabama Power had installed more than a million smart meters in its service area.

By looking for smart meters that were read the day before and comparing them with the meters that could not be read after the tornados, Alabama Power was able to put together a detailed map of where power had been lost, without making telephone calls, which is what utilities normally do in such cases. They could also tell emergency response officials whether the power was on or off in specific buildings - critical information that first responders need before entering a damaged building. In addition, they could also track power restoration trends as customers started coming back on-line. The smart grid may not be able to prevent an outage caused by this type of event, but it can help get power restored faster and more safely.

The University of Tennessee Power Information Technology Laboratory has used a network of small, low-cost devices that plug into wall outlets to show the electric system dynamics of the Western Interconnect during the San Diego blackout of September 2011. The devices record frequency, phase angle, and voltage every few seconds and transmit the data via the internet to the University of Tennessee and Virginia Tech. GPS synchronization of the network made it possible to show geographically how the system disturbance propagated.

Intentional islanding

The combination of an intermittent energy source like solar or wind together with a large electric power storage battery (a 36 MWh battery park was recently commissioned in China) represents a source of power that could function independently of the central grid and provide power to a local community. Intentional islanding can be used during a widespread outage like that in India to create power islands which can be designed to maintain a continuous supply of power locally during disturbances to the central grid. Critical facilities like airports and hospitals already employ a form of islanding, taking themselves off the central grid and using emergency backup generators.

A large manufacturer, Panasonic Corporation, has announced that it will start mass-production of a lithium-ion battery energy storage system for homes, specifically for the European residential market.

The battery system consists of battery cells with a capacity of up to 1.35 kWh and a battery management system designed to control battery charging and discharging. When the sun is shining the battery stores energy generated by solar photovoltaic (PV) panels. At night, when the PV system is not generating, the battery supplies up to a kilowatt-hour of stored energy.

In Bangladesh, renewable energy, primarily solar, is providing access to electricity for rural villagers, who are not likely to have access to the central grid in the near future. The Global Environment Facility (GEF) is helping to accelerate rural electrification through solar home systems (SHS) in rural areas where people live too far from the main electrical grids. The Renewable Energy and Rural Electrification project seeks to reduce barriers to the use of these climate-friendly energy systems and grow the market for renewables. Currently it is estimated that some 80,000 SHSs are installed each month, and in the last seven years over 1 million rural homes in off-grid areas have gained electricity through SHSs. Since SHSs are entirely off-grid, they are not affected by nationwide outages like the one in India earlier this week.

July 05, 2012

June 2011 to May 2012 was the warmest consecutive 12-month period in the continental U.S. in recorded history. This extreme heat has created conditions conducive to wildfires. In one day NOAA reported thirteen new large fires. Nationally, 52 large fires have burned more than 900,000 acres. Residents near several large fires across the country remain evacuated.

A fire that has attracted national attention is the Waldo Canyon Fire near Colorado Springs, where 347 homes have been destroyed and more than 20,000 homes and 160 businesses have been under threat. Another fire, the High Park Fire west of Fort Collins, has destroyed 257 homes.

Waldo Canyon Fire gas restoration

To give you some idea of what is involved from a utility perspective, this is a summary of Colorado Springs Utilities (CSU) gas restoration efforts as of July 3 at 11 am. CSU is responsible for electric power, water and gas services in the Colorado Springs area. CSU reported that about 3,000 CSU customers are back in their homes and businesses in the evacuation areas of Peregrine and Oak Valley Ranch. CSU has restored gas service to a little over half of these homes and will continue working through July 4 to restore the rest. Crews from Xcel Energy have arrived and are helping. In the particularly hard hit area of Mountain Shadows, which remains evacuated, engineers have determined that 1,200 homes are habitable, but the fire has ravaged the gas system. In this area the restoration process will take much longer than in other areas. Twelve miles of distribution line will have to be restored, and 350 service lines to homes that were destroyed or damaged need to be capped. Federal regulations require these caps be tested for 24 hours to ensure that they are safe. During this process the hard hit areas of Mountain Shadows must remain evacuated for safety reasons.

I must add that we at Autodesk recognize the tremendous commitment and effort that utility folks, both in the field and in the control center, make in dealing with these emergencies.

Technology is changing disaster management

Technology is changing how utilities respond to disasters like the fires in Colorado. One area that has changed dramatically is how the utility communicates with the public during and after a disaster. The other area is new sources of geospatial data that can provide much more timely information, what some refer to as near real-time data, during both the disaster and the restoration period.

Data quality

One of the problems that utilities face is data quality, especially relating to the location of underground facilities. The infamous gas explosion in Belgium in 2004 immediately comes to mind: poor data meant that the construction crew did not know there was a gas pipeline where they were excavating. In many disasters, such as the fires in Colorado, it is crucial to know exactly where underground infrastructure is. Colorado Springs Utilities has been making a major effort to implement standards-driven processes, built-in analysis, and materials management to improve workflows, which helps field crews during disasters by improving data reliability.

Social media

One of the areas which has dramatically impacted how utilities respond to disasters is communication with the public. Not only does CSU have a web site with up to date information about the fires, areas affected, and restoration efforts, but CSU is also on Facebook. For example, this was posted just Tuesday evening.

"As of this evening, we've restored service to 2,400 homes. Crews will hit the road again July 4. Thank you for your patience and support. We are proud to serve and call this community home. - Patrice"

Extensive flooding occurred in many areas of Queensland during late December 2010 and early January 2011, exacerbated by Cyclone Tasha, a category 1 tropical storm, with three quarters of the state, or 1,000,000 km², declared a disaster zone. In some areas flooding was so severe that mandatory evacuation was imposed on several towns. At GITA ANZ last year, Mark Volz of Ergon Energy gave a fascinating presentation on how Ergon monitored flooding and managed electric power during the disaster.

Ergon had to identify facilities where the power needed to be turned off and also had to keep the police informed of the status of electric power facilities across the state. Mark used a digital elevation model (DEM) of Queensland from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) global digital elevation model. Satellite radar images of the flooded regions of Queensland, produced by the COSMO-SkyMed satellite constellation and supplied to the Australian authorities by e-GEOS, were used by the University of New South Wales in Sydney, with the support of the Land & Property Management Authority and of the Cooperative Research Centre for Spatial Information, to compile temporal maps showing how the extent of flooding changed with time. The maps were sent to the Australian authorities within six hours of the images being received. Using the DEM data and the satellite imagery, Mark was able to identify Ergon facilities that had been inundated.

Prior to the tornadoes Alabama Power had deployed about 1.4 million smart meters in its operating territory. By looking for smart meters that had been read the day before and comparing them with the meters that could not be read after the tornadoes, Alabama Power was able to put together a detailed picture of where power had been lost, without making telephone calls. The application could also tell emergency response officials whether the power was on or off in specific buildings - critical information that first responders require before entering a damaged building. In addition, Alabama Power could track power restoration as customers started coming back on-line.

Real-time network monitoring

One of the most important challenges for modern grid control is being able to monitor a large number of intelligent devices in real-time. Burlington Hydro has implemented a real-time monitoring system that integrates their SCADA system, 65,000 smart meters reporting power use every 15 minutes, the supporting automated meter infrastructure (AMI), power line sensors, customer information system (CIS), their ERP system, engineering analysis, and other systems. It utilizes bidirectional communications to both receive information from and control smart devices. The system has to handle very large data volumes; for example, some devices report 60 times per second.

A web browser, either on the desktop or on a mobile device, is all that's required to access tools for asset maintenance, cable locates, asset management, operations, financials and drill downs, automated network pinning and work protection tagging, SCADA, schematic views, automated CAIDI, SAIDI and SAIFI reporting, an outage management system (OMS), mobile workforce automation, automated as-built management, real-time asset monitoring and analytics, and an executive dashboard. This allows, for example, live monitoring of all transformers on the grid. The transformer status monitoring dashboard not only shows a map with transformer loading in real-time in the form of a heat map, but also reports historical loading and can even estimate, based on the history of overloading on a particular transformer, how much the lifetime of the transformer has been shortened as a result of the overloading. It also allows the operator to reconfigure the grid in real time to reduce the load on overloaded transformers and redistribute it to others with available capacity.
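Estimating lifetime lost to overloading typically rests on an insulation-aging model of the kind described in IEEE C57.91, in which hours spent above the rated hot-spot temperature count as more than one equivalent hour of life. I don't know Burlington Hydro's actual method, so the sketch below is a simplified, generic illustration with an invented temperature history:

```python
import math

# Simplified sketch of transformer loss-of-life estimation from an overload
# history, loosely based on the IEEE C57.91 aging-acceleration model. The
# hourly hot-spot temperatures below are invented for illustration.

REFERENCE_HOTSPOT_K = 110 + 273  # normal-life hot-spot temperature, kelvin
B = 15000                        # aging-rate constant from the model

def aging_acceleration(hotspot_c):
    """Relative insulation aging rate at a given hot-spot temperature (deg C)."""
    return math.exp(B / REFERENCE_HOTSPOT_K - B / (hotspot_c + 273))

# Hourly hot-spot temperature history; hours above 110 deg C consume
# insulation life faster than one equivalent hour each.
history_c = [95, 105, 110, 120, 125, 110, 100]

equivalent_hours = sum(aging_acceleration(t) for t in history_c)
print(f"{len(history_c)} h of operation consumed "
      f"{equivalent_hours:.1f} equivalent hours of insulation life")
```

Summing the acceleration factor over the load history is what lets a dashboard turn a record of overload events into a "lifetime shortened by N hours" figure per transformer.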

Post-disaster imagery

Post-disaster high-resolution imagery helps governments quickly and cost-effectively assess damage and manage rescue efforts following a disaster. For example, Pictometry will image up to 200 square miles of affected areas of Federally declared disasters caused by hurricanes, tsunamis and earthquakes for customers at no charge, often within a few days. Pictometry even provides change analysis software to help quantify the impact of a disaster for several months following it.

Crowd-sourcing

One of the novel developments that is changing how information is collected and disseminated is crowd-sourced data, data that is uploaded directly from the field. In the case of the Queensland floods, a web site was developed by Mark Volz's team at Ergon Energy. The web site became the single source for all information relating to asset damage. The source of this information was what Mark called "internal crowd-sourcing": asset damage reports from operations staff in the field uploaded daily to the site.