
In the U.S., electricity regulation is the responsibility of the states, and there is wide variation in how the emerging electric power markets are regulated. It is apparent to virtually all state regulators that, under the impetus of distributed energy, the utility industry is having to change its business model. Because the 3,500 or so utilities are trying to meet the challenge of distributed energy with many different business models, state regulators are finding it difficult to create a regulatory framework for this new energy world.

In the 1990s and early 2000s a number of states decided to deregulate their electric power markets to encourage competition. Typically this meant requiring incumbent utilities to divest their generation facilities and become regulated transmission and distribution entities. Deregulation encouraged independent power retailers to enter the energy market, and it was hoped that it would reduce costs and provide consumer choice. In restructured states utilities are barred from owning power plants so that a competitive energy market can emerge. Private energy providers have argued that the same rules should apply to distributed energy, so that utilities cannot use their market power to dominate the renewable energy market.

According to the Edison Electric Institute, in 1995 generation accounted for about two-thirds of the price of electricity. Today, generation is less than half of the price of electricity. In many parts of the country the wholesale price of electricity has dropped significantly while the retail price has increased - in other words, electricity is cheap, the grid is expensive. This has resulted in disintermediation: large industrial and commercial customers in states with high regulated rates have shifted their power purchases to lower-cost energy providers or have decided to self-generate.

There is a lot of diversity among the alternative business models and regulatory approaches being adopted by the 3,500 utilities in the U.S. At the SGIP conference, California, New York, and Texas were the states most often mentioned.

California

Utility-owned photovoltaic installations account for 52 percent of the solar capacity installed in the United States, according to the Solar Energy Industries Association. In California, Southern California Edison (SCE) and PG&E have invested in utility-scale solar power plants, but they buy most of their renewable power from plants owned by private investors. For example, the Ivanpah solar plant is owned by private investors including Google, and both PG&E and SCE have long-term commitments to buy the plant's power. California also encourages consumer rooftop solar through a net metering program that pays retail rates for solar PV power generated by consumers.

Sacramento Municipal Utility District (SMUD) is one of the most forward-looking utilities in North America. It was the first California utility to reach 20% renewable power and the first to commit to 33% renewables. SMUD is moving from a centralized utility with a business model based on selling electricity to a distributed utility providing localized grid services - out of the business of selling electricity and into the business of selling grid services.

One of the most interesting innovations at SMUD is its rate structure. SMUD and other utilities are in the unusual position for a retailer of trying to sell less of their product, in this case electricity. People are using less electricity as a result of personal interest as well as SMUD's own energy efficiency programs. But the money to maintain the grid has to come from somewhere, so SMUD has introduced a flat infrastructure fee. Currently every customer pays $18/month for the grid, independent of how much power they consume. SMUD has determined that $28 is the breakeven point, where the cost of maintaining the grid would be covered by infrastructure fees, and is moving towards that monthly charge.
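To make the arithmetic concrete, here is a minimal sketch in Python of how a fixed infrastructure fee decouples grid revenue from consumption. Only the $18 current fee and the $28 breakeven fee come from SMUD; the volumetric rate and usage levels are assumptions for illustration only.

    # Hypothetical illustration of a fixed infrastructure fee.
    # The $18 current fee and $28 breakeven fee are from the article;
    # the volumetric rate and usage figures are assumed.
    VOLUMETRIC_RATE = 0.14   # $/kWh, assumed
    CURRENT_FEE = 18.0       # $/month
    BREAKEVEN_FEE = 28.0     # $/month

    def monthly_bill(kwh, fee):
        """Total bill = fixed grid charge + volumetric energy charge."""
        return fee + kwh * VOLUMETRIC_RATE

    for kwh in (300, 600, 900):
        print(f"{kwh} kWh: ${monthly_bill(kwh, CURRENT_FEE):.2f} today, "
              f"${monthly_bill(kwh, BREAKEVEN_FEE):.2f} at breakeven")

    # The fee term is constant in kWh, so as customers conserve or
    # self-generate, the grid portion of revenue no longer erodes.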

Another interesting innovation at SMUD is its solar power program, which gives customers the option to buy into solar power without having to install solar panels on their roofs. The first solar program was a 1 MW solar PV program designed for residential customers. The next is the recently opened 11 MW program for commercial customers. The new plant at the Rancho Seco Solar Power Plant site is owned by D. E. Shaw Renewable Investments, and power purchased from the project will provide energy for SMUD's commercial SolarShares® program, which is designed to provide business customers with a low-cost solar alternative to installing panels onsite.

Texas

In 1999 the Texas Legislature began restructuring the state's electric industry, allowing consumers to choose their retail electricity provider. As of Dec. 31, 2001, deregulation of the ERCOT electricity markets required investor-owned utilities to unbundle their operations, and the provision of service to end-use retail customers became competitive. There are over 100 retail electricity providers, of which more than a quarter offer their customers a 100% renewable plan as an option. Since 2002, approximately 85% of commercial and industrial consumers have switched power providers at least once, and approximately 40% of residential consumers in deregulated areas have switched from the former incumbent provider to a competitive retail energy provider. One 2011 study found that Texas customers paid up to one-third less in 2010 than in 2001, before deregulation and competition.

In 2007 TXU, the largest utility in the state, was acquired by a consortium and broken up into an electricity distribution company (Oncor Electric Delivery), a power generation company (Luminant), and a retail electricity provider (TXU Energy) with no distribution or generation assets.

At the SGIP conference Cheryl Mele, Senior Vice President and Chief Operating Officer at ERCOT, pointed out that wind has at times supplied 48% of Texas power demand. Over the past year wind provided 12% of all power generation. Some electric utilities in Texas, most of which participate in the ERCOT grid, invest in renewable energy generation. For example, NRG Energy, an integrated energy company, wholly or partially owns 1.1 GW of wind and solar power.

New York

The state regulator getting the most attention is the New York Public Service Commission (PSC). In the United States, a number of states have restructured their electric power industry with the goal of opening what had been a monopoly market to competition. In New York, the PSC has been aggressive in changing regulation ("Reforming the Energy Vision" or REV) to promote a competitive electric power market.

The PSC has taken a major step toward a future of transactive energy. REV is moving the New York State electric utility industry toward a radical redefinition of utilities as we have known them over the hundred years since Tesla and Edison created the electric power industry. In the future the New York State utility industry will consist of distributed system platform (DSP) providers, basically providing the grid but not directly selling energy, and a competitive energy market with many energy providers, including bulk power generators and distributed energy generators such as you and me with rooftop solar panels. This model is similar to the UK model, which is disrupting the traditional energy market there.

In many of these states, in response to policies promulgated by state regulators, utilities are increasingly looking at "non-wires alternatives" to address system overloads. Non-wires alternatives include distributed energy resources (DER), demand response, energy efficiency and other non-traditional means of addressing system overloads instead of building conventional transmission and distribution infrastructure. State regulators are asking utilities to use non-traditional solutions to address grid problems, and in some cases utilities are asking the competitive market to come up with ideas.

For example, ConEd and New York State Electric & Gas (NYSEG) circulated solicitations for non-wires alternatives. ConEd is interested in demonstration projects to determine whether utilities can derive significant market-based benefits from integrating energy storage (utility-scale batteries) into their electric systems. ConEd has suggested that storage could reduce operating expenses, reduce peak demand for generation, manage distributed energy resources, or provide customer benefits such as power quality or reduced risk of outages. NYSEG has issued a request for proposals for non-wires alternatives, including distributed generation, demand response, energy efficiency, and energy storage, that can reduce peak loading on a transformer bank; the utility hopes to defer an expensive upgrade to one of its substations. In both cases the utilities have excluded fossil-fueled proposals other than those using natural gas or propane.

But where these solicitations involve distributed energy or storage, the PSC will only allow utilities to own distributed energy sources under very restrictive circumstances:

The utility issues a solicitation, but the market response is inadequate or too expensive

A project consists of energy storage on the utility's property and integrated into the utility's distribution system

A project will enable low or moderate income residential customers to benefit from the resource where markets are not likely to satisfy the need

A project is being sponsored for demonstration purposes.

For example, Consolidated Edison (ConEd) had an electrical overload problem in Brooklyn and Queens. Instead of building a new substation estimated to cost $1 billion, it proposed a non-wires solution that fosters the development of local energy sources, demand management and energy efficiency. As part of the solution, ConEd planned to install battery storage on its own property. In this special case the Commission approved ConEd's ownership of the storage because it would be on the utility's property and integrated into its system, but it reiterated that it will only allow utilities to own distributed energy resources when the open market does not offer a solution.

Midwest

In Illinois the 1997 Customer Choice Act mandated that state utilities could no longer own generation facilities. During the Mandatory Transition Period (1997-2007), utilities were required to sell their electricity generation assets to other energy companies; they no longer sold power and became responsible only for delivering electricity. Since 2002, customers in the deregulated market have been permitted to buy electricity from competing retail electric providers. In 2006 the Retail Electric Competition Act encouraged residential and small business customers to switch to alternative electric providers.

In July 1999, Ohio restructured its energy market to give consumers choice of energy provider. The law took effect on January 1, 2001. Customers could now choose to buy energy from retail electric suppliers instead of automatically receiving it from the utility company in their area (primarily AEP Ohio, Dayton Power & Light, Duke Energy, and FirstEnergy). In May 2008, each incumbent utility was required to shed its power generation operations and become a purely electric distribution utility.

FirstEnergy, headquartered in Ohio and one of the largest investor-owned electric utilities in the U.S., with customers in Ohio, Pennsylvania, West Virginia, Maryland, New Jersey, and New York, has just announced a strategic decision to exit the competitive power business, not just in Ohio but in all its service territories. FirstEnergy is getting out of the business of generating and selling power and plans to become a purely regulated transmission and distribution entity.

Southeast

In the Southeast, Duke Energy is a vertically integrated energy company with generation, transmission and distribution assets. Duke Energy owns, operates or maintains renewable energy assets both in its regulated service territories and through Duke Energy Renewables.

Since 2012, the Southern Company system has added or announced more than 4 GW of renewable generation. Southern Power owns more than 1,600 megawatts of solar generating capacity at 26 facilities operating or under construction in California, Georgia, Nevada, New Mexico, North Carolina and Texas. Seventeen of these facilities are co-owned with third parties, with Southern Power holding the majority stake. Georgia Power, a Southern Company subsidiary, is a national leader in solar energy and began offering rooftop solar to customers in July 2015. Alabama Power, another Southern subsidiary, is currently constructing two 10.6-megawatt solar facilities.

Florida has had a law for nearly a century that gives utilities a regional monopoly on power production. In some states solar companies like SolarCity can install panels for free and then sell the power to the home or business owner at a rate lower than the local utility's. Such third-party sales are generally illegal under the Florida law.

Nevada

The Nevada Public Utility Commission has just re-established its previous, favorable rates for the roughly 32,000 Nevada customers who already own rooftop solar panels; these solar customers had been facing a significant rate hike. However, new solar customers will face the less favorable rates adopted in December. SolarCity and other solar providers have said that the new rates make it uneconomic for them to do business in the state. NV Energy is developing its own solar energy sources, often together with large customers who have requested solar energy. Three casinos opted to leave the grid. The exit fees for leaving the NV Energy grid are so exorbitant that one casino (Wynn) is contesting the fee in court. Another opted to collaborate with NV Energy on the development of a solar PV farm. The third (MGM) has opted to pay $86 million to leave the grid, rely on its huge rooftop solar array on the Mandalay Bay, and buy the remainder of its power needs on the open market.

Summary

It should be clear that there are many different approaches to creating, regulating and adopting a new business model among the 3,500 or so electric power utilities in the U.S. A consensus on the way forward for U.S. utilities has not been reached, and it appears that different regions will end up with different business models and different regulatory policies. It is clear that whatever business model and regulatory framework are ultimately adopted in each region, they will all have to address the big issues: increasing customer choice; a growing array of consumer, commercial and industrial applications (hardware and software) that interact with the grid; distributed energy; microgrids, which may or may not be connected to the grid; and transactive energy. Gartner has forecast that by 2020 the largest electric power utility will be an Uber-like entity that neither owns infrastructure nor generates electricity, but manages energy flows by putting energy producers together with energy consumers.

November 09, 2016

Anne Pramaggiore, President and CEO of ComEd of Chicago, was the first at the Smart Grid Interoperability Panel 2016 Grid Modernization Summit in Washington, DC to call the current business transformation that the electric power industry is experiencing a revolution, but many other speakers agreed with this characterization. Frequently mentioned were the regulatory and utility business transformations occurring in California and New York (REV). Sacramento Municipal Utility District (SMUD) was cited as an example of a utility aggressively driving change in the utility business model.

Robert Wilhite of Navigant Research pointed out that their recent research had found that 90% of utilities recognize that utility business models have to change. But there are different opinions about the urgency of this transformation: Navigant found that about half believe it has to happen immediately, while the remainder believe it can happen later.

Of course none of this would be possible without the enabling technology of the smart grid. Dave Velazquez, President and CEO of Pepco, now part of Exelon, recollected that the first smart grid development at Pepco, by no means a laggard in adopting new technology, occurred only a decade ago.

Distributed energy

Virtually every speaker singled out distributed energy resources (DER), especially solar PV, as the main driver forcing the business transformation of the utility industry. Originally seen as a response to a warming climate, solar PV is now seen as providing customers not only with a choice of a green energy source but increasingly with a less expensive alternative to the local utility. The price drop has been so substantial that solar PV is achieving grid parity in many parts of North America, even at a time of unusually low natural gas prices that are responsible for closing many coal-fired generation plants. Another important contributor to the revolution is the electrification of transportation: not only EVs but also buses, rail and even trucks.

But none of this could have happened without the enabling platform of the smart grid. The conjunction of technological advances in packet-switched wired and wireless communications networks and intelligent electronic devices underlies the DER advances that have created choice for customers. As we have seen previously in other industries such as telecom, consumers have grasped the opportunity and are now beginning to drive changes in the energy market.

Microgrids

Another topic that came up repeatedly was microgrids. Duke Energy has been running a microgrid pilot for several years, from which emerged the OpenFMB standard with an OpenFMB working group to support the effort. ComEd has five microgrid pilots and is starting the country's first microgrid cluster. Pepco has two microgrid projects underway.

At DistribuTECH 2016 and at this SGIP Summit an implementation of OpenFMB was demonstrated live by Green Energy Corp. It was also announced at the summit that the source code is now available for download from the openFMB.io site.

In addition a number of vendors were offering solutions for managing microgrids, including the KEPCO Consortium from Korea, which has built a comprehensive microgrid solution.

Geospatial

Although this was not a geospatially focused conference, GIS and geospatial technology came up in several areas.

Chandu Visweswariah, an IBM Fellow, outlined the challenge of transforming the grid in three points. First, several trends are growing exponentially, among them the number of sensors, the volume of data, and renewable energy sources. Second, we are increasingly relying on intermittent energy sources like wind and solar PV. Third, this is not your grandmother's software: this is software that provides a platform supporting sophisticated mathematical algorithms. Among the most important are algorithms that nowcast and forecast weather at very high resolution, making it possible to predict wind velocity not at ground level but at the elevation of the turbines and at the location of the wind farm, and similarly insolation for solar PV. These wind and insolation maps are effectively the generation forecast minutes and hours in advance.

The KEPCO Consortium demonstrated its μ-Grid platform. KEPCO has already partnered with PowerStream in Ontario to provide a utility-scale microgrid. Earlier this year PowerStream and KEPCO officially launched the microgrid, which has the capacity to provide several hours of backup power for approximately 400 customers residing or owning a business in Penetanguishene. The microgrid is helping to reinforce the existing grid by increasing resiliency and operational flexibility. The platform provides a GIS Studio that allows 35 types of power facilities, such as transformers, to be visualized and edited on a map, over either imagery from satellites or overflights or a topographic map.

Another application of geolocation is cybersecurity, where advanced cybersecurity applications use location to verify identity.

Distributed intelligence

Several speakers pointed out that the volume of data has simply become too large for a centralized IT system that collects all the data at a central control point where analysis and visualization are performed. Pushing intelligence out into the field reduces the data volumes that have to be handled at the central station and reduces the data flow over the network.
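As a minimal sketch of the idea (hypothetical voltage samples and summary fields, not any particular utility's system), an edge device might reduce an hour of one-second readings to a single summary record before anything crosses the network:

    import random
    import statistics

    # Hypothetical feeder sensor: one voltage sample per second for an hour.
    samples = [240 + random.gauss(0, 2) for _ in range(3600)]

    # The edge device analyzes locally and forwards only a summary record.
    summary = {
        "n": len(samples),
        "mean_v": round(statistics.mean(samples), 2),
        "min_v": round(min(samples), 2),
        "max_v": round(max(samples), 2),
    }

    # 3,600 raw readings stay in the field; four numbers go upstream.
    print(summary)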

Standards

Orange Button

Modeled on the very successful Green Button program, an important standards initiative is Orange Button. It was initiated by the Department of Energy (DOE) with the objective of making it easier for investment to flow into solar power by standardizing electronic reporting of solar installation performance. Orange Button targets a reduction in soft costs by streamlining the collection, security, management, exchange, and monetization of solar datasets across the solar value chain. It is expected that creating an industry-driven, standardized data landscape will facilitate the growth and expansion of distributed solar. As the solar market continues to expand, it is critical that the collection, management, and exchange of solar datasets - especially those that affect the bankability (ability to secure loans for solar installations) of solar assets - are coordinated and streamlined to protect consumers, increase efficient pricing, and support new and existing businesses entering the solar marketplace. In April of this year the Department of Energy announced four awardees (kWh Analytics, National Renewable Energy Laboratory, Smart Grid Interoperability Panel, and SunSpec Alliance) that will collaborate on the overall goal of standardizing the way solar data is collected and exchanged. SGIP was chosen to lead the effort.

At SGIP the effort is being led and coordinated by Dixon Wright of Wells Fargo, the only person I was aware of at the SGIP event representing a financial institution. I think the presence of a major financial player such as Wells Fargo is a strong indication that solar is moving into the big leagues of financing. Dixon is passionate about the Orange Button initiative and believes that once a draft standard and implementation become available and a few users champion it, it will catch on rapidly with the large financial institutions and the solar industry.

OpenFMB

In March 2015, the Smart Grid Interoperability Panel (SGIP) formally kicked off its OpenFMB effort to foster field interoperability, drawing upon Duke Energy's previous Coalition of the Willing I and Distributed Intelligence Platform work. By the 2015 SGIP Annual Conference in November, three microgrid use cases had been developed and demonstrated: optimization, unscheduled islanding, and grid reconnection.

In February 2016, the first OpenFMB reference implementation, utilizing SGIP's microgrid use cases at Duke Energy's Mount Holly microgrid test center, was demonstrated at the 2016 DistribuTECH Conference by Duke Energy's Coalition of the Willing II (COW-II) vendor partners. For the annual SGIP Grid Modernization Summit in November 2016 the SGIP OpenFMB task force published a series of related foundational use cases centered on Distributed Energy Resources (DER) Circuit Segment Management, the active coordination of power systems equipment with DER, including a microgrid. Green Energy Corp demonstrated an OpenFMB implementation connecting a solar PV installation with other equipment and applications. This is an open source project, and it was announced at the SGIP event that the source code is now available and can be downloaded from openfmb.io. There is a free webinar replay, "SGIP Grid Modernization Summit OpenFMB Demo Preview."

Interoperability

Several speakers were on hand from the Department of Energy's Grid Modernization labs. Since this was an SGIP event, they focused on the interoperability research underway at 13 Federal labs and 150 partner organizations. First, research is underway to find a way to measure interoperability. Second, there is an important initiative, underwritten by several million dollars of financing, to determine how to value interoperability. Interoperability contributes to business efficiencies by enhancing the opportunity for competitive products and by creating standards-based platforms with which intelligent products from different vendors - whether smart meters, synchrophasors, reclosers, or substation components - can plug and play, ultimately reducing prices for utilities. If that contribution can be quantified, it will be easier to justify and prioritize greater investment in the standards and implementations that provide interoperability.

November 08, 2016

Important new research on the reliability of the generally available technology for detecting and geolocating underground utility infrastructure, and on the costs of hitting utilities during excavation (known as strikes in the UK), was presented at the Year in Infrastructure 2016 conference in London. Dr. Nicole Metje, Professor of Geotechnical Engineering at the University of Birmingham, presented some of her most recent research on a major problem affecting the construction industry worldwide.

Background: utility strikes in the UK

About 4 million excavations are carried out on the UK road network each year to install or repair buried utility pipes and cables. Not knowing the location of buried assets causes practical problems that increase costs and delay projects, but more importantly it increases the risk of injury to utility owners, contractors and road users. The problems associated with inaccurate location of buried pipes and cables are serious and are rapidly worsening due to the increasing density of underground infrastructure in major urban areas. In the U.S. it is estimated that an underground utility is hit about every minute, and underground utility conflicts and relocations are the number one cause of project delays during road construction.

The standard tool in the U.K. for detecting underground utilities is the cable avoidance tool, known as a CAT. The CAT can detect signals naturally radiating from metallic services, or it can be used in conjunction with a Genny, which applies a distinctive signal that the CAT can detect. The adoption of ground penetrating radar is increasing, but compared to the CAT it remains relatively expensive and requires trained operators.

Underground asset detection is the last frontier of remote sensing. In the U.K. it has been perceived as a serious problem for at least a decade. The Mapping the Underworld (MTU) project was initiated in 2004 to develop a multi-sensor platform to locate and map in 3D all buried utility assets - without excavation. The 10-year research program, largely funded by the Engineering and Physical Sciences Research Council (EPSRC), involved the University of Birmingham, University of Bath, British Geological Survey, University of Leeds, Newcastle University, University of Sheffield, University of Southampton, and UKWIR. The project concluded that a combination of multiple technologies is essential for reliable detection of buried assets, and it aimed to develop a remote sensing platform that combined several detection technologies together with software that optimizes the combined data. Achieving a 100% location success rate without disturbing the ground is the 'holy grail' of underground remote sensing. The benefits of knowing where underground infrastructure is located are well documented.

Since then a second major multi-disciplinary, multi-university research project, Assessing the Underworld (ATU), has been funded, largely by the EPSRC. The project aims to create prototype multi-sensor devices that will enable the condition of underground assets to be assessed remotely, without excavation. The technologies include above-ground remote sensing and robotic devices that can be deployed in a water distribution or sewer pipeline.

In 2012 the U.K. civil engineering contractors' association CECA conducted a survey of its members on the topic of underground strikes. It asked questions about the technologies used and whether the respondent has internal processes for tracking and estimating the costs associated with utility strikes. I have paraphrased the questions.

Does your organization quantify the direct costs of utility strikes on a regular basis? Yes 57%, No 43%

Do you track and quantify other costs (indirect and/or social) incurred during service strikes? Yes 29%, No 71%

Do you have a separate program or project for determining and recording the location of underground utilities? Yes 14%, No 86%

Does your field staff use a Genny as much as it should? Yes 14%, No 86%

Do you think underground utilities would still be damaged during excavation if your staff used a Genny as often as it could? Yes 0%, No 100%

This suggests that even the limited tools available today for underground detection (see below) are not used to their fullest extent. The direct costs of utility strikes are still not quantified and tracked in many firms, and very few firms are aware of the indirect costs associated with utility strikes. Probably as a result of not knowing the costs, very few firms have an explicit program for geolocating underground utilities.

Risk of underground utility strikes

Dr. Metje presented new results from her research on utility strikes in the U.K. She pointed out that unlike the U.S., where collecting statistics on underground hits is relatively easy, in the U.K. it is a challenge. In spite of the obstacles, she managed to collect data on 3,348 strikes, defined as incidents during excavation that involved damage to underground utilities. Some of the very interesting results of the analysis include:

Of 255 strikes where pre-excavation CAT scans had been carried out, in 52% of the cases the utility had been detected before the strike.

Of 187 strikes where utility plans/drawings (as-builts) had been reviewed before excavation, in 48% of the cases the utilities hit were shown on the plans.

Of 89 strikes where the plans/drawings (as-builts) showed the utility, in 84% of the cases respondents reported that the location of the utility was inaccurately plotted. 16% said it was accurate.

In other words, even when it is known that there is utility infrastructure underground, there is still a very significant risk of hitting it - perhaps because its location is not known accurately enough. It also provides further evidence that, in general, as-builts cannot be relied on, either because they are really "as-designed" or because they have not been maintained and do not record modifications made since the infrastructure was first built.

Cost of hitting underground utilities

Dr. Metje presented important results on the direct costs of utility strikes in the U.K., by type of utility:

Electricity: £970
Gas: £485
Telecom: £400
Fibre-optic: £2,800
Water: £300-£980

However, Dr. Metje has found that the true costs associated with utility strikes are much higher than this. Direct costs include the costs of sending a crew to assess and repair the damaged pipe or cable. Indirect costs include the impact of traffic disruption as a result of the strike, any injuries and other impacts on the health of the workers directly involved or people in the immediate neighbourhood, and the custom that businesses lost as a result of the traffic disruption. Dr. Metje's research suggests that the true cost is about 30 times the direct cost. This is a startling result which indicates that the cost to society of underground utility strikes is much, much larger than is generally believed.

In the U.K. Publicly Available Specification (PAS) 128 governs the recording of the location of underground utilities. PAS 128 was developed under the auspices of the British Standards Institution (BSI) and sponsored by the Institution of Civil Engineers (ICE). PAS 128 is aimed at practitioners - surveyors who make their living detecting and reporting the location of underground utilities for construction contractors and utilities. The primary objective is to reduce risk for construction contractors: unlike the U.S. and Canada, where we have one-call centres, in the U.K. if a utility is disrupted by a construction project the liability lies entirely with the contractor.

PAS 128 has aspects of both the U.S. and French systems. It is similar to the US ASCE standard in that quality levels (A) through (D) relate to how the location was captured. The British standard differs in that it adds sub-categories B1 through B4, which specify the absolute precision of the remotely sensed location.

QL D - Location of underground structures determined by a review of existing utility (paper) records.

QL C - A physical reconnaissance of the site has been performed, identifying features of the network that are visible above ground.

QL B - Remote detection technology such as electromagnetic location or ground penetrating radar has been used to detect the location of the underground facilities. A sub-classification B1-B4 specifies the estimated precision of the measurements; for example, B3 corresponds to +/- 0.5 meters.

QL A - Verification, typically by potholing.

Dr. Metje reported revealing research aimed at assessing the PAS 128 standard and validating the reliability of the MTU (Mapping the Underworld) sensor technologies on a site where underground truth could be determined. Five commercial companies were invited to conduct a utility survey of the chosen area to the PAS 128 standard. The firms were asked to survey the underground utilities in the same area and report the locations in the form of a map, and the results were compared with those of a survey conducted with the MTU platform. The firms agreed on the location of the wastewater, storm sewer and water utilities (pipes). But in the case of telecom and electric cables there was substantial disagreement: the total length of telecom and electricity cables detected by different firms varied widely, from 90 to 240 meters. This is a startling indication of the risk associated with existing underground utility detection technologies.

October 31, 2016

At the Year in Infrastructure 2016 conference in London the Be Inspired Awards set some new records. There were 311 submissions from 22 countries. Of these, 110 were completed and operational projects and 120 were under construction. 60 projects were chosen as finalists in 18 categories, including asset performance, power generation, rail and transit, mining, roads, utilities and communications, water network analysis, and construction.

15 of the 60 finalists reported using reality capture, which is evidence of remarkably rapid adoption given that Bentley only acquired a reality capture capability (Acute3D, since renamed ContextCapture) in February of last year. At this year's event Bentley announced that ContextCapture can now process LiDAR point clouds.

October 25, 2016

Introduction

Building Information Models are comprehensive digital representations of buildings that include 3D geometry and associated data. They are widely used for designing, engineering, and constructing buildings (vertical BIM) and infrastructure (horizontal BIM), and owners are beginning to see value in BIM models and data for operating buildings. A BIM model includes not just the 3D geometry of all building elements and the enclosed spaces, but also semantic information representing the types of the elements, the relationships between them (defined by the data model) and data associated with the geometric elements. With BIM models being mandated in countries like the UK, Finland and Singapore, and soon in France and Russia, and with increasing adoption of BIM by the private sector in the U.S. and other countries, the volume of BIM data is growing rapidly. By merging models from different disciplines into common virtual models, the amount of information stored in such multidomain repositories quickly becomes large and complex even for average projects. Centralized or distributed repositories enable the creation, integration and maintenance of large quantities of data from the various stakeholders involved in the planning and building process and from the wide range of specialized software tools used by these stakeholders. Many ways of creating, manipulating and visualizing BIM and CAD models have been developed, including CAD, Solibri, Navisworks, GTPPM, mvdXML, PQML, EQL, SQL, XSLT/XQuery, SPARQL, Gremlin, Linq, SDAI, and generic programming languages. However, the development of a general, intuitive query language for BIM models including geometric and related information remains a challenge.

The information needs of individual stakeholders in different parts of the construction process are very different. For example, a structural engineer does not require all the information available in the overall building model such as the detailed specifications of a suspended ceiling. This makes the extraction of partial model subsets or domain specific views on large models necessary. In addition regulations, energy performance modeling, and building maintenance require finding specific facilities among the building assets.

In the last few years research has focused on the development of an open, domain-specific query language for BIM. Most of this research has focused on querying, updating and deleting data stored in Industry Foundation Classes (IFC) models. IFC is a vendor-neutral and open model that defines more than 650 entities and a few thousand attributes. Most industry BIM software tools are able to export IFC models, though there is typically some information loss. The bimserver.org platform is an open repository of IFC models and already provides a Java API to extract partial building information models from a repository.

Open query languages

Many existing database applications use the Structured Query Language (SQL), an ANSI/ISO standard with spatial extensions standardized by the Open Geospatial Consortium (OGC), as the standard language. SQL makes it possible to create, read, update and delete records in a Relational Database Management System (RDBMS). But the applicability of SQL in a BIM context is limited. SQL is based on operations on data stored in tables; applying it to IFC, as a BIM example, would require storing more than 650 entities along with their thousands of attributes in individual tables. Severe performance and scalability issues are associated with using SQL for BIM, and IFC-based queries would rapidly become too complex.

The W3C Data Activity has defined a collection of technologies comprising a "web of data" where information is easily shared and reused across applications. Key parts of this technology stack are the RDF (Resource Description Framework) data model, the OWL Web Ontology Language and the SPARQL protocol and RDF query language, all standardized by the W3C. RDF is a directed, labeled graph data format for representing information on the Web. RDF and SPARQL enable interoperability among distributed heterogeneous information systems, making possible queries that span several repositories and link information from different sources. Specifically, in a BIM context they enable ad hoc queries of sub-models that reside in repositories maintained by the different stakeholders in a building project.

Spatial queries

BIM data models can be simple or complex. A complex data model includes logical relationships between building elements, such as windows and walls, walls and storeys, and so on. These relationships are created at design time. A simple model may not include any explicit relationships; for example, there may not be any way to determine which windows are associated with a wall except visually. Spatial queries enable such relationships to be determined after the creation of the BIM model. For example, a spatial query can find all the windows in a wall, all the walls of a storey, or all the HVAC and plumbing facilities in a storey.

The efficient implementation of spatial BIM queries requires 3D spatial indexing, such as octree and multi-dimensional r-tree indexes, and associated algorithms. It has been known for a long time that spatial indexing is a key technology enabling efficient GIS systems, and very early in the development of spatial indexes it was realized that they are domain specific. Several GIS systems support an extended spatial SQL based on quadtree or r-tree indexes. These systems evolved historically from a 2D geometric representation, however, so the functionality for processing 3D geometry in many GIS systems is restricted and the performance of the available tools is unsuitable for complex building data sets. Therefore, it has been argued that a specialized tool is required to handle queries against detailed BIM models containing vast numbers of elements and detailed geometry representations. Such a query language enables the spatial analysis of BIM models and the extraction of partial models that fulfill certain spatial constraints. Among other features, it includes operators that reflect the topological relationships between 3D spatial objects, including "within", "contain", "touch", "overlap", "disjoint" and "equal" in 3D space. Both octrees and multi-dimensional r-tree indexes have been investigated as a basis for an efficient BIM query language.
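A minimal Python sketch of the octree idea follows (illustrative only; production BIM indexes are considerably more sophisticated). Elements are indexed by axis-aligned bounding boxes, and a query returns the elements whose boxes intersect a search region.

    # Minimal octree over axis-aligned boxes (xmin, ymin, zmin, xmax, ymax, zmax).
    # Illustrative sketch only, not a production BIM index.

    def boxes_intersect(a, b):
        return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

    def box_contains(outer, inner):
        return all(outer[i] <= inner[i] and inner[i + 3] <= outer[i + 3]
                   for i in range(3))

    class Octree:
        def __init__(self, bounds, depth=0, max_depth=5, max_items=4):
            self.bounds, self.depth = bounds, depth
            self.max_depth, self.max_items = max_depth, max_items
            self.items, self.children = [], None

        def _split(self):
            x0, y0, z0, x1, y1, z1 = self.bounds
            mx, my, mz = (x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2
            self.children = [
                Octree((xa, ya, za, xb, yb, zb), self.depth + 1,
                       self.max_depth, self.max_items)
                for xa, xb in ((x0, mx), (mx, x1))
                for ya, yb in ((y0, my), (my, y1))
                for za, zb in ((z0, mz), (mz, z1))
            ]

        def insert(self, element_id, box):
            if (self.children is None and len(self.items) >= self.max_items
                    and self.depth < self.max_depth):
                self._split()
                old, self.items = self.items, []
                for eid, b in old:
                    self.insert(eid, b)
            if self.children is not None:
                for child in self.children:
                    if box_contains(child.bounds, box):
                        child.insert(element_id, box)
                        return
            self.items.append((element_id, box))  # straddles children: keep here

        def query(self, box):
            hits = [eid for eid, b in self.items if boxes_intersect(b, box)]
            for child in self.children or []:
                if boxes_intersect(child.bounds, box):
                    hits.extend(child.query(box))
            return hits

    tree = Octree((0, 0, 0, 100, 100, 30))
    tree.insert("wall-1", (10, 10, 0, 20, 11, 3))
    tree.insert("window-7", (12, 10, 1, 14, 11, 2.5))
    print(tree.query((9, 9, 0, 21, 12, 3)))  # both elements intersect the region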

Open BIM query language

A BIM query language, dubbed BIMQL, has been proposed as an extendable, open, domain-specific query language for BIM models. It can be used to select and update partial aspects of building information models. A limited implementation of the language has been built on top of the bimserver.org platform. It is not complete - in particular it does not support spatial queries - but it has provided a useful vantage point for future research, discussion and development of end-user interfaces to complex BIM models as well as the composition of service-oriented architectures.

Unlike traditional data modeling approaches, which are limited by the scope of their underlying schemas, RDF and related technologies can provide an open and common environment for sharing, integrating and linking data from different domains and data sources. In comparison with domain-specific query languages, SPARQL is especially applicable in scenarios where data from multiple sources are needed. Some important scenarios include:

All building objects should be classified according to the local building code

Where are the companies located that produce the materials used in the walls placed in the hall?

The interesting thing about these scenarios is that they require the BIM geometry and data plus other external data. This is the type of query that the W3C semantic web technologies are designed to support.

In the AEC domain, a typical use case is data integrity validation against BIM standards or building code requirements. For example, the Dutch BIM standard requires that every building element be associated with the appropriate building storey. Provided that the building model is represented in the standardized ifcOWL, it is possible to check this spatial containment relationship using an off-the-shelf SPARQL implementation, as the sketch below illustrates.
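A minimal sketch of such a check, using Python's rdflib: the namespace URI and the property names follow ifcOWL's attributeName_EntityName convention but should be treated as illustrative rather than exact, and the model file name is assumed.

    from rdflib import Graph

    # Load a building model converted to ifcOWL RDF (file name assumed).
    g = Graph()
    g.parse("building.ttl", format="turtle")

    # Find building elements not contained in any storey via
    # IfcRelContainedInSpatialStructure. Namespace and property
    # names are illustrative, not exact ifcOWL identifiers.
    query = """
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    PREFIX ifc: <http://ifcowl.openbimstandards.org/IFC2X3_Final#>
    SELECT ?element WHERE {
      ?element a ?type .
      ?type rdfs:subClassOf* ifc:IfcBuildingElement .
      FILTER NOT EXISTS {
        ?rel a ifc:IfcRelContainedInSpatialStructure ;
             ifc:relatedElements_IfcRelContainedInSpatialStructure ?element ;
             ifc:relatingStructure_IfcRelContainedInSpatialStructure ?storey .
        ?storey a ifc:IfcBuildingStorey .
      }
    }
    """
    for row in g.query(query):
        print("Not assigned to a storey:", row.element)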

Currently SPARQL does not contain the vocabularies or functions required by many use cases in the construction domain. Recently researchers have proposed extending SPARQL with a set of domain-specific functions for the construction world. There are precedents for this strategy: in 2011 the Open Geospatial Consortium (OGC) released GeoSPARQL, a geographic query language for RDF data, which extended SPARQL with geospatial functions that enable spatial queries. The advantage of this approach is that the extended AEC functions are not only compatible with existing SPARQL environments, but can also be used to query building data combined with data from other fields.

But standard SPARQL can only crawl the data graph according to the structure of the BIM data model. There are many use cases that depend on geometric relationships that cannot be handled by off-the-shelf SPARQL. To address this, SPARQL has been extended to include the spatial operators "touches", "intersects", "disjoints", and "contains" as well as other domain-specific functions.

An example of a simple spatial query using a spatial extension is finding all walls which contain or adjoin doors. This requires the spatial function "touches" to find doors that are in or near walls; an illustrative version of the query follows.
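The spt: namespace and function syntax below are a sketch based on the description in this post, not the researchers' exact grammar, and assume a SPARQL engine with the extension functions registered.

    # Illustrative only: assumes an engine (e.g. Jena ARQ) with the
    # spt: extension functions registered; exact syntax may differ.
    walls_touching_doors = """
    PREFIX ifc: <http://ifcowl.openbimstandards.org/IFC2X3_Final#>
    PREFIX spt: <http://example.org/spatial#>
    SELECT DISTINCT ?wall WHERE {
      ?wall a ifc:IfcWall .
      ?door a ifc:IfcDoor .
      FILTER (spt:touches(?wall, ?door))
    }
    """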

As in the case of geospatial data, additional domain-specific functions are required to provide the necessary functionality. For example, functions were added for different ways of measuring distance (spt: denotes the spatial extensions namespace):

spt:distance is a function to return the shortest distance between two products in 3D space.

spt:distanceZ is a function to return the shortest vertical distance between two products.

spt:distanceXY is a function to return the shortest distance between the projections of two products on the XY plane.

Another example is finding all windows that are close to (less than 300 units from) a column.
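Illustratively, under the same assumptions as the previous sketch (hypothetical spt: namespace and syntax):

    # Illustrative only: spt:distance as described above; 300 is in model units.
    windows_near_columns = """
    PREFIX ifc: <http://ifcowl.openbimstandards.org/IFC2X3_Final#>
    PREFIX spt: <http://example.org/spatial#>
    SELECT ?window ?column WHERE {
      ?window a ifc:IfcWindow .
      ?column a ifc:IfcColumn .
      FILTER (spt:distance(?window, ?column) < 300)
    }
    """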

Implementation and testing

As a proof of concept, the extensions were implemented based on the open source Jena ARQ query engine. An objective was to minimize low-level coding to make the function implementations more portable and more transparent for public review.

As a practical example, a case study of building code compliance checking against Section 705.8.4 of the International Building Code (IBC 2006) was tested successfully using a BIM model of a four-floor building.

The researchers report that the main limitations of the open SPARQL-based approach they have proposed relate to the way it was implemented. The implementation is not completely portable and performance could be improved, by integrating existing geometric libraries, for example. Future research will address these issues and investigate more use cases to extend functions for specific domains and to combine more data from different sources in the construction industry.


October 15, 2016

200 countries have agreed in Kigali to reduce emissions of hydrofluorocarbons (HFCs). HFCs are greenhouse gases used as an alternative to ozone-depleting chlorofluorocarbons (CFCs) in foam production, refrigeration, and air conditioning. When it was realized that chlorofluorocarbons were responsible for the ozone hole over Antarctica, nations got together in Montreal in 1987 and agreed to stop using CFCs. CFCs were replaced by hydrofluorocarbons for refrigeration and other applications, and their use has been increasing by 10% per year. However, it was realized that HFCs are far more potent than CO2 in trapping heat and have a lifetime of 14 years in the atmosphere. The Kigali agreement is an amendment to the Montreal Protocol on Substances that Deplete the Ozone Layer. The rapid growth of HFCs in recent years has been driven by growing demand for cooling, particularly in developing countries with a growing middle class and hot climates. The Kigali amendment provides exemptions for countries with high ambient temperatures, allowing them to reduce their use of HFCs at a slower pace. It is estimated that phasing out HFCs could prevent up to 0.5 degrees Celsius of global warming by the end of this century. (UNEP: "Countries agree to curb powerful greenhouse gases in largest climate breakthrough since Paris")

October 14, 2016

Methane has the second-largest global radiative forcing impact of the anthropogenic greenhouse gases, after carbon dioxide. In addition to anthropogenic sources, mainly fossil fuels, livestock and waste, natural methane sources include the biosphere (wetlands, termites, oceans, wildfires, and wild animals), volcanoes and geothermal emissions, and geological seepage, where large quantities of natural gas migrate from shallow or deep rocks and reservoirs to the surface along faults and fractured rocks.

Estimates of emissions have come under increasing scrutiny. A recent study assessed the spatial distribution of anthropogenic methane sources in the United States by combining comprehensive atmospheric methane observations, extensive spatial datasets, and a high-resolution atmospheric transport model. It was concluded that the US Environmental Protection Agency (EPA) underestimates methane emissions nationally by a factor of 1.5.

The concentration of methane in the atmosphere stabilized from about 1999 to 2007, but there is evidence that since 2007 it has begun rising again. A recent study suggests that the more than 30% increase in U.S. methane emissions over the 2002-2014 period could account for 30-60% of the global growth of atmospheric methane seen in the past decade.

In a study just published, the global methane budget and the contribution of the fossil fuel industry to methane emissions have been reevaluated. The ratio of carbon-13 to carbon-12 provides a signature that helps identify the source of methane. Both global methane and methane carbon isotope (carbon-13) records were used to compile what is believed to be the largest isotopic methane source signature database, including fossil fuel, microbial and biomass-burning methane emission sources. Total fossil fuel methane emissions (from the fossil fuel industry plus natural geological seepage) are 60 to 110% greater than current estimates. After accounting for natural geological seepage, methane emissions from natural gas, oil and coal production and their usage are 20 to 60% greater than current estimates.

October 13, 2016

As the electric power industry generates increasing amounts of renewable energy from intermittent sources such as wind and solar PV, the ability to nowcast radiance and wind velocity at km resolution up to 25 minutes in advance has become essential to balancing electricity demand and supply. This is the newest area where geospatial technology (mapping wind at different altitudes, radiance, and temperature) is becoming essential. On November 4 of this year NOAA and NASA will launch the GOES-R geostationary weather satellite, which will provide support for nowcasting across the continental U.S.

GOES-R's Advanced Baseline Imager (ABI) will collect three times more data and provide four times better resolution and more than five times faster coverage than the current GOES. The GOES-R series satellites will also carry the first lightning mapper flown from geostationary orbit. The Geostationary Lightning Mapper, or GLM, will map total lightning (in-cloud and cloud-to-ground) continuously over the Americas and adjacent ocean regions. GOES-R will provide coverage over a 1000 x 1000 km box with a temporal resolution of 30 seconds and spatial resolution of 0.5 to 2 km. It will completely scan the continental U.S. every 5 minutes, covering a 5000 km (E/W) by 3000 km (N/S) rectangle over the United States.

GOES-R enables "nowcasting" of severe storms across the continental United States. The GLM will be the first operational lightning mapper in geostationary orbit, covering land and oceans. Total lightning (in-cloud, cloud-to-cloud, and cloud-to-ground) information from GOES-R will increase lead time for severe storm warnings because it will make it possible to observe areas of severe weather every 30-60 seconds, enabling more reliable warnings and evacuations.