November 24, 2016

At the Year in Infrastructure 2016 conference in London, the Be Inspired Awards attracted over 300 submissions. Many of these were impressive in how they applied digital technology to designing, constructing and maintaining buildings and infrastructure. The winner of this year's Power Generation sector award was a remarkable full-lifecycle BIM project for a hydropower project in a remote part of Yunnan, China.

Longkaikou hydropower station is located in Heqing County, Dali Autonomous Prefecture, Yunnan Province. The primary focus of the project is generating electricity, but it also supports irrigation, water supply and flood control. The dam is a concrete gravity dam with five turbine-generator units and a total installed capacity of 1,800 MW. Construction began in 2007 and the reservoir began to fill in 2012. By January 2014, all power generation units were fully operational.

The Longkaikou hydropower station is the first hydropower station in China to fully employ 3D digital design technology from the start of the project. Design of the power plant began in 2005. More than 20 different disciplines were involved, including geology, hydraulics, plant, water diversion, construction, road and bridge, metal structure, hydro-machinery, electrical, HVAC, and drainage.

PowerChina Huadong Engineering Corporation (Huadong Institute) was responsible for the design of the project. It was the first institute in China's hydropower industry to migrate from traditional 2D CAD design to 3D collaborative design. The project employed digital models throughout the lifecycle: survey, design, construction and operation were all digital. This is considered a revolutionary technical innovation in the Chinese hydropower industry.

3D digital design greatly improved the efficiency of the design process by enabling collaborative design. Clashes between designs from different design teams were greatly reduced and product quality was significantly improved. For example, equipment clashes were reduced by 95% and the amount of on-site design change was reduced by over 80%. In the design of some equipment systems and the preparation of drawings, design time was reduced by more than 50%.

The efficiency of collaboration between disciplines was significantly improved. Being able to view designs from different design teams in one unified model greatly reduces the time for review and signoff. In the past, it usually took more than a month to review and sign off a power plant's layout drawings. With 3D modeling, collaboration between disciplines improved design efficiency by over 40%.

The digital design platform made it possible to assess alternative design scenarios in order to optimize designs. The dam's foundation is unique and complex and required innovative engineering because there was no previous model to follow. Using a 3D model of the geology together with a digital terrain model made it possible to design and compare alternative construction scenarios to optimize the design. It is estimated that optimization of the dam's foundation surface construction based on the 3D geological model saved about CNY 194 million in engineering costs. As a result of design optimization, the project was completed about 10 months ahead of schedule, creating economic benefits of up to CNY 1.79 billion.

With the use of 3D digital design and geo-coordination, the Longkaikou Hydropower Station project not only enjoyed a shortened design cycle, improved project and service quality, and reduced construction costs, but also provided the basis for digital operation and maintenance of the facility by the owner. Geo-coordination, a term coined by Bentley, is key to integrating data from many different sources, including reality models, 3D building models and associated information. Bentley recognized the value of geo-coordination very early on: for the last 12 years, every infrastructure asset in Bentley applications has carried geo-coordination in the form of real-world x, y, z coordinates. It is recognized as essential for full-lifecycle (design, build, maintain and operate) BIM. In this project it was fundamental not only for integrating the dam and hydropower plant design with the digital terrain model, but also for making it possible to reuse the BIM model and associated data during operation and maintenance.
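
As a minimal sketch of what geo-coordination means in practice, the example below attaches real-world coordinates to model elements and reprojects them so they can be overlaid on terrain models, point clouds or maps. It assumes the pyproj library; the element IDs, coordinates and UTM zone (47N, which covers northwestern Yunnan) are illustrative, not data from the project.

```python
# A minimal sketch of geo-coordination: each model element carries real-world
# x, y, z coordinates, so data from different sources can be combined.
# Assumes pyproj is installed; element data and CRS choice are illustrative.
from pyproj import Transformer

# Hypothetical BIM elements georeferenced in UTM zone 47N (covers NW Yunnan)
elements = [
    {"id": "dam-monolith-07", "x": 610_250.0, "y": 2_935_400.0, "z": 1_298.0},
    {"id": "penstock-03",     "x": 610_312.5, "y": 2_935_366.2, "z": 1_251.5},
]

# Reproject to longitude/latitude (WGS84) to overlay the model on a terrain
# model, a point cloud, or a web map that uses geographic coordinates.
to_wgs84 = Transformer.from_crs("EPSG:32647", "EPSG:4326", always_xy=True)
for e in elements:
    lon, lat = to_wgs84.transform(e["x"], e["y"])
    print(f"{e['id']}: lon={lon:.6f}, lat={lat:.6f}, elev={e['z']} m")
```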

The design was praised by both the construction contractor and the owner. It enhanced the core competitiveness and reputation of the Huadong Institute, establishing a strong foundation for winning future hydropower projects.

November 23, 2016

At the Year in Infrastructure 2016 conference in London, the Be Inspired Awards attracted over 300 submissions. Many of these were impressive in how they applied digital technology to designing, constructing and maintaining buildings and infrastructure. The winner of this year's Utilities and Communications sector award was a remarkable full-lifecycle BIM project for a new substation in downtown Wuhan, a major city in central China.

The Miaoshan 220kV Secondary Transformer Substation is a newly built large indoor substation in Dongxihu District, Wuhan City, Hubei Province. The project integrated digital tools for construction, site preparation, mechanical, electrical, and protection design. The 3D digital modeling approach enabled collaborative design of a space-restricted, multi-voltage indoor substation in a very congested urban area, with minimal impact on the existing buildings and infrastructure, all within a tight timeframe.

The project faced severe challenges. First, it was located in a congested area of downtown Wuhan and the substation had to be very compact to minimize its impact on the surrounding buildings and infrastructure. Reality modeling of the existing environment was essential because nearby buildings had to be taken into account in designing the site layout, including feeder cables. Second, the substation is an indoor station composed of a building complex with two floors above ground and one basement floor. Because of the severely restricted space there was a high risk of clashes in a multi-discipline design environment, easily leading to different teams placing equipment in the same space. Third, the substation was intended to handle multiple voltages, including 220kV, 110kV, and 10kV cables, each handled by a different design team. Because many cables had to be routed into and out of the building in very restricted space, the risk of collisions and intersections among power control cables of different voltage classes was high.

3D modeling was essential for this project. Because of the high risk of clashes and collisions between the designs from different design teams, a shared 3D model was needed to ensure a collaborative approach to design. The project used several different digital tools across disciplines, including electrical, construction, structure, general drawing, irrigation works, heating, ventilation, and site preparation. The electrical team used a substation design tool for the design of main wiring, electrical equipment layout, stress calculation and arrangement of wires, lightning protection and grounding, lighting, and the layout of cables and supports. The construction team used a building design tool to design walls, doors, windows, stairs, the roof and louvers. The structure team used building design and structural engineering tools to design floors and columns. The master plan team used building design and site preparation tools to design the gate, enclosure walls, access road and the building site. The irrigation works, heating and ventilation teams used a digital building performance tool to complete air conditioning, axial flow fan, and water and wastewater pipe designs.

The digital modeling approach made it possible for people from different disciplines to work in a collaborative design environment within a unified model space, which greatly reduced clashes between different designs. One of the important benefits of the 3D model approach was that it minimized the impact of the substation on the existing infrastructure. Since the transformer substation is located in downtown Wuhan where buildings are dense, reality capture was used to model the area surrounding the substation. This helped ensure the substation site design did not conflict with surrounding buildings. A major benefit of the 3D model approach was that it made it possible to design a substation that fulfilled the requirements while avoiding the demolition of four nearby residential apartment buildings. To enable combining the model, the point cloud data, the digital terrain model and other external data, everything was geolocated (geo-coordinated).
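
To make the clash-reduction idea concrete, here is a minimal sketch of geometric clash detection using axis-aligned bounding boxes, the cheapest first-pass test. Production BIM tools check exact solid geometry; the element names and extents below are assumptions for illustration.

```python
# A minimal sketch of clash detection between discipline models, using
# axis-aligned bounding boxes. Element extents (metres) are illustrative.
def boxes_clash(a, b):
    """True if two (xmin, ymin, zmin, xmax, ymax, zmax) boxes overlap."""
    return all(a[i] < b[i + 3] and b[i] < a[i + 3] for i in range(3))

electrical = {"cable-tray-3": (0.0, 0.0, 3.0, 6.0, 0.4, 3.4)}
hvac = {"duct-12": (4.0, 0.2, 3.2, 10.0, 0.8, 3.8)}

for e_id, e_box in electrical.items():
    for h_id, h_box in hvac.items():
        if boxes_clash(e_box, h_box):
            print(f"clash: {e_id} intersects {h_id}")  # flag for redesign
```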

Geo-coordination, a term coined by Bentley, is key to integrating data from many different sources, including reality models, 3D building models and associated information. Bentley recognized the value of geo-coordination very early on: for the last 12 years, every infrastructure asset in Bentley applications has carried geo-coordination in the form of real-world x, y, z coordinates. Geo-coordination enables engineers and facilities managers to find things that are tagged, related, or close to a building element, and it makes it possible to navigate the information environment, not only 3D design models (virtual reality) but also 3D models in a real-world context (augmented reality). It is recognized as essential for full-lifecycle (design, build, maintain and operate) BIM. In this project it was fundamental not only for integrating the substation design with its real-world context, but also for making it possible to reuse the BIM model and associated data during operation and maintenance.

The digital design approach reduced the project design time by 20 working days, verification time by 30 working days, and saved about CNY 50 thousand in design costs. According to project statistics, the collaborative approach enabled by digital modeling avoided more than ten potential cases of rework during construction, saving CNY 2 million. During the project review by external evaluators, the 3D digital model made it easier for the evaluators to understand the design intent, which played a decisive role in ensuring adequate communication of the design and implementation. This reduced the number of meetings required during the review. A side effect was that it enhanced the reputation of the Design Institute.

During construction, mobile devices were used in the field to convey details of cable placement to the construction unit. In addition, models showing the intended location of cables were converted into 3D PDF documents which were used in the field to ensure that the construction unit understood the designed cable placement. The result was that construction went smoothly, reducing construction time by 15 days.

Some of the key benefits were that reality modeling of the neighborhood enabled planning of the station site layout as well as site access that minimized impact on the surrounding buildings. The project's use of a 3D digital model and associated software made it possible for all disciplines to work collaboratively, which significantly reduced clashes between designs from different disciplines. 3D modeling also made it possible to plan the arrangement of the various types of cables at a very detailed level, which helped avoid crossovers and collisions, reducing fire risk and enhancing safety.

Perhaps most importantly for the total cost of the project, including operations and maintenance, the model and associated data were brought into an FM tool that makes it possible to efficiently manage plant data, technical specifications, operation and maintenance manuals and other detailed information for operation and maintenance management of the substation after construction.

November 17, 2016

The International Energy Agency (IEA) has issued its annual World Energy Outlook 2016. The IEA has examined several scenarios. All the Paris COP21 climate pledges from 190 countries have been incorporated into its main scenario. (It does not include any assessment of what the Trump administration may do with respect to the COP21 commitments made by the world's second largest emitter.) The IEA projects that this scenario will actually result in 2.7 °C of warming. A scenario called the 450 Scenario, with a 50% probability of limiting global warming to 2 °C, was also examined.

In its main scenario, which includes emission reductions from COP21 pledges, the IEA projects a 30% rise in global energy demand to 2040. By 2040, 60% of all new power generation capacity is projected to come from renewables, and renewables are projected to represent 37% of all power generation. By then most renewables-based generation is projected to be competitive without subsidies: solar PV is expected to see a further 40-70% reduction in cost by 2040 and onshore wind an additional 10-25%. However, fossil fuel generation continues to be important. Natural gas consumption is projected to rise by 50%, growth in coal use essentially stops, and growth in nuclear power is limited to China.

Energy-related CO2 emissions plateaued in 2015, mainly because of a 1.8% improvement in the energy intensity of the global economy. This is partly because a growing proportion of the $1.8 trillion currently invested in energy annually is directed to renewables. But the IEA projects that even with the COP21 commitments, energy-related emissions will continue to grow by 0.5% per year.

It is projected that a cumulative $44 trillion in investment will be needed in the global energy supply over the next 15 years. 60% of this will go to oil, gas and coal extraction and supply, including power plants using these fuels, and nearly 20% to renewable energy. Another $23 trillion will be required for energy efficiency improvements. For comparison, over the last 15 years 70% of total energy supply investment went to fossil fuels. This scenario requires a significant reallocation of capital from fossil fuels to renewables, aided by dropping fossil fuel subsidies: in 2015 the value of fossil-fuel consumption subsidies dropped to $325 billion, down from $500 billion the previous year.

The IEA projects that the COP21 commitments will slow the projected rise in global energy-related CO2 emissions. However, it does not believe that this is enough to limit warming to less than 2 °C. The IEA's 450 Scenario, which is based on a 50% probability of limiting warming to under 2 °C, requires a further major reallocation of the investment capital going to the energy sector. A greater proportion of the $40 trillion in cumulative energy supply investment has to move away from fossil fuels and towards renewables and other low-carbon investments such as nuclear and carbon capture and storage, so that the share going to fossil fuels drops to about a third. In addition, $35 trillion is needed for improvements in energy efficiency, which is $12 trillion more than in the main scenario. The 450 Scenario implies that before the end of this century the energy sector must become carbon-neutral: all residual emissions from fuel combustion are either captured and stored, or offset by technologies that remove carbon from the atmosphere.

In the U.S., electricity regulation is the responsibility of the states and there is wide variation in regulation of the emerging electric power markets. It is apparent to virtually all state regulators that, under the impetus of distributed energy, the utility industry is having to change its business model. Because the 3,500 or so utilities are trying to meet the challenge of distributed energy with many different business models, many state regulators are finding it a challenge to create a new regulatory framework for this new energy world.

In the 1990s and early 2000s a number of states decided to deregulate their electric power markets to encourage competition. Typically this meant requiring incumbent utilities to divest their generation facilities and become regulated transmission and distribution entities. Deregulation encouraged independent power retailers to enter the energy market. It was hoped that deregulation would reduce costs and provide consumer choice. In restructured states utilities are barred from owning power plants so that a competitive energy market could emerge. Private energy providers have argued that the same rules should apply to distributed energy, so that utilities do not use their market power to dominate the renewable energy market.

According to the Edison Electric Institute in 1995 generation accounted for about two-thirds of the price of electricity. Today, the cost of generation is less than half of the price of electricity. In many parts of the country the wholesale price of electricity has dropped significantly while the retail price has increased - in other words, electricity is cheap, the grid is expensive. This has resulted in disintermediation where large industrial and commercial customers in states with high regulated rates have shifted their power purchases to lower-cost energy providers or have decided to self-generate.

There is a lot of diversity among the alternative business models and regulation that are being adopted by the 3,500 utilities in the U.S. At the SGIP conference, California, New York, and Texas were the states that were most often mentioned.

California

Utility-owned photovoltaic installations account for 52 percent of the solar capacity installed in the United States, according to the Solar Energy Industries Association. In California, Southern California Edison (SCE) and PG&E have invested in utility-scale solar power plants, but they buy most of their renewable power from plants owned by private investors. For example, the Ivanpah solar plant is owned by private investors including Google, and both PG&E and SCE have long-term commitments to buy the plant's power. California also encourages consumer rooftop solar through a net metering program that pays retail rates for solar PV power generated by consumers.

Sacramento Municipal Utilities District (SMUD) is one of the most forward-looking utilities in North America. It was the first California utility to reach 20% renewable power and the first to commit to 33% renewables. SMUD is moving from a centralized utility with a business model based on selling electricity to a distributed utility providing localized grid services. SMUD is getting out of the business of selling electricity, and into the business of selling grid services.

One of the most interesting innovations at SMUD is in the area of rate structure. SMUD and other utilities are in the unusual position for a retailer of trying to sell less of their product, in this case electricity. People are using less electricity as a result of personal interest as well as SMUD's own energy efficiency programs. But the money has to come from somewhere. SMUD has introduced a flat infrastructure fee. Currently every customer pays $18/month for the grid, independent of how much power they consume. SMUD has determined that $28 is the breakeven point, where the cost of maintaining the grid would be covered by infrastructure fees, and is moving towards that monthly charge.

Another interesting innovation at SMUD is its solar power program. Customers have the option to buy into solar power without having to install solar panels on their roofs. The first solar program was a 1 MW solar PV program designed for residential customers. The next was the recently opened 11 MW program for commercial customers. The new solar plant at Rancho Seco Solar Power Plant is owned by D. E. Shaw Renewable Investments. Power purchased from the project will provide energy for SMUD's commercial SolarShares® program. The SolarShares program is designed to provide business customers with a low-cost solar alternative to installing solar panels onsite.

Texas

In 1999, the Texas Legislature began restructuring the state's electric industry, allowing consumers to choose their retail electricity provider. As of December 31, 2001, deregulation of the ERCOT electricity markets required investor-owned utilities to unbundle their operations, and the provision of service to end-use retail customers became competitive. There are over 100 retail electricity providers, of which more than a quarter offer a 100% renewable plan to their customers as an option. Since 2002, approximately 85% of commercial and industrial consumers have switched power providers at least once, and approximately 40% of residential consumers in deregulated areas have switched from the former incumbent provider to a competitive retail energy provider. One 2011 study found that Texas customers paid up to one-third less in 2010 than in 2001, before competition and energy deregulation.

In 2007 TXU, the largest utility in the state, was acquired by a consortium and broken up into an electricity distribution company (Oncor Electric Delivery), a power generator (Luminant), and a retail electricity provider (TXU Energy) without any distribution or generation assets.

At the SGIP conference Cheryl Mele, Senior Vice President and Chief Operating Officer at ERCOT, pointed out that wind penetration has at times provided 48% of Texas power demand. Over the past year wind provided 12% of all power generation. Some electric utilities in Texas, most of which participate in the ERCOT grid, invest in renewable energy generation. For example, NRG Energy, an integrated energy company, wholly or partially owns 1.1 GW of wind and solar power.

New York

The state regulator that is getting the most attention is the New York Public Service Commission (PSC). In the United States, a number of states have restructured their electric power industry with the goal of opening what had been a monopoly market to competition. In New York, the PSC has been aggressive in changing regulation ("Reforming the Energy Vision" or REV) to promote a competitive electric power market.

The PSC has taken a major step toward a future of transactive energy. New York's "Reforming the Energy Vision" (REV) is moving the state's electric utility industry toward a radical redefinition of utilities as we have known them over the hundred years since Tesla and Edison created the electric power industry. In the future, the New York utility industry will be composed of distributed system platform (DSP) providers, which basically provide the grid but do not directly sell energy, and a competitive energy market with consumers and many energy providers, including bulk power generators and distributed energy generators such as you and me with rooftop solar panels. This model has similarities with the UK model, which is disrupting the traditional energy market in the UK.

In many of these states, in response to policies promulgated by state regulators, utilities are increasingly looking at "non-wires alternatives" to address system overloads. Non-wires alternatives include distributed energy resources (DER), demand response, energy efficiency and other non-traditional means of addressing system overloads instead of building conventional transmission and distribution infrastructure. State regulators are asking utilities to use non-traditional solutions to address grid problems, and in some cases utilities are asking the competitive market to come up with ideas.

For example, Consolidated Edison (ConEd) and New York State Electric & Gas (NYSEG) circulated solicitations for non-wires alternatives. ConEd is interested in demonstration projects to determine whether utilities can derive significant market-based benefits from integrating energy storage (utility-scale batteries) onto their electric systems. ConEd has suggested that storage could reduce operating expenses, reduce peak demand for generation, manage distributed energy resources, or provide customer benefits, such as power quality or reduced risk of outages. NYSEG has issued a request for proposals for non-wires alternatives including distributed generation, demand response, energy efficiency, and energy storage that can reduce peak loading on a transformer bank. The utility hopes to defer an expensive upgrade to one of its substations. In both cases the utilities have excluded fossil fuel proposals, except those that use natural gas or propane.

But if these solicitations involve distributed energy or storage, the PSC will only allow utilities to own distributed energy resources under very restrictive circumstances:

The utility issues a solicitation, but the market response is inadequate or too expensive

A project consists of energy storage on the utility's property and integrated into the utility's distribution system

A project will enable low or moderate income residential customers to benefit from the resource where markets are not likely to satisfy the need

A project is being sponsored for demonstration purposes.

For example, ConEd had an electrical overload problem in Brooklyn and Queens. Instead of building a new substation estimated to cost $1 billion, it proposed a non-wires solution fostering the development of local energy sources, demand management and energy efficiency. As part of the solution, ConEd planned to install battery storage on its own property. In this special case the commission approved ConEd's ownership of the storage because it would be on the utility's property and integrated into its system. But it reiterated that it will only allow utilities to own distributed energy resources when the open market does not offer a solution.

Midwest

In Illinois, the 1997 Customer Choice Act mandated that state utilities could no longer own generation facilities. During the Mandatory Transition Period (1997–2007), utilities were required to sell their electricity generation assets to other energy companies; utilities no longer sold power and became responsible only for delivering electricity. Since 2002, customers in the deregulated market have been permitted to buy electricity from competing retail electric providers. In 2006 the Retail Electric Competition Act encouraged residential and small business customers to switch to alternative electric providers.

In July 1999, Ohio restructured its energy market to give consumers a choice of energy provider. The law took effect on January 1, 2001. Customers could now choose to buy energy from retail electric suppliers instead of automatically receiving it from the utility company in their area (primarily AEP Ohio, Dayton Power & Light, Duke Energy, and FirstEnergy). In May 2008, each incumbent utility was required to shed its power generation operations and become a purely electric distribution utility.

FirstEnergy, headquartered in Ohio and one of the largest investor-owned electric utilities in the U.S., with customers in Ohio, Pennsylvania, West Virginia, Maryland, New Jersey, and New York, has just announced a strategic decision to exit the competitive power business, not just in Ohio but in all its service territories. FirstEnergy is getting out of the business of generating and selling power and plans to become a purely regulated transmission and distribution entity.

Southeast

In the Southeast, Duke Energy is a vertically integrated energy company with generation, transmission and distribution assets. Duke Energy owns, operates or maintains renewable energy assets in its regulated service territories and through Duke Energy Renewables.

Since 2012, the Southern Company system has added or announced more than 4 GW of renewable generation. Southern Power owns more than 1,600 megawatts of solar generating capacity at 26 facilities operating or under construction in California, Georgia, Nevada, New Mexico, North Carolina and Texas. Seventeen of these facilities are co-owned by third parties, with Southern Power holding majority ownership. Georgia Power, a Southern Company subsidiary, is a national leader in solar energy and began offering rooftop solar to customers in July 2015. Alabama Power, another Southern Company subsidiary, is currently constructing two 10.6-megawatt solar facilities.

Florida has had a law for nearly a century that gives utilities a regional monopoly on power production. In some states, solar companies like SolarCity can install panels for free and then sell the power to the home or business owner at a rate lower than the local utility's. These third-party sales are generally illegal under the Florida law that gives utility companies a local monopoly on supplying power.

Nevada

The Nevada Public Utility Commission has just re-established its previous, favorable rates for the approximately 32,000 customers in Nevada who already own rooftop solar panels. These solar customers had been facing a significant rate hike. However, new solar customers will face the new rates that took effect in December. SolarCity and other solar providers have said that the new rates make it uneconomic for them to do business in the state. NV Energy is developing its own solar energy sources, often together with large customers who have requested solar energy. Three casinos opted to leave the grid, but the exit fees for leaving the NV Energy grid are so exorbitant that one of the casinos (Wynn) is contesting the fee in court. Another casino opted to collaborate with NV Energy on the development of a solar PV farm. The third casino (MGM) has opted to pay $86 million to leave the grid, relying on its own huge rooftop solar array on the Mandalay Bay and buying the remainder of its power needs on the open market.

Summary

It should be clear that there are many different approaches to creating, regulating and adopting a new business model among the 3,500 or so electric power utilities in the U.S. A consensus on the way forward for U.S. utilities has not been reached, and it appears that different regions will likely end up with different business models and different regulatory policies. It is clear that whatever business model and regulatory framework are ultimately adopted in different regions, they all will have to address the big issues: increasing customer choice; a growing array of consumer, commercial and industrial applications (hardware and software) that interact with the grid; distributed energy; microgrids, which may or may not be connected to the grid; and transactive energy. Gartner has forecast that by 2020 the largest electric power utility will be an Uber-like entity that will neither own infrastructure nor generate electricity, but will manage energy flows by putting energy producers together with energy consumers.

November 09, 2016

Anne Pramaggiore, President and CEO of ComEd of Chicago, was the first at the Smart Grid Interoperability Panel 2016 Grid Modernization Summit in Washington, DC to call the current business transformation that the electric power industry is experiencing a revolution, but many other speakers agreed with this characterization. Frequently mentioned were the regulatory and utility business transformations occurring in California and New York (REV). Sacramento Municipal Utility District (SMUD) was mentioned as an example of a utility that is aggressively driving change in the utility business model.

Robert Wilhite of Navigant Research pointed out that their recent research had found that 90% of utilities recognize that utility business models have to change. But there are different opinions about the urgency of this transformation. Navigant found that about half believe that it has to happen immediately, while the remainder believe it can happen later.

Of course none of this would be possible without the enabling technology of the smart grid. Dave Velazquez, President and CEO of Pepco, now part of Exelon, recalled that the first smart grid development at Pepco, by no means a laggard in adopting new technology, occurred only a decade ago.

Distributed energy

Virtually every speaker singled out distributed energy resources (DER), especially solar PV, as the main driver forcing the business transformation of the utility industry. Originally seen as a response to a warming climate, solar PV has achieved grid parity in several parts of the country and is seen not only as a choice of green energy source but increasingly as a less expensive alternative to the local utility. The price drop has been so substantial that solar PV is achieving grid parity in many parts of North America even at a time of unusually low natural gas prices, prices that are responsible for closing many coal-fired generation plants. Another important contributor to the revolution is the electrification of transportation, not only EVs but also buses, rail and even trucks.

But none of this could have happened without the enabling platform of the smart grid. The conjunction of technological advances in packet-switched wired and wireless communications networks and intelligent electronic devices underlies the DER advances that have created choice for customers. As we have seen previously in other industries such as telecom, consumers have grasped the opportunity and are now beginning to drive changes in the energy market.

Microgrids

Another topic that came up repeatedly was microgrids. Duke Energy has been running a microgrid pilot for several years, from which emerged the OpenFMB standard, with an OpenFMB working group to support the effort. ComEd has five microgrid pilots and is starting the country's first microgrid cluster. Pepco has two microgrid projects underway.

At DistribuTECH 2016 and at this SGIP Summit, an implementation of OpenFMB was demonstrated live by Green Energy Corp. It was also announced at the summit that the source code is now available for download from the openFMB.io site.

In addition, a number of vendors were offering solutions for managing microgrids, including the KEPCO Consortium from Korea, which has built a comprehensive microgrid solution.

Geospatial

Although this was not a geospatially focussed conference, GIS and geospatial technology came up in several areas.

Chandu Visweswariah, an IBM Fellow, outlined in three points the challenge of transforming the grid. First, several trends are growing exponentially, among them the number of sensors, the volume of data, and renewable energy sources. Second, we are increasingly relying on intermittent energy sources like wind and solar PV. Third, this is not your grandmother's software: it provides a platform that supports sophisticated mathematical algorithms. Among the most important of these are algorithms that nowcast and forecast weather at very high resolution, making it possible to predict wind velocity not at ground level but at the elevation of the turbines and at the location of the wind farm, and similarly insolation for solar PV. These wind and insolation maps are effectively the generation forecast minutes and hours in advance.
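
To illustrate why hub-height wind matters, here is a minimal sketch that extrapolates a near-surface wind forecast to hub height with the standard power-law profile and converts it to output with an illustrative turbine power curve. The shear exponent and turbine parameters are assumptions, not values from any particular forecasting system.

```python
# A minimal sketch: extrapolate a 10 m wind forecast to hub height with the
# power-law profile, then convert to output with an illustrative power curve.
def hub_height_speed(v_ref, h_ref=10.0, h_hub=100.0, alpha=0.14):
    """Power-law extrapolation of wind speed from h_ref to h_hub (metres)."""
    return v_ref * (h_hub / h_ref) ** alpha

def turbine_power_mw(v, cut_in=3.0, rated_speed=12.0, cut_out=25.0, rated_mw=2.0):
    """Piecewise power curve: zero outside cut-in/cut-out, cubic ramp to rated."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_speed:
        return rated_mw
    return rated_mw * ((v - cut_in) / (rated_speed - cut_in)) ** 3

v10 = 6.5                      # forecast wind speed at 10 m, in m/s
v_hub = hub_height_speed(v10)  # roughly 9.0 m/s at a 100 m hub
print(f"{v_hub:.1f} m/s at hub height -> {turbine_power_mw(v_hub):.2f} MW per turbine")
```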

The KEPCO Consortium demonstrated its μ-Grid platform. KEPCO has already partnered with PowerStream in Ontario to provide a utility-scale microgrid. Earlier this year PowerStream and KEPCO officially launched a utility-scale microgrid with the capacity to provide several hours of backup power for approximately 400 customers residing or owning a business in Penetanguishene. The microgrid is helping to reinforce the existing grid by increasing resiliency and operational flexibility. The platform provides a GIS Studio which allows 35 types of power facilities, such as transformers, to be visualized and edited on a map, either on imagery from satellites or overflights or on a topographic map.

Another application of geolocation is in the area of cybersecurity, where location is used by advanced cybersecurity applications to verify identity.

Distributed intelligence

Several speakers pointed out that the volume of data has simply become too large for a centralized IT system that collects all the data at a central control point where analysis and visualization are performed. Pushing intelligence out into the field reduces the data volume that has to be handled at the central station and reduces the data flow over the network.
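
A minimal sketch of the idea: a field device summarizes raw samples locally and ships only the summaries upstream. The window size and statistics are illustrative assumptions.

```python
# A minimal sketch of distributed intelligence at the edge: a field device
# reduces one reading per second to one summary per minute before sending
# anything to the control center.
from statistics import mean

def summarize(readings, window=60):
    """Collapse per-second samples into per-window min/max/average records."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({"min": min(chunk), "max": max(chunk),
                          "avg": round(mean(chunk), 2)})
    return summaries  # ~60x fewer records cross the network

raw = [230.0 + 0.1 * (i % 7) for i in range(300)]  # 5 minutes of voltage samples
print(summarize(raw))  # 5 summary records instead of 300 raw readings
```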

Standards

Orange Button

Based on the very successful Green Button program, an important standards initiative is Orange Button. It was initiated by the Department of Energy (DOE) with the objective of making it easier for investment to flow into solar power by standardizing electronic reporting of solar installation performance. Orange Button targets a reduction in soft costs by streamlining the collection, security, management, exchange, and monetization of solar datasets across the solar value chain. It is expected that creating an industry-driven standardized data landscape will facilitate the growth and expansion of distributed solar. As the solar market continues to expand, it is critical that the collection, management, and exchange of solar datasets, especially those that affect the bankability (the ability to secure loans for solar installations) of solar assets, are coordinated and streamlined to protect consumers, increase pricing efficiency, and support new and existing businesses entering the solar marketplace. In April of this year the Department of Energy announced four awardees (kWh Analytics, National Renewable Energy Laboratory, Smart Grid Interoperability Panel, and SunSpec Alliance) that will collaborate on the overall goal of standardizing the way solar data is collected and exchanged. SGIP was chosen to lead the effort.

At SGIP the effort is being led and coordinated by Dixon Wright of Wells Fargo, the only person I was aware of at the SGIP event representing a financial institution. I think the presence of a major financial player such as Wells Fargo is a strong indication that solar is moving into the big leagues of financing. Dixon is passionate about the Orange Button initiative and believes that once a draft standard and implementation become available and a few users champion it, it will catch on rapidly with the large financial institutions and the solar industry.

OpenFMB

In March 2015, the Smart Grid Interoperability Panel (SGIP) formally kicked off its OpenFMB effort to foster field interoperability, drawing upon Duke Energy's previous Coalition of the Willing I and Distributed Intelligence Platform work. By the 2015 SGIP Annual Conference in November, three microgrid use cases had been developed and demonstrated: optimization, unscheduled islanding, and grid reconnection.

In February 2016, the first OpenFMB reference implementation, utilizing SGIP's microgrid use cases at Duke Energy's Mount Holly microgrid test center, was demonstrated at the 2016 DistribuTECH Conference by Duke Energy's Coalition of the Willing II (COW-II) vendor partners. For the annual SGIP Grid Modernization Summit in November 2016, the SGIP OpenFMB task force published a series of related foundational use cases that center on Distributed Energy Resources (DER) Circuit Segment Management for the active coordination of power systems equipment with DER, including a microgrid. Green Energy Corp demonstrated an OpenFMB implementation connecting a solar PV installation with other equipment and applications. This is an open source project, and it was announced at the SGIP event that the source code is now available and can be downloaded from openfmb.io. There is a free webinar replay, "SGIP Grid Modernization Summit OpenFMB Demo Preview."
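
As a rough illustration of the publish/subscribe pattern that OpenFMB-style field devices use, here is a minimal sketch over MQTT with the paho-mqtt library. The broker address, topic name, and payload fields are assumptions for illustration, not the OpenFMB data model.

```python
# A rough sketch of publish/subscribe on a field message bus, using MQTT
# via the paho-mqtt library (1.x style API). Topics and payloads are
# illustrative, not the OpenFMB specification.
import json
import time
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    print(f"{msg.topic}: {reading}")  # e.g. a controller reacting to PV output

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)     # field bus broker (assumed address)
client.subscribe("microgrid/pv/reading")
client.loop_start()

# An adapter for a PV inverter might publish readings like this:
client.publish("microgrid/pv/reading",
               json.dumps({"device": "pv-inverter-1", "kw": 42.7}))
time.sleep(1)                         # let the subscriber receive the message
client.loop_stop()
```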

Interoperability

Several speakers were on hand from the Department of Energy's Grid Modernization labs. Since this was an SGIP event, they focussed on the research underway at 13 federal labs and 150 partner organizations that involves interoperability. First, research is underway to find a way to measure interoperability. Second, there is an important initiative, underwritten by several million dollars of financing, to determine how to value interoperability. If we can quantify the contribution interoperability makes to business efficiency, by enhancing the opportunity for competitive products and by creating standards-based platforms into which intelligent products from different vendors, whether smart meters, synchrophasors, reclosers, or substation components, can plug and play, ultimately reducing prices for utilities, then it will be easier to justify and prioritize greater investment in the standards and implementations that provide interoperability.

October 13, 2016

As the electric power industry generates increasing amounts of renewable energy from intermittent sources such as wind and solar PV, the ability to nowcast radiance and wind velocity at km resolution up to 25 minutes in advance has become essential to balancing electricity demand and supply. This is the newest area where geospatial technology (mapping of wind at different altitudes, radiance, and temperature) is becoming essential. On November 4 of this year NOAA and NASA will launch the GOES-R geostationary weather satellite, which will provide support for nowcasting across the continental U.S.

GOES-R's Advanced Baseline Imager (ABI) will collect three times more data and provide four times better resolution and more than five times faster coverage than the current GOES satellites. The GOES-R series satellites will also carry the first lightning mapper flown in geostationary orbit. The Geostationary Lightning Mapper (GLM) will map total lightning (in-cloud and cloud-to-ground) continuously over the Americas and adjacent ocean regions. GOES-R will provide coverage of a 1000 x 1000 km box with a temporal resolution of 30 seconds and a spatial resolution of 0.5 to 2 km. It will completely scan the continental U.S. every 5 minutes, covering a 5000 km (E/W) by 3000 km (N/S) rectangle over the United States.

GOES-R enables "nowcasting" of severe storms across the continental United States, covering both land and oceans. Total lightning (in-cloud, cloud-to-cloud, and cloud-to-ground) information from GOES-R will increase lead time for severe storm warnings because it will make it possible to observe areas of severe weather every 30-60 seconds. This will enable more reliable warnings and evacuations.

October 12, 2016

There is little doubt that "transactive energy" is in the future of electric power in North America, which already has continental grids, and in the EU with the development of the EU's supergrid. The world is rapidly developing the technology that allows energy flows to be determined by market forces (transactive energy). By analogy with Uber and Airbnb, Gartner has predicted that by 2020 the largest energy company in the world (by market cap) will not own any network (grid) or generation assets; it will manage information about energy sources and consumers. Another option is a decentralized, distributed transaction system that allows generators, from you and me with rooftop solar up to traditional large-scale utility generators, to sell energy to other users without a centralized transaction manager.

Blockchain is a decentralized, distributed database for managing transactions. Blockchains provide a global infrastructure untethered from the stability or permission of governments and institutions. It is the technology underlying Bitcoin; the generalized version is often referred to as blockchain 2.0. Blockchain supports smart contracts. Nasdaq and other financial institutions including banks have been experimenting with and developing applications based on blockchain technology. Nasdaq has opened its blockchain services framework to more than 100 of its market operator clients around the world. The blockchain services are part of what Nasdaq calls its "Core Services" component, which serves as a hub for its financial applications. Late last year an issuer was able to use Nasdaq Linq blockchain technology to successfully complete and record a private securities transaction, the first of its kind using blockchain technology. Chain.com, a Nasdaq Linq client and blockchain developer, documented its issuance of shares to a private investor using Nasdaq's blockchain-enabled technology.
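
For readers new to the technology, here is a minimal sketch of the chained-hash structure that underlies a blockchain ledger. Real systems add distribution, consensus, and digital signatures; the transaction fields below are invented for illustration.

```python
# A minimal sketch of a blockchain's chained-hash structure: each block
# commits to its predecessor, so past records cannot be silently altered.
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block([], prev_hash="0" * 64)
b1 = make_block([{"seller": "pv-owner-a", "buyer": "neighbor-b", "kwh": 5}],
                prev_hash=genesis["hash"])
# Tampering with genesis would change its hash and break the link to b1.
print(b1["prev_hash"] == genesis["hash"])  # True
```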

Earlier this year Nasdaq demonstrated a blockchain service that lets solar power generators sell energy certificates (Nasdaq Explores How Blockchain Could Fuel Solar Energy Market). The project is part of a collaborative effort between Nasdaq and design firm IDEO's CoLab to facilitate partnerships that take a design-based approach to creating human-usable technology. Internet-connected solar panels have been developed with technology provided by Filament, a Nevada-based blockchain startup whose technology allows traditional electronic devices to be connected online. The solar panel is hard-wired into the IoT device through a converter, which enables Nasdaq to measure the wattage the panels are feeding into the grid. Using the API of Nasdaq's blockchain-based private markets platform, anonymous certificates are created which can be sold to anyone who wishes to subsidize solar energy. The cryptographically verified certificate, representing solar power generated in the western U.S., appeared live on the screen in New York City.

Nasdaq is not the only organization exploring blockchain for energy transactions. Brooklyn-based startup LO3 has partnered with Consensus Systems on TransActive Grid, a combination of software and hardware that enables members to buy and sell energy from each other securely and automatically, using smart contracts and the blockchain.

In Vienna, Grid Singularity is using blockchain to support energy-related transactions. "With that we have (almost) everything that is relevant in the new energy world, or soon will be: decentralized energy supply, digitization, disruptive business models, smart contracts, blockchain, and cryptocurrency. As a signpost to the new energy world, we held the first Blockchain Day in Germany on May 23, 2016."

Australian startup Sun Exchange has developed a platform that lets investors back small-scale solar projects and receive monthly dividends.

MIT start-up SolarCoin (SolarChange in the US and EU) pays people with an alternative digital currency for generating solar energy. SolarCoin is based on Bitcoin technology. The original concept for SolarCoin required a central bank; blockchain is decentralized, making a central bank unnecessary. People earn coins as a reward for generating solar energy. People with solar panels on their house receive solar renewable energy certificates from their energy company in return for feeding a megawatt-hour of electricity back into the grid. These certificates are already traded for cash, but if you present them to SolarCoin you will get a coin. SolarCoins are now given in 23 countries. The SolarCoin market cap is currently estimated to be about $2 million.

July 05, 2016

Currently geospatial data and technology are used tactically by electric power utilities for many purposes including outage management, vegetation management, disaster management, renewable energy facility siting, universal electrification planning, asset management, and energy density mapping, to name just a few. But the smart grid is fundamentally transforming the role of geospatial data and technology in electric power utilities. According to a report from Navigant Research “The smart grid is all about situation awareness and effective anticipation of and response to events that might disrupt the performance of the power grid. Since spatial data underlies everything an electric utility does, GIS is the only foundational view that can potentially link every operational activity of an electric utility including design and construction, asset management, workforce management, and outage management as well as supervisory control and data acquisition (SCADA), distribution management systems (DMSs), renewables, and strategy planning.” In other words the smart grid makes geospatial strategic. Location is the basis on which data is organized and spatial analytics is how information is extracted from the data. As Bradley Williams has so succinctly put it, "spatial analytics is becoming a key technology for electric utilities because everything a utility does - customers, assets, and operations - involves location."

At the Future of Canada's Utilities Summit in Toronto I had the opportunity to chat with Alexander Bakulev, VP of Strategy and Assets at METSCO in Mississauga, Ontario, about asset management, location and GIS, and the future of utility IT. Alex has deep experience with asset management at utilities in Ontario, and as VP of Strategy he brings a forward-looking perspective to the changing role of location and GIS in asset management at electric power utilities.

Alex, what type of asset management service does METSCO provide to your customers?

We provide a framework for asset management based on a risk-based asset management methodology. When a decision needs to be made, such as whether to replace an asset or maintain it and run it to failure, those decisions are made by engineers to mitigate the risk of failure as well as for safety and environmental reasons. We enable our customers to translate those risks into dollars so they can compare dollars to dollars and make an economic decision whether to replace or maintain an asset. Of course, this has to be done within a regulatory environment; in Ontario this means PSAB 3150 and ISO 55000.

Can you give a perspective on how advanced your customers are with respect to asset management?

Asset data

We have a maturity hierarchy that helps us understand where a utility company is in managing its assets. When we begin working with a customer, the first thing we look at is their asset data: what they have, in what media, and where it resides. Companies have different levels of maturity in terms of the asset records they have and how they manage them. Usually what we see is that companies may have good inspection data, but the information is stored in a decentralized way, in paper documents, Excel spreadsheets, special lab-testing software files, and so on, rather than in a centralized system. In many cases we have to compile paper documents and files in order to combine it all into a centralized asset management system. But we see that changing. Companies are starting to invest in document management systems and to capture data digitally. They are also collecting data with mobile devices in a way that goes directly to the online ERP system. In addition, newer equipment is smarter and able to report online, so it's a paperless process.

Risk of failure

Once you have the basic information about the condition of each asset, the next step is to try to predict each asset's failure probability. We want to understand the chance that an asset will fail next year, after 5 years, or after 10 years. If the company has historical data on failures, including the type of asset, when it failed and why it failed, we can use that information to statistically predict failures. To do this we first find data about which assets have been replaced. Then we try to determine the reasons why the assets were replaced; often this information is even less available than condition information. With this information we can estimate the expected failure rate for existing assets.
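
One plausible way to make such a statistical prediction, sketched below, is a Weibull survival model conditioned on the asset's current age. The shape and scale parameters are invented; in practice they would be fitted to the utility's historical failure records.

```python
# A sketch of conditional failure probability under a Weibull aging model.
# shape/scale here are invented; real values come from fitted failure history.
import math

def prob_fail_within(age_years, horizon_years, shape=3.5, scale=40.0):
    """P(asset fails within horizon_years, given it has survived to age_years)."""
    def survival(t):
        return math.exp(-((t / scale) ** shape))
    return 1.0 - survival(age_years + horizon_years) / survival(age_years)

# A 30-year-old transformer: chance of failing within 1, 5, and 10 years
for horizon in (1, 5, 10):
    print(f"within {horizon:>2} years: {prob_fail_within(30, horizon):.1%}")
```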

Consequence of failure

Once you know the risk of failure for every asset, the next step is to look at the consequence of failure. We not only look at what is being fed downstream from an asset such as a transformer, but also at the protection scheme on the feeder. For example, when a radial feeder fails, the result is a minimum of a 4 hour outage. If the feeder is a loop design, the outage may only be 30 minutes. If the feeder is automated, the outage could be seconds. And it may not be just power that customers lose; there could also be environmental and safety issues. For example, if a corner pole fails, the consequences are more serious than for a pole in a street line. There are also financial factors. How fast do you need to replace the asset? Do you keep it in stock or order it? For expensive assets, it is very expensive to keep spares in a warehouse.

Monetized risk

Once we know the condition of each asset, the probability that it will fail, and the consequence of failure, we can monetize the risk. With monetized risk, when we have to make a decision, we make an economic decision. Maybe it's better to run to failure; or to increase the frequency of maintenance to watch the asset more closely, accept the risk of failure for the next 5 years, and replace it after that; or, if it is high risk, to replace the asset immediately.
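
As a hedged sketch of how such a comparison might be set up, the example below contrasts the expected annual cost of running an asset to failure with the annualized cost of planned replacement. All figures are invented for illustration, not METSCO's methodology.

```python
# A sketch of monetized risk: expected annual cost of leaving an asset in
# service versus the annualized cost of replacing it now. Figures invented.
def run_to_failure_cost(p_fail, outage_cost, emergency_replacement):
    """Expected annual cost if the asset is run to failure."""
    return p_fail * (outage_cost + emergency_replacement)

def annualized_replacement(capex, life_years=40):
    """Planned replacement cost spread over the new asset's service life."""
    return capex / life_years

risk = run_to_failure_cost(p_fail=0.08, outage_cost=150_000,
                           emergency_replacement=90_000)
planned = annualized_replacement(capex=120_000)
print(f"run to failure: ${risk:,.0f}/yr vs replace now: ${planned:,.0f}/yr")
# Here the monetized risk ($19,200/yr) exceeds the annualized replacement
# cost ($3,000/yr), so immediate replacement is the economic choice.
```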

It sounds like you can classify utility companies based on the data they have and how they use it to make decisions?

We have a maturity model that helps us determine where a customer is so we can determine how best to help them.

Run to failure - Assets are only replaced when they fail. From the perspective of financial costs, this alternative looks good, but if you look at customer value, the cost is very high. There are many unplanned outages of relatively long duration.

Age-based - Assets are replaced based on the age of each asset. Investment requirements can be three times higher than run to failure, but the customer benefits from improved service. There are far fewer unplanned outages.

Condition-based - With this approach you only replace assets that are in poor condition. Investment requirements are less than age-based, but still quite high compared to run to failure. Customer value is similar to age-based.

Risk-based - For example, you may decide to run 70% of assets to failure in those areas where there is low risk, but for the 30% of assets that are high risk, you try to avoid any failures. This requires focussed investments. Investment requirements are significantly lower than condition-based, but still higher than run to failure. With this alternative, customer value is optimal within the budgeted investment level.

Where are most of your customers in the maturity model when you first start working with them?

When we first begin to work with them, most are somewhere between age-based and condition-based. Many are making age-based decisions, but are trying to move to condition-based asset management for many asset classes. Surprisingly there are some large companies that still follow a run to failure approach. We have customers that have gotten to the highest level, risk-based asset management, and we have helped them get there.

What size of utility finds it easiest to do this?

In our experience, mid-sized utilities, with half a million to a million meters, find this transition easiest. They are more flexible. For large utilities the transition is challenging because they have to change what they have been doing for the past ten years, mostly age- and condition-based asset management, and they have implemented large, expensive systems to help them do that.

How do location and GIS fit into this?

Location and GIS are fundamental in several areas of asset management. First of all, geospatial technology provides a communication tool. Most critically, geospatial visualization is used to justify investments: it makes it possible to show stakeholders high-risk areas, recommended investments, and where to invest.

Secondly, it helps in planning for the future. Visualizing changing demographics helps identify which parts of a city require investment. For example, if in the next 10 years most of the work is going to be done in a particular rapidly growing area of the city, it can be planned for, for example by creating a yard in the area.

We also use GIS for core planning. When engineers are designing the electric power network for a new subdivision, they require very detailed information about existing electric power assets in and near the area. This information is collected from many different sources and visualized on a map. The engineers can look at the risk profile of every existing asset on the map and start planning which assets need to be replaced immediately, which in five or ten years, and where they need to install new equipment.

Another critical area for GIS is connectivity. Connectivity is about which customers are connected to which service point, which service points are downstream from each transformer, and which transformers are on each feeder. Connectivity information is essential because it enables utilities to know who is affected by a planned or unplanned outage. It also provides a link between customers and infrastructure for financial planning purposes. Many utilities don't have reliable connectivity information, or they think they do but really don't. GIS is the primary tool that we use to establish or correct connectivity.
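Conceptually, connectivity is a directed graph from feeders down through transformers and service points to customers. The sketch below shows one hypothetical way to trace who is affected when an element fails; the network data and the naming convention are invented for illustration.

```python
from collections import deque

# Hypothetical downstream connectivity: feeder -> transformers ->
# service points -> customers.
network = {
    "feeder_12": ["xfmr_A", "xfmr_B"],
    "xfmr_A": ["sp_1", "sp_2"],
    "sp_1": ["customer_1001"],
    "sp_2": ["customer_1002", "customer_1003"],
}

def affected_customers(failed_element: str) -> list[str]:
    """Walk downstream from a failed element and collect the customers."""
    affected, queue = [], deque([failed_element])
    while queue:
        node = queue.popleft()
        children = network.get(node, [])
        if not children and node.startswith("customer"):
            affected.append(node)
        queue.extend(children)
    return affected

print(affected_customers("xfmr_A"))
# ['customer_1001', 'customer_1002', 'customer_1003']
```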

Another critical area that is growing in importance for GIS is spatial analytics. GIS can help estimate the condition of assets and identify high-risk assets. For a simple example, in a tabular ERP system all poles are the same, but in the GIS we can see that some poles are corner poles, which are higher risk. As another example, if an asset is near a highway that is salted in winter, this can affect its condition. Here we are getting into predictive analytics, where we can make estimates based on proximity to things that shorten the expected lifetime of assets.
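A minimal sketch of this kind of proximity-based risk scoring, assuming a plain haversine distance and an invented list of sample points along salted roads; the 50 m threshold and 1.5x multiplier are likewise assumptions, not values from the interview.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# Invented sample points along roads that are salted in winter.
salted_road_points = [(36.170, -115.140), (36.172, -115.138)]

def salt_exposure_factor(lat, lon, threshold_m=50.0):
    """Raise an asset's risk score if it sits close to a salted road."""
    nearest = min(haversine_m(lat, lon, p[0], p[1]) for p in salted_road_points)
    return 1.5 if nearest < threshold_m else 1.0   # assumed risk multiplier

print(salt_exposure_factor(36.1701, -115.1401))  # near a salted road -> 1.5
```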

What's your perspective on the future for utility asset management?

Many companies are beginning to see the advantage of the mapping environment for asset management. In the past, asset management was pretty much done with Excel or similar table-based tools. What we are seeing now is that decisions are increasingly being made on maps, and increasingly on interactive maps. There are several drivers for this.

The first driver is simplification. Asset management is simpler when you put everything on a map. And for the new generation of engineers, maps facilitate knowledge transfer - it's easier for them to design, manage and maintain the grid with assets visualized on a map.

The other driver is grid complexity. With the smart grid and distributed energy generation, the grid is getting much more complicated. There is a lot more data that needs to be analyzed. There are also increasingly complex reporting requirements from the regulator. There should be one environment that can handle all of this - data management, analytics, and regulatory reporting. Maps help alleviate complexity and achieve simplification.

Do you foresee that in the future utility applications such as outage management, customer information systems, crew management and dispatch systems, and others will be best of breed or an integrated solution from a single vendor?

The future as I see it is all about map-based decisions - where are the outages, where are our marketing opportunities, where are assets likely to fail, where do our employees live, where are the customers that have lost power, where should we position crews in preparation for a storm? To enable this, I believe the future for utility IT is a map-based platform that applications like asset management, design, outage management, and disaster management from different vendors can plug into to share information in a shared geospatial environment.
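To illustrate the idea of such a shared geospatial platform, here is a minimal, hypothetical sketch of applications publishing and querying features through a common map layer. GeoPlatform and its methods are invented for illustration and are not any vendor's actual API.

```python
from collections import defaultdict

class GeoPlatform:
    """Hypothetical shared map environment that applications plug into."""

    def __init__(self):
        self._layers = defaultdict(list)  # layer name -> list of features

    def publish(self, layer: str, feature: dict) -> None:
        """One application (e.g. outage management) publishes a feature."""
        self._layers[layer].append(feature)

    def query(self, layer: str, bbox: tuple) -> list:
        """Another application (e.g. crew dispatch) reads shared features."""
        xmin, ymin, xmax, ymax = bbox
        return [f for f in self._layers[layer]
                if xmin <= f["x"] <= xmax and ymin <= f["y"] <= ymax]

platform = GeoPlatform()
platform.publish("outages", {"x": -115.14, "y": 36.17, "cause": "storm"})
print(platform.query("outages", (-116.0, 36.0, -115.0, 37.0)))
```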

June 17, 2016

The U.S. Energy Information Administration (EIA) has issued a pre-release version of the Annual Energy Outlook 2016 (AEO2016). The pre-release includes the Reference case and the No Clean Power Plan case. The Reference case includes existing legislation and regulation, including the Clean Power Plan (CPP). The Clean Power Plan has been mandated, but its implementation is on hold while it is challenged in court. The No Clean Power Plan case assumes that the Clean Power Plan is not implemented.

The AEO2016 projects that CO2 emissions from electric power generation, which have declined over the past decade, will continue to decline. In the Reference case, CO2 emissions in the power sector are projected to drop to 35% below 2005 levels in 2030 as a result of the implementation of the CPP. Increased generation from renewables and the replacement of coal-fired by gas-fired generation are the primary reasons for lower emissions. There are a number of drivers including continuing low natural gas prices, state-level renewable portfolio standards, federal tax credits for renewables, and tighter environmental regulations. If the CPP is not implemented, the EIA projects that emissions will rise slightly but remain 19% below 2005 levels through 2040.

The most important trend that the EIA sees is the decoupling of emissions and economic growth. The EIA foresees that economic growth and electricity demand remain linked, but the linkage is shifting toward lower energy intensity (electricity consumed per dollar of GDP). The drivers for this include population growth, saturation of electricity-using appliances such as refrigerators, higher efficiency of newer appliances, and a shift in the economy toward less energy-intensive industries. Efficiency standards for lighting and other appliances also help to reduce the growth in electricity demand.
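As a back-of-the-envelope illustration of decoupling (the growth rates below are assumed for illustration, not EIA figures), energy intensity can fall even while demand grows:

```python
# Assumed, illustrative annual growth rates (not EIA projections).
gdp_growth = 0.022      # GDP grows 2.2% per year
demand_growth = 0.008   # electricity demand grows 0.8% per year

# Energy intensity = electricity consumed per dollar of GDP.
intensity_change = (1 + demand_growth) / (1 + gdp_growth) - 1
print(f"Energy intensity changes {intensity_change:.1%} per year")  # about -1.4%
```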

It is very interesting that the EIA projects substantial growth in the renewable share of electricity generation with or without the CPP: renewable generation is projected to more than double in either case. Low-cost solar PV installations, tax credit extensions and state renewable mandates are the important drivers.

The CPP is projected to be a major driver for natural gas generation. The natural gas share is projected to grow by 44% by 2040 with the CPP, compared to 32% in the No CPP case.

Coal generation is projected to decline by 32% from 2015 to 2040 with the CPP. Without the CPP, coal generation is projected to remain flat as fewer plants are retired and the remaining units are assumed to operate at higher levels. Overall, the coal share of total generation is projected to decline, since no significant new capacity is expected to be added.

Nuclear generation is projected to remain flat throughout the projection, with planned new nuclear units offset by retirements of existing capacity. The EIA projects that no new, unplanned nuclear plants will be constructed even with the CPP in place. According to the projection, the nuclear share of total generation declines from its 2015 level with or without the CPP.

May 30, 2016

MGM Resorts is planning to leave the NV Energy grid in October even if it has to pay $86.9 million in exit fees to do so. Wynn is also planning to leave the NV Energy grid late this year, but is challenging the legality of the calculation of exit fees. The casinos plan to leave NV Energy under a law passed in 2001 that allows corporate customers to leave the power grid. MGM's plan to exit still requires Public Utility Commission (PUC) approval. Another casino, Las Vegas Sands Corp., has dropped plans to leave the grid, presumably deterred by the large exit fee.

Reportedly, a number of U.S. states have legislation allowing large corporate customers to leave the grid. In about 18 of these, such as Texas and Georgia, there are no exit fees. In others, such as Nevada, there are exit fees, though the legality of large exit fees is being challenged.

NV Energy has a green tariff program, called Green Energy Choice in northern Nevada, that offers customers the choice of using 100 percent or 50 percent renewable energy, but they have to pay an additional amount on their monthly bill for using renewables. MGM Resorts, however, has signed 20-year power purchase agreements (PPAs) with independent power suppliers who can supply renewable power much more cheaply than NV Energy can, so much more cheaply that MGM is willing to pay over $80 million to leave the grid.