USGIF GotGeoint Blog

USGIF promotes geospatial intelligence tradecraft and a stronger community of interest between government, industry, academia, professional organizations and individuals focused on the development and application of geospatial intelligence to address national security objectives.


April 22, 2016

The U.S. Department of Energy has announced $4 million for projects to launch the Orange Button℠ initiative. To understand the financial risk of solar energy project development, the solar energy community relies on fragmented datasets released by state energy offices and a limited number of private organizations regarding project origination, grid integration, operations, and retirement. These datasets vary widely in format, quality, and content, which makes it difficult for prospective providers to accurately assess potential markets. The goal of the Orange Button project is to standardize this data, ensuring a more uniform and transparent marketplace.
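To see why fragmented schemas are costly, consider the kind of field-by-field translation each data consumer must do today; a shared standard removes this step. All sources, field names, and units below are invented for illustration, not Orange Button's actual schema:

```python
# Hypothetical sketch: mapping heterogeneous solar project records from
# different state datasets onto one common schema. Each source uses its
# own field names and units; the consumer must know every mapping.
FIELD_MAPS = {
    "state_a": {"proj_name": "name", "cap_kw": "capacity_kw"},
    "state_b": {"project": "name", "capacity_mw": "capacity_kw"},
}

def normalize(record, source):
    """Translate a source-specific record into the common schema."""
    mapping = FIELD_MAPS[source]
    out = {}
    for src_field, std_field in mapping.items():
        value = record[src_field]
        # Convert MW to kW where the source reports megawatts.
        if src_field.endswith("_mw"):
            value = value * 1000.0
        out[std_field] = value
    return out

a = normalize({"proj_name": "Desert One", "cap_kw": 500.0}, "state_a")
b = normalize({"project": "Coast Two", "capacity_mw": 1.2}, "state_b")
print(a["capacity_kw"], b["capacity_kw"])  # 500.0 1200.0
```

A data standard effectively fixes one `FIELD_MAPS` entry for everyone, so producers publish directly in the common schema.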

The DoE selected four organizations, the Smart Grid Interoperability Panel (SGIP), SunSpec Alliance, kWh Analytics, and the National Renewable Energy Laboratory (NREL), to lead the Orange Button initiative. The goal is to create a widely adoptable, unified data standard for the solar industry. SGIP is receiving $600,000 to help organize and monitor the Orange Button℠ initiative.

A robust data infrastructure for the solar industry is needed to enable rapid and seamless data exchanges between producers and consumers of solar data. It is estimated that a reduction in the cost of capital as a result of better access to solar data will reduce risk and could result in savings of nearly $9 billion over the next 10 years to solar users.

The Orange Button initiative is modeled on the very successful Green Button initiative which enabled consumers to share their electric power usage information with third parties. Orange Button, which was originally named Solar Bankability Data to Advance Transactions and Access, aims to reduce soft costs through standards for the collection, security, management, exchange, and monetization of solar datasets across the solar energy value chain.

April 07, 2016

Keep a close watch on the Hawaiian power utilities (HECO) because Hawaii has committed to 100% renewables by 2045 and 65% by 2030. That was the recommendation of Sharon Allan, CEO of the Smart Grid Interoperability Panel (SGIP), in response to a question asking what utilities should do to get ready for Grid 3.0.

Hawaii has seen an exponential rise in rooftop solar PV. Hawaii already has 487 MW of solar PV capacity, 90% of which is residential rooftop panels. As a result, since 2010 the utility has experienced a reduction in net load and revenue. The most dramatic drop in load occurs during daytime hours, as is readily apparent by comparing the utility's load on clear and cloudy days, which reflects the impact of residential solar PV.

Last week Hawaiian Electric Co. asked the state PUC to approve a $340 million Smart Grid Foundation Project. The companies are requesting permission from the PUC to install smart grid technology for more than 455,000 customers on O'ahu, Hawai'i Island and in Maui County. The application includes a wireless communication network, smart meters and automation technology. The expected benefits are improved outage detection and restoration, provision of electricity usage information to help customers manage their power usage, and automated services such as remote meter reads and connect/disconnect requests. Customers will also have access to an energy portal, accessible on mobile and other devices, to provide more control over their energy usage. Perhaps most importantly, this is a prerequisite to reaching Hawai'i's 100 percent renewable electricity goal. This will require a rate increase, estimated for residential customers at about 23 cents per month for 20 years. Reportedly, commercial customers will bear more of the cost.

HECO collects a large amount of data from diverse sources. It comprises internal operational data plus data from external sources. It includes weather forecasts, data from customer-sited PV, consumer/public resource data, phasor (PMU) data, renewables power quality, feeder data, irradiance meters, generator and substation power quality, SCADA, and vendor data such as forecasts and data from Independent Power Producers (IPPs). Much of this includes location. Geospatial technology is fundamental to integrating this data. Advanced spatial analytics will allow Hawaiian power utilities to extract actionable information in real time.

Fundamental to distributed energy management is another geospatial data source: historical, current, and forecast weather maps. Maps of irradiance and wind velocity are key to monitoring and predicting solar and wind generation in different parts of the islands. Temperature maps are also important because the efficiency of solar panels depends on temperature. The more accurate the weather forecasts are, the more predictable the generation. Knowing that it will be cloudy over the south of Oahu but sunny over the rest of the island, or cloudy over the whole island but with moderate to strong winds, allows operations to estimate how much backup generation is required and where.
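As a rough illustration of how such forecasts feed an operational estimate, the sketch below scales regional solar capacity by forecast cloud cover, adds forecast wind output, and reports the shortfall against demand. All numbers and the simple linear model are assumptions for illustration only:

```python
# Toy backup-generation estimate from a regional weather forecast.
def backup_needed(regions, demand_mw):
    """Sum expected renewable output per region; return the shortfall."""
    expected = 0.0
    for r in regions:
        # Cloud cover scales down solar output; wind output taken as given.
        expected += r["solar_capacity_mw"] * (1.0 - r["cloud_cover"])
        expected += r["wind_mw"]
    return max(0.0, demand_mw - expected)

# Cloudy over the south of Oahu, sunny over the rest (invented figures):
regions = [
    {"name": "south Oahu", "solar_capacity_mw": 200.0,
     "cloud_cover": 0.9, "wind_mw": 10.0},
    {"name": "rest of Oahu", "solar_capacity_mw": 250.0,
     "cloud_cover": 0.1, "wind_mw": 40.0},
]
print(backup_needed(regions, demand_mw=500.0))
```

A real dispatch model would use calibrated irradiance-to-output curves per site, but the structure (forecast in, shortfall out, per region) is the same.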

March 14, 2016

In a recent presentation announcing Tesla Energy Elon Musk estimated that the total surface area needed to generate enough power to get the U.S. completely off fossil fuel power generation is less than 1/4 of the Texas panhandle.

Solar PV has reached or is approaching parity with fossil fuel energy sources in much of the U.S. and distributed power generation, primarily rooftop solar PV, is disrupting the traditional electric utility business model. The fundamental problem is that as customers install solar PV on their rooftops or find alternative sources of power and even leave the grid, the utility's revenue, which is derived from selling electricity, declines. This in turn drives up rates for the utility's other customers which motivates them to also defect. Put another way the electric power market, which used to be based on a regulated monopoly provider, is giving way to a competitive market where the consumer has a choice. She can generate her own power or buy it from alternative sources such as Solar City. As battery costs drop, customers increasingly have the option of leaving the grid and participating in their own or a community microgrid.

I've blogged about several alternative solutions that utilities and regulators are proposing to meet the challenge of distributed renewable energy. The New York Public Utility Commission is moving the New York state utility industry toward a radical redefinition of the utility business model. In the future the New York state utility industry may consist of distributed system platform (DSP) providers, basically providing the grid but not directly selling energy. There will be an energy market with many energy providers, including bulk power generators and many distributed energy generators - you and me with rooftop solar panels. The Sacramento Municipal Utility District (SMUD) is moving from being a traditional centralized utility to a distributed utility providing localized grid services. In other words SMUD is getting out of the business of selling electricity and into the business of selling grid services. In Maryland the Energy Future Coalition (EFC) prepared recommendations for the Maryland smart grid, among which the key recommendation was a new utility business model that would decouple utility revenue from selling electric power. Recently it was predicted that by 2020, the largest energy company in the world (by market cap) will not own any network (grid) or generation assets. It will just manage information about energy sources and consumers, similar to Uber and Airbnb in their markets.

Some utilities like Hawaii's power utilities are embracing distributed renewable energy and adapting their business model. The Salt River Project has introduced a rate structure designed to charge solar PV customers for using the utility's grid infrastructure, but which incentivizes them through "time of generation" pricing to generate power at peak times when demand is high. Other utilities see rooftop solar PV as a threat to their revenue base and are penalizing customers who install rooftop solar PV, either by a fixed infrastructure charge or by reducing the rate the utility pays for customer-generated power.
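The Salt River Project's "time of generation" idea can be illustrated with a toy calculation: power exported during peak hours is credited at a higher rate. The rates and peak window below are invented for illustration, not SRP's actual tariff:

```python
# Invented illustrative rates and peak window.
PEAK_HOURS = range(16, 21)   # 4pm-9pm
PEAK_RATE = 0.20             # $/kWh credited at peak
OFF_PEAK_RATE = 0.08         # $/kWh credited off-peak

def generation_credit(hourly_kwh):
    """hourly_kwh: dict of hour-of-day -> kWh exported to the grid."""
    return sum(
        kwh * (PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE)
        for hour, kwh in hourly_kwh.items()
    )

# 5 kWh exported at noon, 3 kWh at 5pm:
print(round(generation_credit({12: 5.0, 17: 3.0}), 2))  # 1.0
```

The incentive is visible in the numbers: the 3 kWh exported at peak earns more than the 5 kWh exported at midday.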

I just came across another approach to address the problem of customers generating their own power leading to declining revenues for the utility. One utility's strategy is to not only lower the rates paid for customer-generated power but to make them (at least large customers) pay a fee (penalty) to leave the grid. There is an open question about whether this is even legal and it is being challenged in the courts.

NV Energy, which is owned by Berkshire Hathaway, is the power utility for Las Vegas. The casinos, to reduce their power costs, which are significant as you might expect, have started finding alternative sources of power. For example, the Mandalay Bay is installing the largest rooftop solar PV array in the U.S. on top of its convention center (a 6.4-MW dc system with 21,324 solar modules expected to provide 20% of the resort’s power demand). Others are looking to buy power on the open market. With the available alternatives the casinos and resorts are planning to leave the grid and NV Energy's monopoly. They want to do this because it saves them money.

The resorts and casinos consume about 7% of NV Energy's power. If they leave the grid, it will create a significant hole in NV Energy's revenue. NV Energy is an investor-owned utility and makes a profit which returns value to its investors. NV Energy's solution (with the acquiescence of the state power regulators): charge customers a fee for leaving the grid. According to the source, it will cost the casinos and resorts a fee (penalty) of $126.5 million to leave the grid. The penalty is to compensate NV Energy for investments it made to generate power for the casinos and to prevent raising rates for NV Energy's remaining customers.

The casinos and resorts are not the only customers of NV Energy to face a penalty for leaving the grid. A data storage company called Switch had the same problem last year when it decided to use renewable power to power its data center. It uses 3% of NV Energy's power and the company was told it had to pay $27 million to NV Energy to leave the monopoly's grid. Reportedly Switch and NV Energy came to an agreement where Switch is paying NV Energy to build a new solar farm in North Las Vegas. This looks like a policy whereby NV Energy pressures its big customers to remain with the monopoly and pay for its sustainability initiatives through the threat of a substantial penalty for leaving the grid.

March 08, 2016

At this year's RICS BIM conference the elephant in the room was April 4th 2016, the deadline for all public construction projects to be BIM Level 2 compliant. This is the culmination of a five year process starting in 2011 focussed on developing a set of documents defining BIM Level 2 and providing best practices for achieving compliance.

This year's conference suggests that many in the industry are now wrestling with implementation issues and are reporting a shortage of BIM skills. They are finding that BIM is really about data and are realizing that data management and analytics are essential to handling and interpreting the huge volume of data being generated. A shortage of data scientists to help them do this has become an acute problem.

Transforming the UK Construction Industry in Five Years

The government's focus from the beginning has been on full life cycle BIM. Back in 2010/2011, the Government's BIM roadmap was perceived to be very aggressive, mandating BIM Level 2 compliance for all national government projects after just five years. In the UK public projects comprise about 50% of the construction industry so this amounted to radically transforming the UK construction industry in half a decade. Put another way the construction industry had to move from thinking analog to thinking digital in just five years.

The utility industry has been going through a similar transition in moving to what is called a smart grid and distributed generation. Some think that this process will take 20 years. But new technology and a younger generation steeped in digital is stepping in to make this happen much sooner than that. Energy is going digital more rapidly than expected. I think it likely that something similar will happen in the construction industry because digital enables all sorts of efficiencies that we often cannot even imagine. Uber and Airbnb are classic examples. Both are the largest or second largest companies in their respective industries. Gartner has forecasted a similar type of company will emerge for managing energy flows in the electric power industry by 2020.

For 2012 the Government's strategy focussed on discovery through pilot projects, notably the Ministry of Justice Cookham Wood project, a £20 million project where it is estimated that BIM saved hundreds of thousands of pounds in CAPEX costs.

At the 2013 conference David Philp, Head of BIM Implementation, Cabinet Office, UK Government, announced PAS 1192-2, a key document supporting the Construction BIM Strategy to achieve Level 2 compliance. It specified requirements for achieving BIM Level 2, set out the framework for collaborative working on BIM-enabled projects, and provided specific guidance on the information management requirements associated with projects delivered using BIM. David Philp made it clear that although the initial focus is on the design/build part of the lifecycle, with the goal of saving 20% of CAPEX and reducing the 30% waste in most construction projects, the "largest prize for BIM lies in the operational stages of the project life-cycle".

At the 2014 conference Peter Hansford, Government Chief Construction Advisor, made a link between the overwhelming interest in BIM, as represented by the sold-out RICS BIM conference, and the industry’s enhanced appetite for BIM. He saw a shift in emphasis from asking ‘’should we do BIM?’’ to ‘’how do we do BIM?’’ This was happening in spite of the fact that, as Rich Saxon, the UK Government's BIM Ambassador for Growth, pointed out in a panel discussion on the future of BIM, some aspects of BIM Level 2 were still incomplete.

Deborah Rowland, Head of Facilities Management, Government Property Unit, Cabinet Office in the UK Government, outlined the Government's motivation and plans for a program to improve the post-construction handover and operation of newly constructed buildings. She described the Government Soft Landings (GSL) program, which was designed to put in place a legal, contractual, and technical framework (based on the BSRIA Soft Landings Framework) incorporating building information modeling (BIM) to fix the problem by ensuring continuity throughout the building lifecycle from inception, through design, construction, commissioning, training and handover, through to operations and maintenance for 1-3 years after handover.

At the 2015 conference seven key BIM Level 2 documents (PAS 1192:2, PAS 1192:3, BS 1192-4, CIC BIM Protocol, Classification (Uniclass), Digital Plan of Works / Levels of Detail (LoD), Government Soft Landings) which collectively represented the current repository of BIM best practices were announced. Every speaker at the conference referenced these documents.

Seven major government projects using BIM, worth about £10 billion, have maintained records of the savings they have seen from using BIM. The measured benefits of BIM and factors associated with BIM range between 15 and 20%. By 2014 the government estimated that it had achieved savings of £1.4 billion through BIM and related improvements in the 2013/2014 fiscal year. But with full lifecycle BIM, the government saw an upside potential of 40%+, with the biggest benefits being realized in the operations phase of the building lifecycle.

This year it was announced that Government statistics show that in 2015 Her Majesty's Government saved £3 billion on these demonstration projects as a result of gains in construction productivity related to BIM.

Full Life Cycle BIM

But BIM Level 3 is where the gold is. Level 3 will enable the interconnected digital design of different elements in a built environment and will extend BIM into the operation of assets over their lifetime. It will support the accelerated delivery of smart cities, services and grids. As the AGI has pointed out in a recent Foresight Report integrated BIM and geospatial technology will be fundamental for BIM Level 3.

Is the UK construction industry ready for April 4, 2016?

An industry survey in the UK in 2013 reported that 54% of respondents said that they were using BIM on projects. Most of those not yet using BIM said that they would be within one to two years. 70% of the respondents using BIM said that it has given them a competitive advantage. But this year's conference suggests that most in the industry are now doing BIM, although for some it is still not clear what BIM Level 2 is and how one demonstrates compliance.

Data and People

This year's RICS BIM conference was quite different from previous conferences. The focus was heavily on implementation, not on progress in completing specifications or developing and working with 3D models as in previous RICS BIM conferences.

The event was led off by a panel discussion that set the tone for the rest of the conference - the focus was on data and people rather than 3D BIM models. The industry people on the panel were patently up to their elbows in real world BIM projects including data, the analytics to make sense of it, and the shortage of the required skills (data scientists and people with BIM skills).

Alex Jones, Interserve, said that within the construction industry there is now a focus on data and how data is used downstream. He said that clients (owners) are now aware of BIM and that both private and public procurements are asking for BIM. But he qualified this - they are not so much asking for BIM (as in 3D models) as asking for data that can be used downstream. David Throssel, Skanska, added that deliverables now means data.

In the context of data, Simon Rawlinson, Arcadis UK, gave an example of utilities where smart devices and automated capture has resulted in a database of asset conditions, and which also includes geospatial data for geolocating those assets. This information allows utilities to move forward to condition-based maintenance where prioritizing maintenance and replacement depends on asset condition not the calendar. Real-time monitoring is being adopted by virtually all industries. For example, devices are being implanted in structures such as bridges to monitor distortions. He added that the leadership has not yet recognized that sensors and analytics enable a major change that will revolutionize asset maintenance.
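Condition-based prioritization of the kind Simon described can be sketched simply: rank assets by condition score rather than by service calendar. The asset IDs, scores, and threshold below are invented for illustration:

```python
# Toy condition-based maintenance queue: flag assets below a condition
# threshold and rank them worst-first, ignoring service dates entirely.
def maintenance_queue(assets, threshold=0.5):
    """Return assets whose condition falls below the threshold, worst first."""
    flagged = [a for a in assets if a["condition"] < threshold]
    return sorted(flagged, key=lambda a: a["condition"])

assets = [
    {"id": "XFMR-12", "condition": 0.35, "years_since_service": 1},
    {"id": "XFMR-07", "condition": 0.80, "years_since_service": 9},
    {"id": "XFMR-03", "condition": 0.20, "years_since_service": 4},
]
print([a["id"] for a in maintenance_queue(assets)])  # ['XFMR-03', 'XFMR-12']
```

Note that XFMR-07, nine years since its last service, is not flagged at all: under calendar-based maintenance it would be first in the queue, which is exactly the change in thinking sensors enable.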

Also the new digital economy is disintermediating certain skills. For example, digitalized construction requires fewer layers. Simon cited a KPMG study of disintermediation by technology. As an example, he mentioned data such as that collected by utilities from smart devices that allows owners/clients/utilities to do condition-based maintenance, which can lower field staff requirements significantly.

In the construction industry as a whole, one of the major issues that emerged is a shortage of BIM-related skills. Simon emphasized that now that we have lots of data, what we need urgently are data scientists to help make sense of it. He identified this as a major problem because in his experience data scientists are as rare as “hens' teeth”.

David Hancock, UK Government Construction, said that the construction industry still thinks analog. Nowadays we can all produce data, but analyzing it is the challenge. We need people who can analyze data to produce actionable insights. He also identified the BIM skills shortage as a major problem, and repeated that data scientists are “as rare as hens' teeth”. He suggested that one solution is upskilling – training existing construction, building maintenance, and utility staff in BIM and data science. Another may be raiding the U.S. baseball industry, where teams such as the Oakland Athletics started using analytics several years ago. "Moneyball" analytics has been recommended as a strategy for the utility sector in the U.S.

In the question period a speaker argued that the 3D BIM model has been "massively overplayed". BIM is really about data and collaboration, not about 3D. To maximize the benefits of BIM clients (owners) have to be clear about what the model, and much more importantly the data associated with the model, are going to be used for. For example, if for facilities management (FM) then clients (owners) have to be specific about what data is required for maintaining and operating the building as part of defining deliverables.

BIM Level 2 for BIM Novices

What if you are a small company with no more experience with BIM than discovering what the acronym stands for? At RICS BIM 2016 there was a sequence of presentations by a group of mostly quantity surveyors (people who do quantity takeoff), all BIM novices, who came together to try to understand and share openly what BIM Level 2 means practically. (As one speaker pointed out, there is no legal definition of BIM Level 2 compliance.) They called their project KT4BIM, which you can find on LinkedIn. The objective is to answer the questions:

how do we do BIM Level 2?

how do we demonstrate compliance?

The target audience is BIM novices in both the residential and commercial sectors. The objective is to share all the material used and developed as part of this project with the industry. More information about this very important project will be provided in my next blog post.

BIM Level 2 Documents

I've included this list just to show how comprehensive this set of BIM and BIM-related documents is.

CIC/INF MAN/S first edition 2013 Outline Scope of Services for the Role of Information Management

Government Soft Landings "Design for operation" to encourage better outcomes using BIM for built assets during the design and construction stages to ensure value is achieved in the operational lifecycle of an asset.

February 26, 2016

According to GTM Research and the Solar Energy Industries Association, new solar PV capacity in the United States set a record in 2015. 7.3 gigawatts (GW) of solar photovoltaics (PV) were installed, exceeding natural-gas capacity additions for the first time. In 2015 solar was up 17% over 2014 and represented almost a third of new electric generating capacity additions in the U.S.

More than half of the new solar PV additions were utility-scale solar farms which was also a new record. Over 4 GW of utility-scale solar PV were installed which represents 6 percent year-over-year growth in this segment. The residential solar market grew 66 percent year-over-year. The non-residential market installed more than one gigawatt of new capacity in 2015, about the same as what was installed in this segment in 2014. The increasing rate of installations has been driven partially by dropping prices.

87% of U.S. solar PV capacity is concentrated in 10 states.

Total U.S. solar PV capacity now exceeds 25 GW, which is remarkably rapid growth when compared to 2010 when only 2 GW had been installed.

For comparison, according to the American Wind Energy Association (AWEA), in 2014 about 2,500 turbines were installed, representing nearly 4.9 GW of new wind capacity. Wind energy represented 28% of new generating capacity installed in 2010-2014. U.S. total wind generating capacity at the end of 2015 was 65.9 GW.

Capacity is one thing and actual generation another. According to the U.S. Energy Information Administration (EIA) in 2014 actual generation statistics show that coal, natural gas, and nuclear led by a wide margin followed by conventional hydroelectric, wind and solar.

February 25, 2016

Keep a close watch on the Hawaiian power utilities (HECO) because Hawaii has committed to 100% renewables by 2045 and 65% by 2030. That was the recommendation of Sharon Allan, CEO of the Smart Grid Interoperability Panel (SGIP), at a DistribuTECH2016 MegaSession on distributed energy resources (DER) in response to a question from the audience asking what utilities should do to get ready for Grid 3.0.

Hawaii already has 487 MW of solar PV capacity, 90% of which is residential rooftop panels. At DistribuTECH2016, Talin Sakugawa from HECO and Matthew Shawver from in2lytics gave an insightful presentation on distributed energy generation (DER), big data and analytics at the Hawaiian power utility. Hawaii has seen an exponential rise in rooftop solar PV. As a result since 2010 the utility has been experiencing a reduction in net load and revenue. The most dramatic drop in load is during daytime hours as is readily apparent by comparing the utility's load on clear and cloudy days which reflects the impact of residential solar PV.

Challenges facing utilities

The Rocky Mountain Institute (RMI) has published an analysis of the economics of leaving the grid. The central thesis is that solar PV and electricity storage enables consumers to leave the grid completely. Solar PV has already reached grid parity in about 10% of the U.S. and declining PV prices suggest that this trend will continue. However, without the ability to store electric power, consumers with rooftop PV still require the grid for nights and cloudy days. The development of combined solar PV and batteries from Tesla and others promises to make solar + storage accessible for increasing numbers of consumers. These consumers could form their own microgrid, either by themselves or with their neighbours and disconnect from the grid. Alternatively they could become a source of dispatchable power and become an energy provider. Increasingly they could find that either of these options is economically advantageous. This has serious implications for local utilities because if this trend develops it will seriously erode the traditional utility revenue base. It could also lead to a completely decentralized grid comprised of many microgrids. From an operations perspective it increases the complexity of managing the grid compared to the centralized model in use today.

The other challenge for a utility with a high penetration of intermittent, distributed generation (DER) is load balancing, ensuring that generation meets demand. The German power industry, the U.S. Department of Energy, Hawaiian power utilities, California power utilities, ERCOT, and many others are working to address this challenge. HECO's presentation at DistribuTECH gave an insightful overview of what they are doing to meet this technical challenge.

Data sources

HECO collects a large amount of data from diverse sources. It comprises internal operational data plus data from external sources. It includes weather forecasts, data from customer-sited PV, consumer/public resource data, phasor (PMU) data, renewables power quality, feeder data, irradiance meters, generator and substation power quality, SCADA, and vendor data such as forecasts and data from Independent Power Producers (IPPs). The data has widely different time resolutions, ranging from real time, such as PMUs sampling 30 times per second, SCADA reporting every 2 seconds, and calculated gross load every 2 seconds, to PV inverter data reporting every 5 minutes and aggregated monthly, weather forecasts reporting every 15 minutes and aggregated daily, transformers collecting data every 15 minutes and aggregated monthly, and smart meters reporting every 15 minutes and aggregated quarterly. Much of the data includes location. The data comes in different formats and has different levels of quality.
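Aligning sources with such different time resolutions typically means down-sampling the fast ones into the bins used by the slow ones. A minimal stdlib Python sketch (with invented readings) that aggregates frequent 2-second-style samples into the 15-minute bins used by slower sources such as smart meters:

```python
from collections import defaultdict

BIN_SECONDS = 15 * 60  # 15-minute bins, matching the slowest sources

def downsample(samples):
    """samples: list of (unix_seconds, value). Returns {bin_start: mean}."""
    bins = defaultdict(list)
    for t, v in samples:
        # Truncate each timestamp to the start of its 15-minute bin.
        bins[t - t % BIN_SECONDS].append(v)
    return {start: sum(vs) / len(vs) for start, vs in bins.items()}

# Three readings inside one 15-minute window, one in the next:
samples = [(0, 10.0), (2, 12.0), (4, 14.0), (900, 20.0)]
print(downsample(samples))  # {0: 12.0, 900: 20.0}
```

In practice each source would also carry its location and quality flags, and aggregation (mean, max, last value) would be chosen per measurement type.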

Fundamental to distributed energy are weather maps. Maps of irradiance and wind velocity can be directly used to estimate or predict solar and wind generation in different parts of the islands. Temperature maps are also important because the efficiency of solar panels depends on temperature. The more accurate the weather forecasts are, the more predictable the generation. Knowing that it will be cloudy over the south of Oahu but sunny over the rest of the island, or cloudy over the whole island but with moderate to strong winds, allows operations to estimate how much backup generation is required and where.

EMS Integration

Data from new sources relating to distributed renewable generation are integrated with traditional sources in a distributed energy management system, which gives operations and planning visibility into how distributed generation is impacting the grid. It not only provides a view into the status of the grid (situational awareness) but also allows simulations to assess the impact of different future renewable generation scenarios on the grid. This helps to determine where backup generation may be required because of high demand and low renewable generation, or curtailment when there is excess generation and insufficient load. Location is fundamental to the analytics because wind and irradiance vary widely over the islands.

Traditional input sources include SCADA and data from transmission, generators, protection, and so on. The new input data sources include GIS-based infrastructure models and data from the distributed generation sources. Also new is weather forecasting which enables predicting generation from renewable sources. It also includes historical and actual weather reports, and satellite and other weather data.

HECO uses an IT platform from In2lytics (a spinoff from Referentia) which integrates all of these time series data with EMS, SCADA, CIS, and GIS and makes it available to operational users, planners, modelers, and the public. in2lytics is a high performance time series database that is specially designed to provide instant data accessibility for planning and operational decision making. in2lytics enables loading, querying, analysis using MATLAB, and sharing of the "big data" required to monitor and manage today’s complex electric grid. It has native high performance interfaces for MATLAB in addition to programming languages. It translates data from different sources into its internal time series database. All data is archived in a spatially enabled time series database for future analysis. Matthew said that in2lytics is able to analyze large volumes of historical time series data very rapidly. For example, a longitudinal analysis on two years' worth of archived data can be run in an hour or two.

Applications

Accounting for renewables requires a lot of data. Some areas that use this data include generation planning, load forecasting, distribution planning, and contract administration. One of the first applications is a customer facing web site called Renewable Watch - Oahu that reports the total renewables generation, solar and wind, on Oahu in real-time.

An example of an application that will be used for contract administration beginning next month estimates how much power is lost from Independent Power Producers (IPPs) in the case of curtailment. It uses weather information to estimate wind speed and irradiance meters to estimate wind and solar generation potential.
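The kind of estimate this application makes can be sketched with toy models: a crude wind power curve and a linear PV irradiance model. The formulas and numbers below are illustrative assumptions, not HECO's actual methodology:

```python
# Toy models for generation lost to curtailment, driven by weather data.
def lost_wind_mwh(wind_speed_ms, rated_mw, hours):
    """Crude power curve: zero below cut-in, rated above rated speed,
    cubic ramp in between (power scales with wind speed cubed)."""
    cut_in, rated_speed = 3.0, 12.0
    if wind_speed_ms < cut_in:
        frac = 0.0
    elif wind_speed_ms >= rated_speed:
        frac = 1.0
    else:
        frac = (wind_speed_ms / rated_speed) ** 3
    return rated_mw * frac * hours

def lost_solar_mwh(irradiance_w_m2, rated_mw, hours):
    """Scale rated output by irradiance relative to 1000 W/m^2."""
    return rated_mw * min(1.0, irradiance_w_m2 / 1000.0) * hours

# A 30 MW wind farm curtailed for 2 hours in 12 m/s winds:
print(lost_wind_mwh(12.0, 30.0, 2.0))    # 60.0
# A 10 MW solar farm curtailed for 1 hour at 800 W/m^2:
print(lost_solar_mwh(800.0, 10.0, 1.0))  # 8.0
```

For contract administration, the lost MWh would then be priced at the IPP's contracted rate to compute compensation.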

Another analytical application was used to study how solar PV variability affects transformer tap changes. This application allows HECO to relate frequency of tap changes to solar PV variability and to determine which transformers are most affected by solar variability.
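One simple way to relate tap-change frequency to solar PV variability is to correlate a daily variability index with daily tap-change counts per transformer, then rank transformers by how closely they track variability. A hypothetical Python sketch with invented data and a plain Pearson correlation:

```python
# Rank transformers by correlation of tap-change counts with PV variability.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Daily PV variability index and tap-change counts for two transformers:
pv_variability = [0.1, 0.5, 0.9, 0.3]
tap_counts = {
    "XFMR-A": [2, 10, 18, 6],   # tracks variability closely
    "XFMR-B": [5, 4, 6, 5],     # largely insensitive
}
ranked = sorted(tap_counts,
                key=lambda t: pearson(pv_variability, tap_counts[t]),
                reverse=True)
print(ranked[0])  # XFMR-A is the most affected transformer
```

Transformers at the top of the ranking are the candidates for mitigation, for example smart inverter controls or voltage regulation upgrades on those feeders.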

HECO has a number of projects with the Department of Energy and other partners.

Department of Energy's Integrating System to Edge-of-Network Architecture and Management (SEAMS): a federally funded research initiative on high-penetration grids that aims to streamline grid planning and operations for utilities in regions with high concentrations of distributed generation (DG) resources.

Department of Energy's Distributed Resource Energy Analysis and Management System (DREAMS): a system to provide grid operators with useful information about the distributed grid.

Making sense of synchrophasor data for utilities: the Synchrophasor Visual Integration and Event Evaluation for Utilities (SynchroVIEEU) with High Penetrations of Renewables project. Its goals are to accelerate the integration of synchrophasor information into production-grade data visualization and analysis platforms and models; to leverage PMU capability at many substations, exploring ways to tap those resources and provide real-time visibility and real-time data; and to make synchrophasor data accessible for efficient and reliable operation of a modern grid with high penetrations of renewable resources.

Partners: SEL, DNV GL, Referentia

Monitoring, planning, and modeling a high-PV-penetration microgrid: decision support for microgrid design and operation at Marine Corps Base Hawaii. The Office of Naval Research (ONR) has launched an ambitious program, known as the Energy Systems Technology and Evaluation Program (ESTEP), to demonstrate and evaluate energy technologies using Navy and Marine Corps facilities as test beds. Program management is handled by the Space and Naval Warfare Systems Command (SPAWAR) Systems Center Pacific (SSC Pacific). ESTEP was established in 2013 and brings together the Department of the Navy, academia, and private industry to investigate and test emerging energy technologies at Navy and Marine Corps installations. At present, ESTEP conducts over 20 in-house government energy projects, ranging from energy management to alternative energy and storage technologies.

Other projects include

Remote light sensor technology at the Kahuku wind farm on Oahu to measure and forecast wind speed and direction to support reliability.

HECO's ultimate objective is to become a company offering diversified services that provide value to engaged customers, with satisfied regulators and sustainable costs and margins.

Big data and spatial-temporal analytics

It was clear from this presentation that for HECO, or any utility with a high penetration of solar or wind, weather maps (forecast, actual, and historical) are essential because they enable the utility to project generation. They also allow utilities to analyze the historical record in conjunction with other data to discover trends that may help improve forecasting. Collecting and analyzing temporal geospatial data is fundamental for distributed energy management, in addition to the expanded application of geospatial technology for utility operations and planning in the smart grid era.

February 11, 2016

Today at DistribuTECH 2016, three very experienced people in the technology business, Zarko Sumic, Distinguished Analyst at Gartner; Chandu Visweswariah, IBM Fellow; and Sharon Allan, CEO of the Smart Grid Interoperability Panel (SGIP), each presented their perspective on the utility sector of the future. What emerged appeared to be the same remarkable animal seen from three different perspectives.

Zarko started by pointing out the extraordinary coincidence that Facebook (market cap $230B), Uber (estimated value $50B), and Airbnb (estimated value $24B) are the largest (or, in the case of Airbnb, the second largest) companies in their lines of business, yet none of them owns content, cars, or rooms. Zarko and Gartner call this phenomenon the sharing economy. A sharing economy uses IT to distribute, share, and reuse excess capacity in goods and services. Information is the fuel and the digital platform is the engine. The sharing economy operates in a market with customers and sellers, and the value of the goods and services shared is determined by the network effect (Metcalfe's law): it increases as the square of the number of nodes (customers and sellers).
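As a trivial illustration of the network effect Zarko invoked:

```python
def metcalfe_value(nodes, k=1.0):
    """Metcalfe's law: network value grows as the square of the number of nodes."""
    return k * nodes ** 2

# Doubling the number of participants quadruples the value of the platform.
ratio = metcalfe_value(200) / metcalfe_value(100)
```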

Zarko sees an analogy with the electric power industry today, as we move toward a world where energy flows are increasingly determined by market forces (referred to as transactive energy). Based on the analogy with Facebook, Uber, and Airbnb, Zarko predicts that by 2020 the largest energy company in the world (by market cap) will not own any network (grid) or generation assets. Instead, it will manage information about energy sources and consumers. He does not see any technical barriers to this vision; the only thing standing in the way at the moment is current regulation.

The utility digital distribution platform creates new value by enabling an open energy market which brings together those who have energy with those who want it. It requires a network operator who manages and ensures the reliability of the grid (similar to the role of Network Rail in the UK) and a sharing energy economy platform operator (like Facebook, Uber and Airbnb) who brings together energy providers and buyers including prosumers, and calculates transaction and delivery costs.

Chandu put it this way: "energy is getting digitized". His perspective is that renewable energy, whether wind or solar, introduces a major element of variability into power networks, seven times greater than in current networks where most of the variability is due to demand. Weather is the source of most of this variability. Predicting when the wind will blow and with what force, and predicting cloud cover, are critical for managing future power networks, in addition to predicting and managing demand with techniques such as demand response. IBM has two major projects based on predicting these external factors: Deep Thunder for weather prediction and Opus for power, whose first implementation is in Vermont. In Chandu's view, time-of-use pricing, demand response, and similar programs are practical examples of capabilities that utilities have already implemented and that are required to support the transactive energy underpinning all three speakers' visions. Transactive energy fundamentally means that market forces control the flow of energy. It requires price signals from the energy supplier and demand signals from the buyer.
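As a toy illustration of how price and demand signals might clear in one transactive market interval (this is not any specific protocol; the offers and bids below are invented), matching supply offers against demand bids:

```python
# Suppliers post (price, quantity) offers; buyers post (price, quantity) bids.
offers = [(0.08, 10.0), (0.12, 15.0), (0.20, 20.0)]   # ($/kWh, kW) supply
bids   = [(0.25, 12.0), (0.15, 10.0), (0.10, 8.0)]    # ($/kWh, kW) demand

offers.sort()            # cheapest supply first
bids.sort(reverse=True)  # highest willingness-to-pay first

# Trade from cheapest offer to highest bid while the bid price
# still covers the offer price.
offers_q = [q for _, q in offers]
bids_q = [q for _, q in bids]
traded = 0.0
oi = bi = 0
while oi < len(offers) and bi < len(bids) and offers[oi][0] <= bids[bi][0]:
    qty = min(offers_q[oi], bids_q[bi])
    traded += qty
    offers_q[oi] -= qty
    bids_q[bi] -= qty
    if offers_q[oi] == 0:
        oi += 1
    if bids_q[bi] == 0:
        bi += 1
```

Real transactive designs layer delivery costs, grid constraints, and settlement on top of this, but the core is the same double-sided matching of price and demand signals.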

Sharon shares a similar long-term vision of the energy sector and also sees transactive energy as the future toward which the sector is heading. It is the consumer that is driving the industry. As Jon Wellinghoff, ex-chairman of FERC, put it:

"Advances in technology and the desire we are seeing at the consumer level to have control and the ability to know that they can ensure the reliability of their system within their home, business, microgrid or their community. People are going to continue to drive towards having these kinds of technologies available to them. And once that happens through the technologies and the entrepreneurial spirit we are seeing with these companies coming in, I just don't see how we can continue with the same model we have had for the last 100 or 150 years."

She gave a number of examples of how interoperability standards from IEEE, IEC, and SGIP; programs from government agencies such as the Department of Energy, ARPA-E and Pacific NW National Lab (PNNL); and industry organizations such as EPRI are moving toward a common vision of the utility of the future. Sharon put it very strongly in front of a room filled with utility employees and vendors and consultants, "get ready for transactive energy, this is real, this is not hype."

February 10, 2016

If you asked me which of all the new technologies on display here and at other events will likely most dramatically improve the bottom line of utilities in the transmission business, I would say unmanned aerial vehicles (UAVs), or drones. For example, every utility with transmission lines spends large sums overflying them, typically with very expensive helicopter flights, for vegetation management: monitoring trees to prevent them from interfering with transmission lines. If this could instead be done with much less expensive UAV flights, the savings would be huge. But right now this is not possible, because the Federal Aviation Administration (FAA) only allows commercial operation of UAVs under visual line of sight (VLOS) rules.

The potential ROI of using UAVs for vegetation monitoring is expected to be sizable, but realizing it requires the FAA to change regulations to permit operation of UAVs under beyond visual line of sight (BVLOS) rules. Today at DistribuTECH 2016, Eileen Lockhart of Xcel Energy, with partners Environmental Consultants Inc. and Flot Systems, gave a presentation that showed just how close we are to UAVs operating under BVLOS rules becoming a commercial reality for electric power utilities.

Xcel Energy, which has electric and gas assets in eight states, is the fourth largest utility in the U.S. They have partnered with EEI, EPRI, INL and others in the utility sector to show the way in the application of UAVs in the utility sector. This includes working with the FAA to push forward the practical application of UAVs in the utility sector. Their objectives are improved safety for utility employees and the public, reducing risk to and improving reliability of the grid, and reducing the cost of operating and maintaining grid infrastructure. The technologies involved include GIS and geospatial analytics in addition to the UAVs themselves. Expected sources of major savings are reducing the need for expensive helicopters and plane flights and the time utility employees have to spend in the field.

Just last week (Feb 3, 2016), Xcel and its partners completed the first beyond visual line of sight mission, with UAV flights over 20 miles of transmission lines. They are convinced that BVLOS flights will become a commercial reality in the near future. The UAV flights will not be flown by Xcel itself but by contractors such as Flot Systems operating with FAA licenses. Xcel has not yet calculated ROIs for the applications using UAVs, but plans to do so soon.

At DistribuTECH, Nancy Bui-Thompson, President of the Sacramento Municipal Utility District (SMUD), gave an insightful presentation about her utility from the perspective of an elected official. I think Nancy is one of the first elected officials to present at DistribuTECH. SMUD is, and has been for some time, one of the most forward-looking utilities in North America. It was the first California utility to reach 20% renewable power and the first to commit to 33% renewables. It is also one of the most energy-efficient utilities in California.

SMUD is governed by an elected board of directors, who are responsible for policy and strategy, and perhaps this is why it is very customer focused and has consistently maintained high customer satisfaction ratings. For example, during the rollout of its smart meter program it maintained an impressive 95% customer satisfaction rating. The major customer drivers are green, renewable, and reliable power, and the need for choice. Nancy emphasized that SMUD's unique governance model, which reserves policy for the board of directors, gives SMUD's management and employees the freedom to implement without interference.

SMUD is moving from a centralized utility with a business model based on selling electricity to a distributed utility providing localized grid services. This means SMUD is getting out of the business of selling electricity, and into the business of selling grid services.

At the policy level the focus areas are customer analytics, changing the rate structure and grid analytics.

Customer analytics includes collecting operational data on customer behavior to help identify and provide new services. For example, these statistics help identify early adopters: customers who are interested in renewable energy or energy efficiency in the home and who help drive programs focused on these areas. Customer data and analytics also help with segmentation, defining the different market segments that require different types and levels of grid services.

Grid analytics is helping SMUD better manage outages. Collecting operational data and analyzing it to predict outages has reduced the number of outages by 20% and the average duration of outages by 28%.

One of the most interesting innovations at SMUD is in the area of rate structure. SMUD and other utilities are in the unusual position for a retailer of trying to sell less of their product, in this case electricity. People are using less electricity as a result of personal interest as well as SMUD's own energy efficiency programs. But the money has to come from somewhere, so SMUD has introduced a flat infrastructure fee. Currently every customer pays $18/month for the grid, independent of how much power they consume. SMUD has determined that $28 is the breakeven point, at which the cost of maintaining the grid would be covered by infrastructure fees, and it is moving toward that monthly charge.

Another interesting innovation at SMUD (which I have seen elsewhere) is its solar power program. Customers can buy into solar power without the bother of having to install solar panels on their roofs. The first solar program was a 1 MW solar PV program designed for residential customers. The next will be an 11 MW program for commercial customers.

February 09, 2016

I am at DistribuTECH this week in Orlando. This is North America's largest electric power utility event. I don't know the number of attendees this year, but last year it was close to 10,000.

The first talk I heard was a riveting one that hit on a key theme of this year's conference: big data and analytics. It was given by Lee Krevat of Sempra US Gas and Power and Tim Fairchild of SAS and was entitled "What can a regulated electric utility learn from Moneyball?" Moneyball is a book by Michael Lewis about baseball and how the Oakland A's applied big data and analytics to become one of the top teams without having the deep pockets of the richest teams.

Moneyball for transformers

As an example of the relevance of Moneyball, one utility took every bit of data that could potentially be relevant to the lifecycle of a transformer, some 80 variables in all, and ran correlations with transformer failure data for 1,500 transformers. It found that just 10 of the variables predicted 91% of transformer failures. This compares with the utility's traditional way of forecasting failure, which predicted only 15% of the failures. If you asked an experienced power engineer what the most important factor in determining the lifetime of a transformer is, historical load, and especially overload, would be at the top of the list. What the utility actually found was that load was only the eighth most important factor. The most important factor was the number of meters attached to the transformer, which probably wouldn't have been on anyone's list.
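The talk did not describe the utility's method in detail; the following is a minimal sketch of ranking candidate variables by their correlation with failures, run on synthetic data deliberately constructed so that meter count dominates:

```python
import random
from statistics import mean

random.seed(7)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Synthetic records for 1,500 transformers: three of the ~80 candidate
# variables, with failures driven by meter count plus noise.
n = 1500
meters = [random.randint(1, 40) for _ in range(n)]
load   = [random.uniform(0.2, 1.2) for _ in range(n)]
age    = [random.uniform(1, 50) for _ in range(n)]
failed = [1 if m / 40 + 0.3 * random.random() > 0.8 else 0 for m in meters]

# Rank variables by |correlation| with the failure flag.
ranking = sorted(
    {"meters": meters, "load": load, "age": age}.items(),
    key=lambda kv: abs(pearson(kv[1], failed)),
    reverse=True,
)
top_variable = ranking[0][0]
```

A real analysis would use a proper classifier and out-of-sample validation rather than raw correlations, but the principle of letting the data rank the variables, rather than engineering intuition, is the Moneyball point of the talk.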

Lee and Tim presented other analogies where analytics play a key role in both baseball and electric power.

The good face

Baseball scouts often profile players by the look of their face and body, a "baseball look", rather than by their actual statistics. Similarly, utilities often profile circuits to prioritize capacity investment decisions, frequently based on gut feeling. Now they are able to leverage data to make better investments.

Age before beauty

Scouts show a preference for younger players, even though older players with better statistics are often available without long-term contracts and at significantly lower salaries. By analogy, large utility transformers have traditionally been replaced based on age. But now statistics on many transformers are being monitored, and transformers are only replaced if their statistics are in decline and predict failure.

Lee and Tim suggested other practical applications of this approach which include using machine learning to predict failures of wind turbines and to determine why solar farms often generate less electricity than expected.