Research Highlights

Moving Megawatts Securely and Economically

PNNL and Bonneville Power Administration have developed an open-source tool that performs validation, calibration, and verification of power plant models.

Thirteen years ago, PNNL engineer Henry Huang began pursuing an idea to make power grid planning tools more efficient and save money in the process. Now, the results of that concept are in the hands of utilities on the West Coast, helping them determine where investments will lead to a more resilient, reliable, and economical electric grid for their customers.

This capability will not only help large power plant owners meet federal testing requirements for power generation planning, but also aid major Independent System Operators (ISOs) in making investment decisions to plan and operate today's power grid.

“It’s a great example of moving a new idea from concept to commercial adoption,” said Ruisheng Diao, a staff research engineer and program manager at PNNL and Huang’s colleague, who took the final software over the finish line with the project team. That team included Bonneville Power Administration (BPA), which also funded the work, along with GE Energy Consulting, GE Grid Solutions, and Peak Reliability.

“It was a long journey, but that’s the nature of the beast.”

From Concept to Commercialization


Technology Readiness Levels

Technology Readiness Levels are a measurement system used to assess the maturity of a particular technology. These levels help management make decisions concerning product development and readiness to transition to industry.

1. Basic principles observed
2. Technology concept formulated
3. Experimental proof of concept
4. Component validation in lab setting
5. Component validation in working environment
6. Component prototype demonstration
7. System prototype demonstration

The road to commercial acceptance of any new technology is lengthy and rigorous. Diao says this is especially true in the power industry, which tends to stick with tried-and-true technology.

“In the development phase, software prototypes coming out of new concepts are usually too fancy and risky for commercial power applications. Nobody wants to spend the time and money on lengthy training for engineers and operators unless they have to,” he explains.

Complex problems sorted out in the research lab need to be simplified and adapted to work within existing commercial tools that the power industry has used for decades. That can take many iterations and lead to a totally different, but much more useful, end product.

In this case, the path from geeky lab prototype script to a sleek new commercial software tool for GE evolved in two key phases.

A streamlined model validation process. To run one validation study, it can easily take weeks for a plant engineer to collect all the necessary information, such as operational snapshots, event information, sensor data, and monitored channels. After that, plant generators must come offline, at a cost of up to $35,000 each time, to test a scenario through a series of calibration and validation steps. The process is both labor- and time-intensive, and industry desperately needed an automated tool that was fast and easy to use.

The concept for the power plant model validation (PPMV) tool originated in the early 2000s with BPA, GE, and PNNL. It incorporated real-time frequency and voltage signals from phasor measurement units, or PMUs, placed throughout the electrical grid. With these “play-in” PMU signals, the prototype and its algorithms ran great in their development environment, Visual Basic.

“This software environment is popular for research and users who know exactly where to grab information, but it’s anathema to industry,” said Diao.

Integrating the automatic “play-in” function into the commercial PSLF software streamlines the process of validating hundreds of power plants whenever a system event occurs, cutting the entire process from thousands of labor hours down to just a few minutes.
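The "play-in" idea can be sketched in a few lines: recorded PMU signals from a real grid event are fed into a plant model as boundary conditions, and the simulated output is compared against what the plant actually did. The snippet below is a minimal illustration, not the PPMV tool itself; the first-order governor model, signal shapes, and all names are invented for the example.

```python
import numpy as np

# Hypothetical PMU recordings at the plant's point of interconnection
# during a grid event (illustrative data, not from the article).
t = np.linspace(0.0, 10.0, 1001)                                        # seconds
freq_meas = 60.0 - 0.05 * np.exp(-t) * np.sin(2 * np.pi * 0.5 * t)      # Hz
p_meas = 100.0 + 8.0 * np.exp(-0.5 * t) * np.sin(2 * np.pi * 0.5 * t)   # MW

def play_in_response(freq, t, gain, time_const, p0=100.0):
    """Drive a toy first-order governor model with the measured
    frequency signal and return simulated active power (MW)."""
    dt = t[1] - t[0]
    p_sim = np.empty_like(t)
    p = p0
    for i, f in enumerate(freq):
        # Power chases a target set by the frequency deviation from 60 Hz
        target = p0 + gain * (60.0 - f)
        p += dt / time_const * (target - p)
        p_sim[i] = p
    return p_sim

p_sim = play_in_response(freq_meas, t, gain=200.0, time_const=0.8)

# Validation metric: RMS mismatch between played-in model and measurement
rmse = np.sqrt(np.mean((p_sim - p_meas) ** 2))
print(f"RMS model error: {rmse:.2f} MW")
```

A large mismatch flags a plant model whose parameters need calibration; because the event data is replayed, no generator has to be taken offline for the check.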

An enhanced calibration algorithm. For secure and economical operation of the power grid, the integrity of the power plant planning model is essential. When models show deficiencies compared to field measurements, any decision based on the model will be less than optimal.

“To plan around the model, it has to be right or it’s a waste of money and time,” said Diao.

The early version of PNNL’s model calibration algorithm was written in MATLAB research scripts, and performance was slow: calibrating the model parameters of a single generator unit took more than 24 hours. It also struggled with nonlinear dynamics.

Through a 3-year (2011-2014) project funded by DOE’s Advanced Scientific Computing Research program, Huang and Diao’s team developed a promising new method to calibrate flawed power plant parameters more efficiently and accurately. The algorithms showed great potential, but the models being used were over-simplified and could not represent the realistic behavior of a power plant.
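At its core, calibration means searching for the model parameters that best reproduce field measurements. The article does not describe the team's specific algorithm, so the sketch below uses a generic least-squares fit on a toy step-response model; the parameter names, the model, and the synthetic "measurements" are all invented for illustration.

```python
import numpy as np

def model_response(t, gain, time_const):
    """Toy generator step response: the kind of parametric model
    whose parameters calibration must recover."""
    return gain * (1.0 - np.exp(-t / time_const))

# Synthetic "field measurements" generated from known true parameters
# plus sensor noise, so we can check whether calibration recovers them.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 200)
true_gain, true_tc = 12.0, 0.9
measured = model_response(t, true_gain, true_tc) + rng.normal(0.0, 0.05, t.size)

# Calibration as least squares: search the parameter space for the
# values that minimize the mismatch between model and measurements.
gains = np.linspace(5.0, 20.0, 151)
tcs = np.linspace(0.2, 2.0, 181)
best = (None, None, np.inf)
for g in gains:
    for tc in tcs:
        err = np.sum((model_response(t, g, tc) - measured) ** 2)
        if err < best[2]:
            best = (g, tc, err)

g_hat, tc_hat, _ = best
print(f"calibrated gain={g_hat:.2f} (true {true_gain}), "
      f"time constant={tc_hat:.2f} (true {true_tc})")
```

Brute-force search keeps the example transparent; a production tool would use a far faster estimator over many parameters at once, which is where the team's later speedup from 24 hours to under a minute came from.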

An Integrated Solution for the Power Industry

During the next phase of development, between 2015 and 2017, Diao’s team and their BPA counterparts collaborated to successfully enhance and adapt the validation and calibration algorithms for commercial adoption. Their research reduced the time for model calibration from more than 24 hours to less than 1 minute. That work also earned the team a best conference paper award from the 2017 Institute of Electrical and Electronics Engineers (IEEE) Power and Energy Society General Meeting.

The different planning and operation pieces are now connected and integrated into GE’s commercial products, eterra-phasoranalytics and PSLF. This not only makes GE’s software more efficient but also reduces the engineers’ training burden for one-off tools.

Over the next few years, testing data from both regional utilities and ISOs will provide valuable performance feedback on the new software tool suite. That data will inform improvements to the next version of the software.