
U.S. Battery Storage Nears Tipping Point, Drives Energy Transition

Merriam-Webster defines tipping point as “the critical point…beyond which a significant and often unstoppable effect or change takes place.”

While by definition it is impossible to identify a tipping point as it is happening, developments over the past several weeks certainly seem to be pushing the electric power storage industry to or past that landmark. In a couple of years, the last couple of weeks may stand out as the point when storage morphed from being an interesting add-on to an essential piece of the grid, providing backup power, smoothing variable generation and participating in various other ancillary markets.

This transition is projected in the latest forecast from the Energy Storage Association and GTM Research. In their Q1 review (executive summary is available here), they project that annual storage installations in the United States will jump from just 215 megawatts in 2017 to more than 3.3 gigawatts by 2023. Just as telling, AES, which has partnered with Siemens in a storage joint venture dubbed Fluence, said in a presentation (available here) accompanying its annual results that it expected the global market to hit 28 GW of installed capacity by 2022—a tenfold increase in five years.
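As a back-of-the-envelope check on what the ESA/GTM forecast implies, the jump from 215 MW to 3.3 GW of annual installations works out to a compound annual growth rate of nearly 60 percent (a rough sketch using only the two figures quoted above):

```python
# Growth rate implied by the ESA/GTM forecast: annual U.S. storage
# installations rising from 215 MW (2017) to 3,300 MW (2023).
start_mw = 215.0      # annual installations in 2017
end_mw = 3300.0       # projected annual installations in 2023
years = 2023 - 2017   # six years of growth

cagr = (end_mw / start_mw) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.0%}")  # ~58%
```

The AES projection—a tenfold increase over five years—implies a similar annual growth rate, so the two forecasts are broadly consistent.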

Both of those forecasts were prepared well before the March 2 decision by the U.S. Internal Revenue Service allowing a homeowner to claim the existing 30 percent investment tax credit (ITC) for a battery storage system added to an existing rooftop photovoltaic (PV) system, provided that the battery stores only energy generated by the onsite PV panels. The ruling was issued with a disclaimer that it applied only to the homeowner seeking the advice and could not be cited as precedent.

Disclaimer notwithstanding, it is hard to believe that the IRS won’t use the ruling as precedent internally. In other words, when the next homeowner asks the same question—Does this storage retrofit qualify for the ITC?—the IRS examiner will see that the agency’s answer in a prior case was yes, making a positive answer in the second case (and the third, and so on) that much more likely.

And the retrofit sector could be a huge growth market for the storage industry in the next couple of years. There are an estimated 1.6 million existing solar installations in the U.S., primarily serving the residential or commercial and industrial sectors. Very few of those systems have an accompanying battery storage system.

Taking advantage of this IRS ruling to install storage could be a particularly enticing proposition for customers paying demand charges based on their peak power consumption. Batteries also enable customers to maximize their use of onsite generated power, which is desirable in areas with peak time pricing. Instead of buying expensive power from the utility, customers can tap their solar-charged batteries for electricity during these peak periods.

However, that is in the future. More telling are the activities occurring now, every day, across the country—indicating that if the tipping point isn’t here, it is coming, and likely soon.

In a webinar presentation March 7 (found here), Sean Hamilton, general manager of the Sterling Municipal Light Department, talked about the success of its year-long deployment of a 2 MW/3.9 megawatt-hour battery storage system from NEC.

Since the system went online in December 2016, Hamilton said, it has saved the utility more than $419,000, primarily by cutting the small utility’s transmission-related charges from ISO-New England, the regional transmission system operator. Given a total installed cost of slightly more than $2.5 million, that translates into a simple payback period of less than seven years, without factoring in any resiliency benefits (more on that below).
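Those figures make the payback arithmetic easy to check (a rough sketch using only the cost and savings numbers quoted above, treating the reported savings as one year's worth):

```python
# Simple payback for the Sterling system, per the figures quoted above.
installed_cost = 2_500_000   # USD, "slightly more than" per the article
annual_savings = 419_000     # USD saved over the first year of operation

payback_years = installed_cost / annual_savings
print(f"Simple payback: {payback_years:.1f} years")  # ~6.0 years
```

That is comfortably under the seven-year figure Hamilton cited, and it excludes any value assigned to resiliency.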

ISO-NE’s transmission charges are split into two categories. First is a capacity charge that each utility must pay to use the regional network. This levy is based on a utility’s peak demand during the region’s annual peak hour, with pricing based on the three-year forward capacity market. The second fee, dubbed regional network services (RNS), is a monthly charge based on a utility’s demand during the peak transmission hour in its state.

The graphic below illustrates the storage system’s success in helping Sterling, which serves fewer than 4,000 customers, shave its load during the region’s peak demand hour in 2017. That roughly 1.5 MW reduction for the hour saved the utility $244,460 for the year, Hamilton said. Similarly, the storage system allowed the utility to trim a total of $162,107 from its monthly RNS charges.
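Those two line items account for nearly all of the reported total, which is easy to verify from the figures above (a rough sketch; all numbers are the ones quoted in this article):

```python
# The two transmission-related savings streams versus the reported total.
capacity_savings = 244_460   # USD saved on the regional capacity charge
rns_savings = 162_107        # USD saved on regional network services charges
total_reported = 419_000     # USD, total savings reported since December 2016

transmission_savings = capacity_savings + rns_savings
print(f"Transmission-related savings: ${transmission_savings:,}")      # $406,567
print(f"Share of reported total: {transmission_savings / total_reported:.0%}")  # ~97%
```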

In addition to the monetary savings, Hamilton pointed to the resiliency benefits offered by the project since it is located near the town’s police and fire stations and would provide backup power for those critical loads, pegged at 10 kilowatts, during a power outage.

An analysis done by Sandia National Laboratories, which is participating in the project, showed that a backup system with 4 MWh of capacity could provide backup services for these facilities for as long as 16 days, providing a benefit of at least $166,000.
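The 16-day figure is consistent with simple arithmetic on the numbers quoted above: a 4 MWh reservoir serving a 10 kW critical load lasts roughly 400 hours (a rough sketch that ignores inverter losses and depth-of-discharge limits, which the Sandia analysis would account for):

```python
# Sanity check on the backup-duration estimate: a 4 MWh battery
# serving the 10 kW critical load at the police and fire stations.
battery_mwh = 4.0        # usable energy capacity
critical_load_kw = 10.0  # backup load for the critical facilities

hours = battery_mwh * 1000 / critical_load_kw   # 400 hours
days = hours / 24
print(f"Backup duration: {days:.1f} days")      # ~16.7 days
```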

Perhaps the biggest endorsement of the project’s success is the fact that Sterling is rolling out a second project, this one a 1 MW/2 MWh battery system linked with a 1 MW community solar PV system.

On the other side of the country, batteries are being used for an entirely different purpose—variable generation smoothing—and on a much bigger scale in the daily operations of the California Independent System Operator. For starters, if you have never looked at CAISO’s web site (www.caiso.com), you should; it is a fount of information.

The chart below was created using data from the CAISO site and covers operations for a 60-minute period on the morning of March 6, 2018. The data is available here.

The takeaway from the graphic is clear. Batteries are being used to help smooth short-term fluctuations in output from variable renewables, in this case solar. The grid was humming along, with solar PV generating around 9,300 MW of electricity (this was just a day after the California grid topped 10,000 MW of solar generation for the first time, but that is a topic for a different post) and the system’s integrated batteries essentially sitting idle.

Then, as the solar began to tail off slightly, ISO operators tapped into the batteries and injected that power into the grid for the next 20 minutes. Once the PV production began to ramp up again, operators dialed back on the batteries’ output.

The scale between the two resources is off since solar has such a commanding head start in installed capacity, but that almost certainly will change as additional battery systems are brought online. What is crucial is the usage itself, with the pattern repeating throughout the day in the CAISO data: Batteries are being used constantly to smooth the output of the system’s variable generation.

Thank Dennis for the Post!


Discussions

“Given a total installed cost of slightly more than $2.5 million that translates into a simple payback period of less than seven years”

A 7-year payback for an industrial installation of an asset with a 10-year life span is a non-starter. This will not be adopted en masse. The payback period needs to be less than 5 years for limited adoption and less than 2 years for rapid adoption.

You are just defeating your own point. The Sterling system pays for itself in 7 years, which is what was stated. With subsidies it pays for itself in 2 years. You were implying that without subsidies it wouldn’t have worked.

Capacity charges are -real- costs. You can’t make the fictitious claim that if there were no capacity or line charges, batteries wouldn’t pay for themselves.

It depends on how tight the margins are in the industry. If they have extremely tight margins, it may mean the difference between breaking even and making a profit. If they have large margins, especially if they are small, it may not be worth the trouble.

We may be nearing a tipping point, but utilities are typically ultra-conservative. I imagine that even if the numbers point in their favour, they will still hesitate until they get longevity and reliability data back from the currently installed projects; only then will we see mass adoption. It is actually written into some of the utility bylaws.

The early adopters, namely the small New England utilities and California, are where the industry will get the reliability and cost-effectiveness data it needs to move forward.

We are getting closer and the technology is improving, but I would put the tipping point closer to 2025-30. Even if, by some miracle, we got a $10/kW battery, it would still take 10 years to gather the reliability data.
