Cost of renewable energy’s variability is dwarfed by the savings

Wear and tear on equipment costs millions, but fuel savings are worth billions.

The variability of renewable energy sources like solar and wind has raised concerns about how well the US electrical grid could tolerate high levels of them. Early estimates suggested that the grid couldn't handle getting more than 20 percent of its electricity from intermittent sources without a major overhaul. But thanks to improved practices and a bit of experience, several states are already pushing past that 20 percent limit well in advance of having a smart grid in place.

Compensating for intermittent power sources primarily involves cycling traditional fossil fuel plants up and down to match supply with demand. That cycling has costs: wear and tear on equipment, and fuel burned without producing electricity. So the National Renewable Energy Laboratory (NREL) produced a series of studies to examine these costs and compare them to the savings from fuel that doesn't get burned. The answer: the cost is a tiny fraction of the ultimate savings.

Solar and wind power have very distinct profiles. Solar varies the most over the course of a day, but the general outline of solar production is very predictable even if the total power delivered varies a bit with cloud cover. Wind tends to be steadier, but the total amount being produced can change at any time of day.

To compensate for this variability, electricity suppliers essentially have to turn other sources on and off. Since wind and solar have minimal operating costs (they burn no fuel), attention turns to coal and natural gas. Depending on a plant's design, switching it on or off entails a variety of costs: fuel gets burned without producing electricity as the plant cycles up, and each change of state increases wear and tear on the equipment. Some of this cycling went on before renewables entered the mix, but solar and wind are clearly increasing its frequency.

So, what are the costs? To find out, NREL commissioned APTECH, a company that plant operators had previously hired to estimate these costs. With those figures in hand, the NREL team analyzed the grid in the Western US under a number of scenarios in which intermittent renewables accounted for 33 percent of total power. These included an even split between wind and solar, as well as lopsided splits of 25 percent wind with eight percent solar, and the reverse.

As expected, costs did go up. Cycling the fossil fuel plants added between $0.47 and $1.28 to each megawatt-hour generated. Over the course of a year on the Western US grid, that adds up to between $35 million and $157 million, an increase of between 13 and 24 percent.

That's the bad news. The rest is pretty much good. The fuel savings from not running the fossil fuel plants add up to $7 billion, meaning the added costs are, at most, about two percent of the savings. The extra fuel burned when spinning up the fossil fuel plants also makes a minimal contribution to pollution, whether in the form of CO2 or of nitrogen and sulfur compounds.
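A quick back-of-envelope calculation, using only the dollar figures quoted above, shows how lopsided the comparison is (this is purely illustrative; the report's own scenario-by-scenario accounting is more detailed):

```python
# Illustrative check of the cost-vs-savings comparison, using the
# figures quoted in this article (not the report's full accounting).
FUEL_SAVINGS = 7e9          # annual fuel savings, dollars
CYCLING_COST_LOW = 35e6     # low end of added annual cycling cost, dollars
CYCLING_COST_HIGH = 157e6   # high end of added annual cycling cost, dollars

low_share = CYCLING_COST_LOW / FUEL_SAVINGS
high_share = CYCLING_COST_HIGH / FUEL_SAVINGS

print(f"Cycling costs are {low_share:.1%} to {high_share:.1%} of fuel savings")
```

Even at the high end, the added cycling cost comes out to roughly two percent of the fuel savings; at the low end it is well under one percent.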

Perhaps the most significant news, however, is that the worst problems come earlier in the transition to renewables. "In terms of cycling costs," the report notes, "there may be a big step in going from 0 percent to 13 percent wind/solar but a much smaller step in going from 13 percent to 33 percent." In other words, once the percentage of renewables passes a critical point, further adjustments become incremental.

This doesn't yet mean that all renewable power is cost effective compared to fossil fuels; wind is very close, but solar is a bit further off. With current trends, however, we're only a few years away from that point. And this report indicates that once we get there, there won't be any significant additional costs to adding renewables to the grid.