The EIA tries to account for the cost of energy and the cost of replacing energy.

It’s not always a simple task to compare the value of electricity generation resources. Coal, natural gas, solar, wind, and so on have different strengths and weaknesses, so when it comes time to build or replace energy capacity, economists look at the Levelized Cost of Energy (LCOE), which divides the total cost of an installation or plant by the kilowatt-hours it produces over its lifetime.

While private financial firms like Lazard calculate their own LCOE figures, the Energy Information Administration (EIA) also puts together an annual report projecting the LCOE for various generation resources. The report, released this month, looks at the cost of generation resources if they were to come online in 2019, 2022, and 2040.

The latest numbers seem to confirm trends that have already been borne out in energy markets—overall, some renewables are getting more attractive, others are struggling, and coal has definitely been unseated as king.

The EIA calculated the LCOE for generation resources in 22 regions of the US and then took the simple average of those regions for a national number. Once federal tax credits and energy regulations are factored in, a rough estimate of cost per unit of energy produced is possible. In general, the lower the LCOE the better, because it usually means you’re getting more kWh for your dollar. (There are exceptions, though. If the LCOE analysis is applied to, say, a solar installation on a building, investing in efficiency upgrades before installing the array is undoubtedly the right move, even though it will marginally increase your cost per kWh while reducing the number of kWh you need. More on the limitations of LCOE below.)
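As a rough illustration of the idea (not the EIA's actual methodology, which models financing, fuel, and regional factors in far more detail), a levelized cost divides discounted lifetime costs by discounted lifetime generation. Every number in this sketch is invented:

```python
# Illustrative LCOE calculation. Not the EIA's model -- just the basic
# "discounted lifetime cost / discounted lifetime generation" idea.
# All inputs below are made-up example numbers.

def simple_lcoe(capital_cost, annual_fixed_om, annual_variable_cost,
                annual_mwh, lifetime_years, discount_rate=0.06):
    """Levelized cost of energy in $/MWh."""
    costs = capital_cost          # paid up front, not discounted
    energy = 0.0
    for year in range(1, lifetime_years + 1):
        factor = (1 + discount_rate) ** year
        costs += (annual_fixed_om + annual_variable_cost) / factor
        energy += annual_mwh / factor
    return costs / energy

# Example: a hypothetical 100 MW wind farm at a 35% capacity factor.
mwh_per_year = 100 * 8760 * 0.35
print(round(simple_lcoe(150_000_000, 4_000_000, 0, mwh_per_year, 25), 2))
```

A lower discount rate or a longer lifetime spreads the up-front capital over more discounted kWh, which is why capital-heavy resources like wind and nuclear are so sensitive to financing assumptions.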

Cost of energy 101

The EIA’s 2019 analysis is limited, because plants that will come online by then are already under construction. Coal plants, offshore wind, geothermal energy, and nuclear power are not represented in the 2019 data. Because builders can still take advantage of federal tax credits, onshore wind and solar photovoltaic (PV) resources are dirt cheap, at $39.30/MWh and $58.80/MWh respectively.

The more interesting figures are found for 2022 and beyond. Five years in the future, the EIA thinks the most expensive energy resources will include:

Solar thermal plants at $184.40 per MWh

Offshore wind at $145.90 per MWh

Coal plants with 30 percent carbon removal capability at $140 per MWh

Coal plants with 90 percent carbon removal at $123.20 per MWh

The EIA only found LCOE numbers for coal plants with some degree of carbon capture because of recent updates to the Clean Air Act, which require new plants to meet specific CO2 emission standards. Coal plants that can't meet those standards can't currently be built.

On the other hand, electricity generation resources that will be on the cheaper end when they go into service in 2022 include:

Geothermal: $43.30 per MWh

Onshore wind: $52.20 per MWh

Advanced combined-cycle natural gas-burning plants: $56.50 per MWh

Solar PV: $66.80 per MWh

Geothermal was the cheapest of all resources in the EIA's analysis after factoring in tax credits that are expected to remain available through 2022.

Advanced nuclear plants being deployed in 2022 hit squarely between the low end and the high end, at $99.10 per MWh.

Fast forward to 2040, and the EIA thinks that prices aren't going to change much from the 2022 projections. There are a few exceptions, of course. Some of the energy that gets cheaper includes:

Coal with any amount of carbon capture drops by $10 to $20 per MWh due to falling capital costs (meaning perhaps it will get easier and cheaper to build carbon removal technology).

Nuclear projects' cost decreases by about $10 per MWh thanks to a considerable decrease in capital costs, even though the EIA projects that variable costs for nuclear energy, including fuel, will go up.

Offshore wind also decreases by $20 per MWh due to falling capital costs—a trend we’re already seeing today, as installing offshore wind facilities gets cheaper and cheaper thanks to expertise and investment developed in Europe, where offshore wind is more prevalent.

On the other hand, the cost of geothermal energy increases by about $10/MWh due to an increase in fixed maintenance costs.

But this is hardly the whole story

Of course, comparing electricity-generation resources is not quite that simple if you want the full story on which resources are best deployed in any given region of the US.

The EIA says that looking at the LCOE alone is an imperfect way to study energy-generating plants because it doesn’t directly compare the cost of each kilowatt-hour from a new plant with the cost of kilowatt-hours being displaced.

This comparison, found in what's called the Levelized Avoided Cost of Energy (LACE), can help economists and analysts make decisions when, say, a resource like solar can only operate during the day or a resource like wind fluctuates with the seasons. A solar installation's usefulness is subject to the availability of sunlight in a given region, and it might not be able to compete with a more reliable natural gas plant. On the other hand, solar electricity can be less expensive to produce during the middle of a hot August day than electricity from a fossil fuel-burning peaker plant that requires a startup period and cooling. If a resource displaces power that’s more expensive to run at the times that resource would run, that will be reflected in a LACE analysis.

The EIA said it looked at 22 different regions in the US and drew up LACE values for each type of energy by comparing that generation project with a new generation project that would be most likely to be built given the existing energy mix in that region.

Once LACE values are drawn up, the EIA says it’s instructive to compare the LACE and the LCOE of a project. If the avoided cost is greater than the levelized cost (LACE - LCOE > 0), then that resource is attractive for an energy company to build.
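The screening rule is simple enough to sketch directly. The net differences below are the article's own 2022 national averages in $/MWh; the code just applies the LACE - LCOE > 0 test to them:

```python
# Applying the LACE - LCOE screen to the article's 2022 national-average
# net differences ($/MWh). A positive net difference suggests the
# resource is attractive to build, on average.
projects = {
    "geothermal": 21.9,
    "advanced combined cycle gas": 1.7,
    "onshore wind": 1.0,
    "solar PV": -2.0,
    "offshore wind": -88.1,
}

for name, net in sorted(projects.items(), key=lambda kv: kv[1], reverse=True):
    verdict = "attractive" if net > 0 else "unattractive"
    print(f"{name}: LACE - LCOE = {net:+.1f} -> {verdict}")
```

Keep in mind these are national averages; as the article notes below, the regional ranges can straddle zero, so a resource that looks marginal nationally may be clearly attractive in particular regions.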

The EIA writes that the “net difference between LACE and LCOE provides a reasonable point of comparison of first-order economic competitiveness among a wider variety of technologies than is possible using either LCOE or LACE tables individually.”

Here, LACE-LCOE values provide interesting insight into what might get built in 2022 and 2040 (2019 was left out of this calculation).

This graph from a 2013 EIA presentation shows the relationship between LCOE, LACE, and installed capacity. As LACE gains on LCOE, investors find it more attractive to install that kind of capacity. (The actual numbers in the graph are out of date, though; currently the EIA projects that by 2022 the LACE for onshore wind will exceed its LCOE on average.)

We'll start with the projected losers, for which the average LACE does not exceed the LCOE in 2022 (all net differences below are in dollars per MWh). These include:

Coal with 30 percent carbon removal: -81.3

Offshore wind: -88.1

Solar thermal: -114.5

The EIA projects that, in most regions, these forms of energy won't be attractive to build without some other outside factor being present.

Winners are energy resources with a LACE that's greater than their LCOE. Geothermal appears to be the most attractive energy generation to build with a LACE-LCOE difference of 21.9.

Other technologies sit right on the line between favorable and unfavorable, likely reflecting considerable regional variation in whether that resource is attractive to build.

Advanced combined cycle natural gas plants have a LACE - LCOE of 1.7, with a range of difference between -4.2 and 9.

Onshore wind has an average net difference of 1, but the regional range in differences is larger, between -17.4 and 20.9.

Solar PV, with an average net difference of -2, has an even wider range, between -42.5 and 21.4, meaning that some areas of the country will really want to build out more solar, and others really won't.

The EIA's projections for 2040 seem to narrow: coal with 30 percent carbon removal has an average net LACE-LCOE difference of -56.1, so it's unattractive to build, but less so than 18 years prior. Offshore wind is at -63.3, and solar thermal is at -112, but onshore wind’s average net difference comes up to 3.3 and solar PV’s hits 8.2. Advanced combined cycle natural gas plants are, apparently, still attractive to build that far out in the EIA’s projection, coming in at 7.8.

The future can be guessed at, not known

There are some obvious winners and losers: geothermal, natural gas, onshore wind, and solar PV in the former, and coal, nuclear, solar thermal, and offshore wind in the latter. But all of these estimates are projections, and they all factor in tax credits that are set to expire in five years or less.

It's hard to tell exactly what will happen by 2022—we know that the EIA's projections about wind and solar were too conservative even a couple of years back. Already we know that advances in offshore wind turbines and wind-pattern prediction could make that form of energy much cheaper than it is now. And if removing carbon from a coal plant's emissions becomes more economical, that could change the LACE-LCOE divide, too. For now, however, the EIA's analysis is a good way to get a broad overview of future investment in the US energy mix.

This is the very reason my Mom is going solar on her farm. She just got the solar panels two days ago and installation begins next week. Her goal is a zero-dollar average on the electric bill for the year. The local electric company does not pay for electricity sent back into the grid; they only provide credits. So any excess my Mom produces wouldn't net her a direct benefit.

If (more like when) she buys a Tesla, she will get a few more panels installed to compensate for the additional electric usage.

I look forward to watching the construction and installation process.

Now if only North Dakota hadn't just passed a law restricting new wind farms. The state government decided they were being unfair to coal and cut back on all wind-farm development, killing a wind-farm deal Mom had going for the farmland.

I would place the first fusion reactor coming online in the next 4 years (private, public, or secret); my favorite is still the polywell.

Is this just based on unicorns? 20 years ago people would have said the same thing. Even if/when you get to a place where a reactor is producing a net output of energy, that doesn't mean it is viable. Most experts agree a Q of 20 (20 units of energy out for every unit of energy in) is the minimum viable design. Even once you have a viable design, that doesn't mean you have an optimized commercial reactor (capable of running 24/7 with minimal downtime for decades). Once you can build the first commercial reactor, how long would it take to become cost-competitive with existing power sources? And once it is cost-competitive, how long would it take to become a significant share of world power?

Honestly, I will be very surprised if we have a commercial fusion reactor operating in my lifetime. I do think that on a long enough timeline fusion is an important power source (especially for power off planet), but 4 years is just nonsense.

1. If you produce excess, the power company won't pay you for the extra. They will store it with an on-grid connection, but you can't get enough cells to cover 110% of use and expect a paycheck every month.

2. If you have an on-grid connection and the grid goes out, you don't have power. This prevents surges of electricity when the power lines are being worked on. So true catastrophe backups need batteries.

1 depends on location. In CT you can get a check at end-of-year for overproduction, at least on commercial projects. That said what you get is a pittance, and well-designed systems will not significantly over-produce.

2 is correct for now, hopefully batteries will get more reasonable sometime within my lifetime

One design only captures 30% of CO2 emissions and would still be considered a high emitter relative to other new sources and thus may continue to face potential financial risk if carbon emission controls are further strengthened. Another design captures 90% of CO2 emissions and would not face the same financial risk, and therefore does not receive the 3 percentage point increase in cost of capital. As a result, the LCOE values for the coal-fired plant with 30% CCS are higher than they would be if the same cost of capital was used for all technologies

I'm a little disappointed that the EIA doesn't appear to be taking into account the environmental costs of energy generation. I recognize this study was not geared toward that but once you account for environmental damages it seems like Coal has already lost and solar/wind/geothermal are winning.

Also, why is hydro not mentioned at all? Has the US stopped building Hydro power plants?

1. If you produce excess, the power company won't pay you for the extra. They will store it with an on-grid connection, but you can't get enough cells to cover 110% of use and expect a paycheck every month.

2. If you have an on-grid connection and the grid goes out, you don't have power. This prevents surges of electricity when the power lines are being worked on. So true catastrophe backups need batteries.

Many states allow backfeeding to the grid and even make it law that you have to be paid. When I was stationed in Louisiana we received a check from the power company every month from April to September.

1. If you produce excess, the power company won't pay you for the extra. They will store it with an on-grid connection, but you can't get enough cells to cover 110% of use and expect a paycheck every month.

Well, most consumers get a net-metering-type contract with the power company. This is almost certainly in your favor -- even if you overproduce!

The alternative would be that the power company would purchase your excess electricity at wholesale rates but charge you for power you consume at retail rates. This means that for every extra kWh you produce during the day, you might sell it to the power company for $0.05, but when you buy back that kWh at night you might pay $0.15.

Unless you install a hugely oversized array, you will be better off with net metering. Also, the fact that the power company is essentially forced to purchase your excess power at retail rates is one of the reasons that many of them are looking to impose really strange tariffs to get more out of solar customers. Our local company wants to bill net-metering customers' single highest consumption hour during a month at 100x normal rates.
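A quick worked version of the wholesale-vs-retail arithmetic in the comment above. The rates are the commenter's illustrative numbers and the kWh figures are invented, not any real tariff:

```python
# Under net metering, a kWh exported by day offsets a kWh imported at
# night one-for-one. Under a wholesale-buyback scheme you sell at $0.05
# and buy back at $0.15. All numbers here are illustrative.
exported_kwh = 300          # surplus pushed to the grid during the day
imported_kwh = 300          # pulled back from the grid at night
retail_rate = 0.15          # $/kWh
wholesale_rate = 0.05       # $/kWh

net_metering_bill = (imported_kwh - exported_kwh) * retail_rate
buyback_bill = imported_kwh * retail_rate - exported_kwh * wholesale_rate

print(f"net metering: ${net_metering_bill:.2f}")   # exports offset imports exactly
print(f"wholesale buyback: ${buyback_bill:.2f}")   # $0.10/kWh worse on every exported kWh
```

With equal exports and imports, net metering zeroes the bill while the buyback scheme still charges the retail-wholesale spread on every round-tripped kWh, which is exactly why the commenter says net metering is almost certainly in your favor.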

"Over the past 15 years, a number of predictions—by the International Energy Agency, the US Energy Information Administration, and others—have been made about the future of renewable energy growth," the Meister report noted. "Almost every one of these predictions has underestimated the scale of actual growth experienced by the wind and solar markets. Only the most aggressive growth projections, such as Greenpeace's Energy [R]evolution scenarios, have been close to accurate."

In addition, the economists are not as reliable as the engineers - i.e., looking at energy and material inputs tells you more about what's possible. How much energy and material does it take to set up a gigawatt of solar PV or wind turbine capacity, on average? That's the critical factor; economic costs are highly flexible in comparison and economists can fudge the numbers any way they like by how they define subsidies, inflation, etc.

This is also why carbon capture from fossil fuels is so shady - the coal industry and the Department of Energy never reports on how much energy it takes to capture a ton of coal emissions, vs. how much energy is generated from burning a ton of coal, in their various pilot projects. I mean never. All the CCS projects have been done under public-private DOE-industry partnerships, and they always call that critical number, kWh/ton of coal vs kWh/equivalent CO2 capture, "proprietary intellectual property".

That's the beauty of carbon capture. It sucks up so much energy that you have to mine and burn a s***-ton more coal to get the same effective energy output. What's not to like? /s

I am much more in favor of solar than wind, but only because of bad experiences: my parents' place (where I grew up) is located in the middle of a big wind park. The sound measurements to find out how much noise the turbines made were a joke, the contractor was more or less dishonest in their proposal (vastly undervaluing the cost of building roads is one example), and when the price of electricity fell it suddenly wasn't worth it any more. Not to mention how they promised jobs: they (or they and the state) educated technicians, but those aren't the ones who get the jobs, and some courses that were supposed to lead to jobs just didn't, because obviously you can't deliver that kind of education in six months.

Solar you can do on your own and in a less intrusive way, and that I am very much in favor of. Wind power definitely has a place, but it is ... not nice at all to have it happen close to home, at least not on a big scale.

Still a problem with ROI for the "rest of us" who want solar PV, VAWT, and rebates.

Seems anyone that can afford it, already has it.

And then there are the "state lawmakers" in the way. E.g., you cannot be off-grid in NJ. You can if you want, but you still have to pay the utility a monthly connection cost... even if you are self-sufficient. Nice, huh?

I calculated that 15 PV panels come to half a ton of weight (braces, trusses, bases, wiring, microinverters...), and a roof I could put 10-15 on can't handle it. So I'm waiting for future PV at 400 W per panel (or greater) and a 2-3 kW VAWT... there is always wind in the winter months, and breezes in the evening. When Tesla battery storage can work...

I think retirement will be along the lines of, 1-5 acres with shipping containers as home. PV, Wind, and wood. Get off my lawn! /s

One design only captures 30% of CO2 emissions and would still be considered a high emitter relative to other new sources and thus may continue to face potential financial risk if carbon emission controls are further strengthened. Another design captures 90% of CO2 emissions and would not face the same financial risk, and therefore does not receive the 3 percentage point increase in cost of capital. As a result, the LCOE values for the coal-fired plant with 30% CCS are higher than they would be if the same cost of capital was used for all technologies

This is why I would have liked to see the unsubsidized numbers mentioned in this article along with the numbers after taking into account tax credits and the like. That's not perfect either, but at least you can get an idea of the magnitude of the subsidy as a percent of overall costs. I haven't yet gone and read the EIA report though, and assume those numbers will be reported there.

I question a lot of things about these numbers, but the biggest thing I want to see is a comparison with unsubsidized costs. It's unfair to compare the cost of a technology that's subsidized to one that isn't. Even if that subsidy is just a "tax credit," it's still an uneven playing field. Subsidies don't actually make a technology cheaper, as someone still pays that cost in the end. And yes, the tax-payers are probably paying it either way, since tax costs are passed on to consumers, but it's still a component of the overall cost. (The economic impact of direct costs vs. redirected-through-taxation costs is a topic for another day.)

While I'm generally opposed to subsidization, as it warps the free market, it's not always a bad thing. If, for instance, subsidization now will result in an overall decrease in long-term cost including the subsidy, it can be justified. You have to justify picking the economic winner first, which is difficult but possible. However, if it's just a matter of accelerating development of superior (e.g. cleaner) technology, I'd rather fund research than directly affect the market with a subsidy.

1. If you produce excess, the power company won't pay you for the extra. They will store it with an on-grid connection, but you can't get enough cells to cover 110% of use and expect a paycheck every month.

2. If you have an on-grid connection and the grid goes out, you don't have power. This prevents surges of electricity when the power lines are being worked on. So true catastrophe backups need batteries.

1 depends on location. In CT you can get a check at end-of-year for overproduction, at least on commercial projects. That said what you get is a pittance, and well-designed systems will not significantly over-produce.

2 is correct for now, hopefully batteries will get more reasonable sometime within my lifetime

With solar panels you absolutely need to overproduce (depending on the install type and size) because the output degrades over time.

I am wondering why Solar Thermal has such a low return - is it because this is a new technology or are there more basic reasons?
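On the overproduction point above: a commonly cited warranty figure (an assumption here, since rates vary by panel) is roughly 0.5% output loss per year, so an array sized exactly to year-one demand will fall short later. A minimal sketch of the required oversizing:

```python
# Sizing for panel degradation: if output falls ~0.5% per year (an
# assumed, commonly cited warranty figure), an array that exactly meets
# demand when new will underproduce later. This finds the oversize
# factor needed to still meet demand in year 25.
degradation = 0.005   # fractional output loss per year (assumed)
years = 25

end_of_life_output = (1 - degradation) ** years   # fraction of rated output left
oversize_factor = 1 / end_of_life_output          # array size vs. exact-fit sizing

print(f"year-{years} output: {end_of_life_output:.1%} of new")
print(f"oversize needed: {oversize_factor:.2f}x")
```

At 0.5% per year the array still produces roughly 88% of its rated output after 25 years, so sizing around 13-14% above year-one demand keeps it meeting demand for the warranty period.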

However, if it's just a matter of accelerating development of superior (e.g. cleaner) technology, I'd rather fund research than directly affect the market with a subsidy.

The problem is avoiding a tragedy of the commons situation where nobody wants to put up the money for the initial research. And as far as directly affecting the market, you've already got the problem that most of the fossil fuel technologies are indirectly subsidized through unmitigated negative externalities. Economically, subsidies probably aren't the best way to address the problem, it'd be far better to tax fossil fuels to incorporate their environmental and health impacts into their price. But that's a political non-starter for now, so subsidies are the way to go.

I would place the first fusion reactor coming online in the next 4 years (private, public, or secret); my favorite is still the polywell.

Is this just based on unicorns? 20 years ago people would have said the same thing. Even if/when you get to a place where a reactor is producing a net output of energy, that doesn't mean it is viable. Most experts agree a Q of 20 (20 units of energy out for every unit of energy in) is the minimum viable design. Even once you have a viable design, that doesn't mean you have an optimized commercial reactor (capable of running 24/7 with minimal downtime for decades). Once you can build the first commercial reactor, how long would it take to become cost-competitive with existing power sources? And once it is cost-competitive, how long would it take to become a significant share of world power?

Honestly, I will be very surprised if we have a commercial fusion reactor operating in my lifetime. I do think that on a long enough timeline fusion is an important power source (especially for power off planet), but 4 years is just nonsense.

Even after Fusion generation is a solved problem (assuming that this ever happens), it needs to beat Wind at baseload generation costs or be relegated to niche applications.

I'm a little disappointed that the EIA doesn't appear to be taking into account the environmental costs of energy generation. I recognize this study was not geared toward that but once you account for environmental damages it seems like Coal has already lost and solar/wind/geothermal are winning.

Also, why is hydro not mentioned at all? Has the US stopped building Hydro power plants?

It's an *economic* analysis.

However, indirectly it *is* including environmental costs, because it makes an attempt to account for legal incentives / disincentives for different methods of production, which are generally based on the environmental costs.

Remember that the EIA projections are for UTILITY-SCALE generation, not home generation. Also, I don't see the EIA modeling battery storage for the PV, whereas I expect integrated battery storage to become a requirement in coming years.

The results are interesting. I fully expected the numbers they got for Solar PV and Onshore Wind and was a little surprised that their numbers for Solar Thermal and Offshore Wind painted a very negative picture of those generation sources. I was a little surprised that Nuclear wound up costing so much, but realized after a time that a nuclear power plant requires a *LOT* of employment. And I was also surprised that coal was costed out to be so much more expensive than natural gas.

A Solar PV installation, on the other hand, requires essentially zero regular (daily) employees, because it is nearly all solid state. Only a mandatory battery storage system has any moving parts (the liquid electrolyte), and even that will probably go away and become just ion movement once the technology progresses to a solid glass electrolyte. The maintenance costs come down to replacing voltage-regulation electronics and other components as they wear out. A few employees can cover several major solar farms doing that work. Everything else is basically solid state (not counting the automatic panel cleaner, which is just a cheap robot, or the panel movers - if they bother with them at all - which is one tiny cheap motor for every 100 panels or more).

And even better, there is no need to replace the whole plant when stuff gets old. The ENTIRETY of a Utility-scale Solar PV system can be replaced slowly, piecemeal, without having to take it offline.

1. If you produce excess, the power company won't pay you for the extra. They will store it with an on-grid connection, but you can't get enough cells to cover 110% of use and expect a paycheck every month.

Well, most consumers get a net-metering-type contract with the power company. This is almost certainly in your favor -- even if you overproduce!

The alternative would be that the power company would purchase your excess electricity at wholesale rates but charge you for power you consume at retail rates. This means that for every extra kWh you produce during the day, you might sell it to the power company for $0.05, but when you buy back that kWh at night you might pay $0.15.

Unless you install a hugely oversized array, you will be better off with net metering. Also, the fact that the power company is essentially forced to purchase your excess power at retail rates is one of the reasons that many of them are looking to impose really strange tariffs to get more out of solar customers. Our local company wants to bill net-metering customers' single highest consumption hour during a month at 100x normal rates.

All the weird games utilities play with solar bug me. Don't get me wrong, if net-metering is a subsidy, and it is or could be in some regions, then they shouldn't have to continue it indefinitely.

At the same time, they should credit excess generation based on actual use. So if I generate an excess kWh and my neighbor uses that kWh, the utility should credit me the amount they would have charged for that kWh, less the levelized cost of distribution and administration. It won't be the full amount, but it also shouldn't be the wholesale price they're paying a large generator hundreds of miles away, because they aren't transmitting that energy from the generator to my neighbor; they're just providing distribution.

They should also be required to provide real-time energy and transmission prices so that I, as a user, can decide whether or not I want to reduce load and/or distribute more energy to my neighbors (or to the grid as a whole) depending on the wholesale price of energy and transmission. Charging users 100x the prevailing rate for their hour of highest usage in a month to deal with demand costs is BS. Just let users have access to real-time energy/transmission prices in exchange for being charged real-time energy/transmission costs so they can decide what they want to do.

1. If you produce excess, the power company won't pay you for the extra. They will store it with an on-grid connection, but you can't get enough cells to cover 110% of use and expect a paycheck every month.

2. If you have an on-grid connection and the grid goes out, you don't have power. This prevents surges of electricity when the power lines are being worked on. So true catastrophe backups need batteries.

1 depends on location. In CT you can get a check at end-of-year for overproduction, at least on commercial projects. That said what you get is a pittance, and well-designed systems will not significantly over-produce.

2 is correct for now, hopefully batteries will get more reasonable sometime within my lifetime

With solar panels you absolutely need to overproduce (depending on the install type and size) because the output degrades over time.

I am wondering why Solar Thermal has such a low return - is it because this is a new technology or are there more basic reasons?

You also need to overproduce because production is high in the summer, and low in the winter. Where I live, if you are feeding power back to the grid, you get credit for a trailing 1 year average. So if you push a bunch of watt-hours back to the grid in the summer, you can pull those back out in the winter, when your panels are underproducing your needs.

If you don't have a grid situation like that - you need to overproduce even more in the summer, so you have enough power in the winter.. (Unless you live on the equator where power is more even all year round!).
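A toy model of the credit-banking arrangement described above. The monthly figures are invented; the point is that even a system that exactly balances production and consumption over a year can come up short if the billing year starts in winter with an empty credit bank:

```python
# Sketch of banking summer surplus as grid credits and drawing it down
# in winter. Monthly numbers are invented to show the shape of the
# problem, not measured from any real system.
production = [300, 350, 450, 550, 650, 700,   # kWh produced, Jan-Jun
              720, 680, 560, 440, 320, 280]   # Jul-Dec
consumption = [500] * 12                      # flat 500 kWh/month load

bank = 0.0
shortfall = 0.0
for prod, used in zip(production, consumption):
    bank += prod - used
    if bank < 0:                 # credits exhausted: buy from the grid
        shortfall += -bank
        bank = 0.0

print(f"annual production: {sum(production)} kWh")
print(f"annual consumption: {sum(consumption)} kWh")
print(f"kWh bought from the grid anyway: {shortfall:.0f}")
```

Here annual production exactly matches annual consumption, yet the January-to-March deficit still has to be bought from the grid because the credits haven't been earned yet. A trailing-average credit like the one the commenter describes smooths this out; without it, you'd need to overproduce (or add batteries) to cover the lean months.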

1 if you produce excess the power company wont pay you for extra. They will store it with an on grid connection but you cant get enough cells to cover 110% of use and expect a paycheck every month

Well most consumers get a net metering type contract with the power company. This is almost certainly in your favor -- even if you overproduce!

The alternative would be that the power company would purchase your excess electricity at wholesale rates but charge you for power you consume at retail rates. This means that for every extra kWh you produce during the day, you might sell it to the power company for $0.05, but when you buy back that kWh at night you might pay $0.15.

Unless you install a hugely oversized array, you will be better off with net metering. Also, the fact that the power company is essentially forced to purchase your excess power at retail rates is one of the reasons that many of them are looking to impose really strange tariffs to get more out of solar customers. Our local company wants to bill net-metering customers for their single highest consumption hour each month at 100x normal rates.
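The gap between the two billing schemes is just arithmetic. This sketch uses the $0.05/$0.15 rates from the example above; the kWh figures are made up:

```python
# Compare annual cost under net metering vs. sell-wholesale/buy-retail.
# Rates follow the $0.05 / $0.15 example above; the kWh figures are made up.

RETAIL = 0.15     # $/kWh you pay the utility
WHOLESALE = 0.05  # $/kWh the utility pays you

def annual_cost_net_metering(exported_kwh, imported_kwh):
    """Exports offset imports kWh-for-kWh; any net import is billed at retail."""
    return max(imported_kwh - exported_kwh, 0) * RETAIL

def annual_cost_buy_retail_sell_wholesale(exported_kwh, imported_kwh):
    """Every imported kWh costs retail; every exported kWh earns wholesale."""
    return imported_kwh * RETAIL - exported_kwh * WHOLESALE

# A household that exports 3,000 kWh by day and imports 4,000 kWh by night:
print(annual_cost_net_metering(3000, 4000))               # 1,000 net kWh at retail
print(annual_cost_buy_retail_sell_wholesale(3000, 4000))  # 4,000 at retail minus 3,000 at wholesale
```

With these assumed rates, the same household pays several times more under the wholesale-buyback scheme, which is why net metering is "almost certainly in your favor."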

All the weird games utilities play with solar bug me. Don't get me wrong, if net-metering is a subsidy, and it is or could be in some regions, then they shouldn't have to continue it indefinitely.

At the same time, they should credit excess generation based on actual use. So if I generate an excess kWh and my neighbor uses that kWh, the utility should credit me the amount they would have charged for that kWh, less the levelized cost of distribution and administration. It won't be the full amount, but it also shouldn't be the wholesale price they're paying a large generator hundreds of miles away, because they aren't transmitting that energy from the generator to my neighbor; they're just providing distribution.

They should also be required to provide real-time energy and transmission prices so that I, as a user, can decide whether or not I want to reduce load and/or distribute more energy to my neighbors (or to the grid as a whole) depending on the wholesale price of energy and transmission. Charging users 100x the prevailing rate for their hour of highest usage in a month to deal with demand costs is BS. Just let users have access to real-time energy/transmission prices in exchange for being charged real-time energy/transmission costs so they can decide what they want to do.

For home solar installations (which the EIA report has nothing to do with), utility companies are struggling with the fact that more affluent households will be able to go almost completely off-grid within the next 10 years and leave them high and dry. That's the crux of the situation. Higher-use households are slowly being removed from the equation in terms of being able to subsidize poorer households. Hawaii is already facing this problem.

--

Right now, utilities are trying to devalue generation to the grid by home PV systems using the argument that peak demand occurs later in the day... well, yeah, partly because they aren't measuring (can't really measure) the home PV generation that feeds the house during the solar peak period (the panels go straight into the house via the inverter and never hit the utility meter). So as more home PV is installed, the peak energy use that the utility sees continues to shift later into the day. Of course demand didn't entirely overlap peak solar even before home PV, but it has slowly shifted since, to the point where the peak does not occur until basically after the sun has set.

But with the advent of electric vehicles (EVs), and in particular EV-class batteries coming down in price, within the next 10 years the average middle-class or higher homeowner will actually be able to load-shift their solar, either using an in-garage battery system or using their EV, to offset their peak usage when they come home from work in the evening. It doesn't actually require a whole lot of kWh of capacity to load-shift... it just comes down to battery cycle life (which is one of the things that will improve massively over the next 10 years). It is not economically feasible at the moment for US homeowners, but it will be in the next 10 years. Once the break-even time drops below 15 years, it becomes feasible.

At that point, utilities are going to be in serious trouble without a complete rewrite of the rules.
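The break-even logic can be sketched with a back-of-the-envelope calculation. Every number here is an assumption chosen for illustration, not a quote for any real battery or tariff:

```python
# Rough break-even sketch for battery load-shifting: charge from midday solar
# surplus, discharge into the evening peak, and count years of daily cycling
# until the battery pays for itself. All constants are assumptions.

BATTERY_COST = 7000.0     # $ installed (assumption)
USABLE_KWH = 12.0         # kWh shifted per cycle (assumption)
RETAIL_EVENING = 0.20     # $/kWh avoided at the evening peak (assumption)
SOLAR_VALUE = 0.05        # $/kWh the midday surplus would otherwise earn (assumption)

def breakeven_years(cycles_per_year=365):
    """Years of cycling until cumulative savings equal the battery cost."""
    saving_per_cycle = USABLE_KWH * (RETAIL_EVENING - SOLAR_VALUE)
    return BATTERY_COST / (saving_per_cycle * cycles_per_year)

print(breakeven_years())
```

With these assumed numbers the payback lands between 10 and 11 years, already under the 15-year threshold mentioned above; halve the battery price and the payback roughly halves, which is the trend the comment is betting on.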

I am wondering why Solar Thermal has such a low return - is it because this is a new technology or are there more basic reasons?

Solar thermal is a new technology that is very capital-intensive and very location-dependent. It doesn't make much sense to look at the average LCOE or LACE; investors should be looking at the costs in regions with optimal weather/insolation for solar thermal.

Capturing it is one thing; safely disposing of it is another. There are some plants around which pump it into the ground. How long it will actually stay there is anybody's guess. My prediction is that one of these CO2 storage sites is going to have a "big burp" one day, and the resulting CO2 gas cloud is going to kill so many people that the technology will become politically untenable.

1. If you produce excess, the power company won't pay you for the extra. They will store it with an on-grid connection, but you can't get enough cells to cover 110% of your use and expect a paycheck every month.

2...

I believe that varies by state. This may have changed since I last looked at it, but at that point in NJ, if your net draw from the grid was negative, the power company had to pay you for what you gave back to the grid. Up to 100% of your usage they had to pay full retail price; anything beyond that, they had to pay wholesale price.
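That tiered rule is straightforward to sketch. The rates below are made up for illustration and are not NJ's actual tariff:

```python
# Sketch of the tiered buy-back rule described above: retail price for exports
# up to 100% of the household's own usage, wholesale beyond that.
# Both rates are made-up illustrations, not NJ's actual tariff.

RETAIL = 0.15     # $/kWh (assumption)
WHOLESALE = 0.04  # $/kWh (assumption)

def buyback_payment(exported_kwh, consumed_kwh):
    """Pay retail for exports up to the household's own consumption,
    wholesale for anything beyond it."""
    at_retail = min(exported_kwh, consumed_kwh)
    at_wholesale = max(exported_kwh - consumed_kwh, 0)
    return at_retail * RETAIL + at_wholesale * WHOLESALE

# Export 6,000 kWh against 5,000 kWh of usage:
print(buyback_payment(6000, 5000))  # 5,000 kWh at retail + 1,000 kWh at wholesale
```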

I am wondering why Solar Thermal has such a low return - is it because this is a new technology or are there more basic reasons?

Compared to utility scale PV, solar thermal:

- Has more moving parts
- Takes longer to begin generating after the sun rises (needs to heat up; PV actually works best when cold)
- May require fossil combustion systems to get up to working temperature quickly
- Works much worse under partly cloudy conditions (diffuse light is OK for PV, worthless for concentrating on a thermal tower)
- Consumes more water
- Costs more to build per unit of capacity
- Can potentially store energy for delivery after sunset

Note that everything except the last point is a disadvantage. Solar thermal seemed a lot more attractive 20 years ago when 2-axis tracking mirrors were significantly cheaper than stationary PV modules. But solar thermal is significantly more expensive to build now than a stationary or 1-axis-tracking large PV farm. It also tends to produce significantly less energy over the course of a year.

The one thing that might justify thermal solar is thermal storage for delivering power after sunset. But most solar thermal plants in the US don't even have that. The Ivanpah plant that started operating in 2014, for example, doesn't have any storage capability. It also cost significantly more than a PV plant of equal capacity, has a lower capacity factor, and burns natural gas to warm up in the morning. Ivanpah is a turkey and we shouldn't build any more plants like it.
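The capex and capacity-factor disadvantages above translate directly into delivered-energy cost. A toy levelized-cost calculation shows why; every input here is illustrative, not an EIA or plant-specific figure:

```python
# Toy levelized-cost calculation showing why higher capex plus a lower
# capacity factor makes a plant more expensive per delivered MWh.
# All inputs are illustrative assumptions, not EIA figures.

HOURS_PER_YEAR = 8760

def simple_lcoe(capex_per_kw, capacity_factor, lifetime_years, opex_per_kwh=0.0):
    """$/MWh over the plant's life, ignoring discounting and financing."""
    kwh_per_kw = capacity_factor * HOURS_PER_YEAR * lifetime_years
    return (capex_per_kw / kwh_per_kw + opex_per_kwh) * 1000

# Hypothetical: a PV farm vs. a pricier solar-thermal plant with a lower CF.
pv      = simple_lcoe(capex_per_kw=1500, capacity_factor=0.25, lifetime_years=30)
thermal = simple_lcoe(capex_per_kw=4000, capacity_factor=0.22, lifetime_years=30)
print(pv, thermal)
```

Real LCOE models add discount rates, financing, fuel, and O&M, but even this stripped-down version shows the thermal plant costing roughly three times as much per MWh under these assumptions.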

I question a lot of things about these numbers, but the biggest thing I want to see is a comparison with unsubsidized costs. It's unfair to compare the cost of a technology that's subsidized to one that isn't. Even if that subsidy is just a "tax credit," it's still an uneven playing field. Subsidies don't actually make a technology cheaper, as someone still pays that cost in the end. And yes, taxpayers are probably paying it either way, since tax costs are passed on to consumers, but it's still a component of the overall cost. (The economic impact of direct costs vs. redirected-through-taxation costs is a topic for another day.)

While I'm generally opposed to subsidization, as it warps the free market, it's not always a bad thing. If, for instance, subsidization now will result in an overall decrease in long-term cost, including the subsidy, it can be justified. You have to justify picking the economic winner first, which is difficult but possible. However, if it's just a matter of accelerating development of superior (e.g. cleaner) technology, I'd rather fund research than directly affect the market with a subsidy.

Subsidies are a way to incentivize, seed, and overall make it more attractive for private companies to invest in promising technologies. Most of the time the private market is wary of trying new things, preferring to stay with the tried and true (oil companies, for example). In the interest of the community, the government can step in to make a particular section of the market more attractive, hopefully spurring investment, research, and product development. On a good day, new products come out, competition is stimulated, and the free market takes over.

In the case of solar, as it has been gaining ground, costs have gone down. Any subsidies should go away once the market is healthy and can compete on its own merits. Ideally, of course. We still give subsidies to Big Oil, after all...