Visit a power plant in winter and you can literally see the heat pouring out of it as the hot air of its exhaust mixes unevenly with the cold air around it. It's easy to convince yourself that the mere burning of fossil fuels, rather than the greenhouse gases the burning releases, could be altering the climate. But the total energy released is rather small on a global scale, so researchers have largely treated it as a rounding error. A new paper suggests that, although power generation doesn't add much heat, it changes the way the heat is distributed. And this has surprisingly large effects in some areas.

It's possible to estimate the total global energy use from all sources, then compare that to the factors known to drive the climate (called "forcings"). Direct heating turns out to be rather minor: less than 0.1 W/m2 when averaged over the globe. In contrast, the current forcing from human-emitted CO2 is about 1.5 W/m2. Given that, the direct heating of the atmosphere due to power consumption has largely been left out of climate models.
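
To see why it's so small, it helps to do the arithmetic. Here's a back-of-envelope version in Python (the round figure of 17 terawatts for world energy consumption is an assumption for the example, not a number from the paper):

    import math

    # Rough check: global waste heat averaged over Earth's entire surface.
    power_watts = 17e12                    # ~17 TW world energy use (assumed)
    earth_radius_m = 6.371e6               # mean Earth radius
    area_m2 = 4 * math.pi * earth_radius_m**2   # ~5.1e14 m^2

    print(power_watts / area_m2)           # ~0.033 W/m^2, well under 0.1
    # Compare with the ~1.5 W/m^2 forcing from CO2 cited above.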

A paper released over the weekend by Nature Climate Change suggests we might want to put it back in. The authors note that some urban areas show rather intense warming from power consumption (estimates for Tokyo run as high as 1,590 W/m2), which might be enough to alter regional air circulation. Given that similar sources dot the continents, it's possible that the regional effects add up to a noticeable global shift.

The authors did a modeling experiment, taking the National Center for Atmospheric Research (NCAR) Community Atmosphere Model CAM3 and adding point heat sources to represent urban centers. Eighty-six were added, each at a point where urban energy use is estimated to add more than 0.4 W/m2 of heating. The heat was added to the lowest layer of the atmosphere, below 130 m. The model was given a standard set of human and natural influences and allowed to run for 100 years.
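
The paper doesn't come with code, but the bookkeeping involved is simple enough to sketch. Here's a minimal illustration (this is not the authors' implementation; the grid size, source locations, and fluxes are invented for the example):

    import numpy as np

    # Hypothetical lat/lon grid for the lowest model layer (below ~130 m).
    n_lat, n_lon = 64, 128
    extra_heating = np.zeros((n_lat, n_lon))    # W/m^2 added per grid cell

    # Invented sample of urban point sources: (lat index, lon index, W/m^2).
    # The actual study used 86 sites, each adding more than 0.4 W/m^2.
    urban_sources = [(40, 100, 0.7), (45, 20, 1.2), (35, 60, 0.5)]
    for i, j, flux in urban_sources:
        extra_heating[i, j] += flux

    # At each timestep, this field would be added to the energy budget of
    # the lowest atmospheric layer before the model dynamics are advanced.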

As expected, warming in general didn't change much. But the location and timing of the warming did. In the winter months (December, January, and February), some areas of the globe were up to 1 K warmer than in control runs that didn't include the point heat sources. The warmest regions ended up over northeastern North America and the western Canadian Arctic. Another huge warm area was centered over Siberia.

These areas faded out during the spring. Over the summer and autumn, the same regions were actually somewhat cooler than the control simulations would indicate, although the changes outside of winter were much smaller in general.

The nice thing about this result is that it may explain some of the discrepancies between climate models and real-world data. Models have largely predicted that the Arctic would warm faster than other regions, and reality has borne that out. But the real-world data indicates these areas are warming even faster than the models suggest they should, a discrepancy that has remained unexplained.

The models also show the weather phenomena we should expect to see associated with these changes, such as an area of low pressure in the Russian Arctic associated with a high-pressure region in central Asia. That combination explains how warmer air gets shifted north, and it provides an example of what we should be looking for to confirm the predictions based on this model.

The authors recognize their efforts are a first attempt, meant to highlight an area where we need more work. They note their model only incorporated about 42 percent of the global heat output from energy use (the rest would probably be more diffuse), and that we don't have a strong grip on the actual emissions of many urban areas. The implication is that a more refined model might do even better at matching predictions to the warming we've actually observed.

In any case, they make a strong argument that the heat from energy use probably shouldn't be ignored. It may not change the trajectory of warming significantly, but it can apparently help shape where it hits the hardest.

82 Reader Comments

Obviously, as you note below, anything that can use the waste heat can affect the combined efficiency (your generation efficiency is unchanged), but it seems to me that the deltaT for waste heat is going to be lower than from the original fuel, as energy was extracted from it in the generation step, so any co-generation (or whatever they call it) must be at a lower efficiency.

Since it was originally waste heat, though, any energy extracted from it is purely a bonus and adds to the overall efficiency of the system. You end up generating more electricity from the same fuel, so the proportion of potential energy that goes into heat must be reduced. I would consider that an increase in generation efficiency.

That isn't what cogeneration is for. The point is that the steam you use to drive turbines or whatever slowly loses its mojo and the efficiency drops off for electric generation, but it's still hot enough for stuff like heating homes and distilling water (among other uses). That way, you save the expensive electricity for the things that need it.

In the context of the article, it doesn't change much. You're still carrying heat to urban/industrial places where it ends up being released.
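
To put illustrative numbers on the exchange above (round figures chosen for the example, not taken from the thread): with cogeneration, the electrical efficiency stays put, but the share of the fuel's energy doing useful work rises because some of the reject heat is captured.

    # Illustrative CHP (combined heat and power) accounting, made-up numbers.
    fuel_energy = 100.0     # energy content of the fuel
    electricity = 38.0      # electrical output
    useful_heat = 45.0      # reject heat captured for heating, distilling, etc.
    losses = fuel_energy - electricity - useful_heat     # 17 units still wasted

    print(electricity / fuel_energy)                     # 0.38 electrical efficiency
    print((electricity + useful_heat) / fuel_energy)     # 0.83 overall utilization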

You're right. I'm not doing very well with these examples at all, sorry.

There's still the example AR posted of the Combined Cycle, supposed to be up to 60%.

One last attempt at my original "very close to 100% in theory": Carnot's theorem

Given a sufficiently cold environment or a sufficiently extreme heat source, the efficiency of a very theoretical heat engine should get near 100%.
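
For reference, Carnot's theorem caps any heat engine running between a hot reservoir at T_hot and a cold one at T_cold at an efficiency of 1 - T_cold/T_hot, which only approaches 100% in the limits described above. A quick numerical illustration (the temperatures and stage efficiencies below are arbitrary, picked to roughly match the figures in this thread):

    def carnot_efficiency(t_hot_k, t_cold_k):
        """Maximum possible efficiency between two reservoirs (kelvin)."""
        return 1.0 - t_cold_k / t_hot_k

    print(carnot_efficiency(1500, 300))    # 0.80 for a typical combustion source
    print(carnot_efficiency(15000, 300))   # 0.98 -- extreme heat source
    print(carnot_efficiency(1500, 3))      # 0.998 -- near-absolute-zero sink

    # Combined cycle: a second engine runs on the first engine's reject heat.
    # With assumed stage efficiencies of 0.38 and 0.35, the combined figure
    # lands near the ~60% quoted above.
    eta1, eta2 = 0.38, 0.35
    print(eta1 + (1 - eta1) * eta2)        # ~0.60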

Yes, the theoretical maximum efficiency goes up as deltaT goes up. So in theory engines used in Antarctica are more efficient than the same engine at the equator. The problem is, you may need starter engines to start the motor, you'll have to let it (and especially the oil) warm up before use, and due to the terrain and weather, you'll end up getting far less done with the same amount of fuel than you will in a warmer climate. Generally the vehicles used are large, heavy and probably use quite a bit of fuel. I have seen passenger cars at some of the Antarctic stations near the coast, but I don't see them at most places (you can see Mawson and several other stations by webcam).

Since the higher temperature is created by the combustion of fuel, the type of fuel determines the high temperature, and I don't think there are significant differences among the various fuels used, so the only way to get more thermal efficiency is to run the engine in a cold environment.
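
Putting rough numbers on that claim (the combustion and ambient temperatures here are assumptions): with a fixed flame temperature, a colder sink only buys a few percentage points of theoretical headroom.

    # Same engine, different ambient temperature; flame temperature held fixed.
    t_hot = 2000.0                               # K, assumed combustion temperature
    for label, t_cold in [("equator", 300.0), ("Antarctic winter", 230.0)]:
        print(label, 1.0 - t_cold / t_hot)       # 0.85 vs. ~0.885 theoretical max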

Sure, provided you control all the other "non-ideal" losses that you're likely to encounter:

1. Have to keep everything heated when the vehicle is parked: batteries, coolant, transmissions, fuel. Significant electricity use.
2. Even so, it probably takes forever to properly warm up the vehicle, luckily...
3. You probably idle the vehicle a lot, which isn't very efficient, but it does keep the vehicle and its occupants warm.