Water and nuclear power

Once the heat generated by nuclear fission has finished spinning the turbines in nuclear power plants, it must be dissipated into the wider environment. Almost invariably, this is done using large amounts of water drawn from nearby rivers and lakes. For plants located in drought-stricken regions such as the southeastern United States, water scarcity now threatens to shut down plants, forcing the costly purchase of energy from other jurisdictions.

The Associated Press estimates that 24 of America’s 104 nuclear reactors are located in areas currently experiencing severe drought. One reactor outside Raleigh, North Carolina will need to be shut down if water levels in its lake fall by another 3 1/2 feet. In total, nuclear power provides about 20% of the American supply of electricity. All but two American nuclear plants are cooled using water from lakes and rivers. Some plants evaporate large amounts of water from cooling towers, while others are designed to return the warmed water to the body that originally provided it. Immersing collection pipes at lower levels can be costly, and it increases the amount of sediment drawn into the cooling system.

All this demonstrates the degree to which many forms of low-carbon energy generation are themselves vulnerable to climate change. Concern about water being a limiting factor in energy production is already acute in Australia. Dams face risks from both drought and the loss of snowpack in mountain ranges (leading to too much water at some times of year and not enough at others). Even wind turbines may be vulnerable to changes in dominant patterns of air circulation. Designing future infrastructure with possible climate changes in mind is essential, if we are not to find ourselves with a lot of expensive hardware rendered useless by changed conditions.

Why are reactors designed to dissipate so much heat as waste? Couldn’t it be captured by Stirling engines or some other such mechanism at a relatively modest cost?

Hmm, a question I am well-qualified to answer… There is no lack of energy on Earth, but rather a huge problem of energy conversion. As you correctly deduce, our inefficient “machines” produce a lot of waste heat.

I actually work on thermoelectric energy conversion, which is a way of converting waste heat into electrical energy. There are a few reasons why thermoelectrics aren’t more common.

The first is that they aren’t generally terribly efficient. A very good thermoelectric is ~10% efficient, and that’s with efficiency given as a percentage of Carnot efficiency. See, you can’t turn thermal energy itself into work, only thermal gradients, and the theoretical maximum efficiency (Carnot) is (T_H − T_C)/T_H. Which means there’s not a lot of energy to get out of waste heat.
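To put rough numbers on that, here is a minimal sketch (the temperatures are illustrative assumptions, not figures from the post): the Carnot limit for a typical low-grade waste-heat source, and what a thermoelectric running at ~10% of Carnot would actually recover.

```python
# Carnot efficiency: the theoretical maximum fraction of heat that can
# be converted into work between a hot source and a cold sink.
# Temperatures must be absolute (kelvin).

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """(T_H - T_C) / T_H, with temperatures in kelvin."""
    return (t_hot_k - t_cold_k) / t_hot_k

# Illustrative waste-heat scenario: 100 C exhaust, 20 C ambient.
t_hot = 373.15   # 100 C in kelvin
t_cold = 293.15  # 20 C in kelvin

eta_carnot = carnot_efficiency(t_hot, t_cold)
eta_te = 0.10 * eta_carnot  # a "very good" thermoelectric at ~10% of Carnot

print(f"Carnot limit:  {eta_carnot:.1%}")  # about 21% of the heat
print(f"Thermoelectric: {eta_te:.2%}")     # only about 2% actually recovered
```

Even before device losses, thermodynamics caps low-grade waste-heat recovery at around a fifth of the heat flow; a realistic thermoelectric claws back only a few percent.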

The second reason is cost; thermoelectrics, solar, etc. remain substantially more expensive than burning fossil fuels. The main costs are production, materials that go into devices, and research costs (my research is expensive, and while my salary is but a blip, I do have to eat, y’know).

The third reason is closely related to the first two; in order for a technology like thermoelectrics (or a Stirling engine or whatnot) to really matter on a global scale, you would need massive amounts of area converting waste heat. It’s simply not feasible or viable with current technologies. Give me a few years, and maybe I can help improve that.

In a mushy, non-scientific way: increasing entropy means a decreasing difference between regions, and that difference can be a difference in temperature.

There is an idealized machine, a thought experiment called Carnot’s heat engine, that describes the maximum amount of mechanical work you can extract from a given difference between the temperatures of a “source” and a “sink”. As regions of high heat and regions of low heat become less differentiated – in other words, as the temperature difference decreases – the payoff in mechanical work from “harnessing” the difference gets lower per amount spent on the hardware.

For any degree of hardware sophistication there is a limit beyond which it doesn’t pay to go after that “waste” heat.
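That diminishing payoff is easy to see numerically. A short sketch (the sink temperature and the list of temperature differences are illustrative assumptions) of how the Carnot limit shrinks as source and sink converge:

```python
# As the temperature difference between source and sink shrinks,
# the maximum work extractable per joule of heat shrinks with it.

def carnot_limit(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat convertible to work (temperatures in kelvin)."""
    return (t_hot_k - t_cold_k) / t_hot_k

t_cold = 300.0  # fixed sink, roughly room temperature
for delta in (300.0, 100.0, 30.0, 10.0, 1.0):
    t_hot = t_cold + delta
    work_per_joule = carnot_limit(t_hot, t_cold)
    print(f"dT = {delta:5.0f} K -> at most {work_per_joule:.3f} J of work per J of heat")
```

At a 300 K difference half the heat is theoretically recoverable; at a 10 K difference it is about 3%, which is why, past some point, chasing “waste” heat costs more in hardware than it returns.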

Nuclear reactors across the U.S. Southeast could be forced to slow production or shut down in the near future due to the effects of continuing drought in the region. Nuclear power plants require massive amounts of water to cool steam that turns the generators; the water usually arrives via large intake pipes from nearby rivers and lakes. However, with water levels at drought-induced lows, a growing number of reactors are inching closer and closer to the water levels that would hamper plant operation. Pumping water from shallower depths, even when available, can also lead to forced shutdowns due to the water’s increased temperature. “You need a lot of water to operate nuclear plants,” said Jim Warren, executive director of a North Carolina green group. “Water is the nuclear industry’s Achilles’ heel.” By our count that makes at least four such heels: water, the legacy of radioactive waste, nuke plants’ appeal as terrorist targets, and the enormous costs of nuke plant construction.

The power of water
French government interested in solar because it uses less water than nukes
Posted by Adam Browning

A year or so ago, I spoke at a solar conference in France — a country that produces 78 percent of its electricity with nukes. A couple of folks told me that the government’s interest in solar stemmed from the fact that during the previous summer’s heat wave, river levels dropped to the point that they didn’t have enough water to cool the reactors. The country actually had to shut off generation exactly at peak demand. Big problem. Thus, solar photovoltaics, which not only generate the most during these peak events, but also … use no water.