Does Back-Radiation “Heat” the Ocean? – Part One

In the three-part series on DLR (also known as “back radiation”, also known as atmospheric radiation), Part One looked at the network of stations that measure DLR and some of the measurements, Part Two reviewed the spectra of this radiation, and Part Three asked whether this radiation changes the temperature of the surface.

Very recently, on another blog, someone asked whether I thought “back radiation” heated the ocean. I know from a prominent blog that a very popular idea in blog-land is that atmospheric radiation doesn’t heat the ocean. I have never seen any evidence for this idea. That doesn’t mean there isn’t any.

See note 1 on “heat”.

The Basic Idea

From what I’ve seen people write about this idea, including the link above, the rough argument goes like this:

solar radiation penetrates tens of meters into the ocean

atmospheric radiation – much longer wavelengths – penetrates only 1μm into the ocean

Therefore, solar radiation heats the ocean, but atmospheric radiation only heats the top few molecules. So DLR is unable to transfer any heat into the bulk of the ocean; instead, the energy goes into evaporating the top layer into water vapor. This water vapor then goes on to form clouds, which act as a negative feedback. And so, more back-radiation from more CO2 can only have a cooling effect.

There are a few assumptions in there. Perhaps someone has some evidence for these assumptions, but at least I can see why the idea is popular.

Solar Radiation

As regular readers of this blog know, plus anyone else with a passing knowledge of atmospheric physics, solar radiation is centered around a wavelength of 0.5μm. The energy in wavelengths greater than 4μm is less than 1% of the total solar energy and conventionally, we call solar radiation shortwave.

99% of the energy in atmospheric radiation has longer wavelengths than 4μm and along with terrestrial radiation we call this longwave.

Most surfaces, liquids and gases have a strong wavelength dependence for the absorption or reflection of radiation.

Here is the best absorption graph I could find for the ocean. It’s from Wikipedia – not necessarily a reliable source – but I checked the graph against a few papers and it matched up. The papers didn’t provide such a nice graph.

Absorption coefficient for the ocean - Wikipedia

Figure 1

Note the logarithmic axes.

The first obvious point is that absorption varies hugely with the wavelength of incident radiation.

I’ll explain a few basics here, but if the maths is confusing, don’t worry, the graphs and explanation will attempt to put it all together. The basic equation of transmission relies on the Beer-Lambert law:

I = I₀·exp(-kd)

where I is the radiation transmitted, I₀ is the incident radiation at that wavelength, d is the depth, and k is the absorption coefficient of the ocean at this wavelength.

It’s not easy to visualize if you haven’t seen this kind of equation before. So imagine 100 units of radiation incident at the surface at one wavelength where the absorption coefficient, k = 1:

Figure 2

So at 1m, 37% of the original radiation is transmitted (and therefore 63% is absorbed).

At 2m, 14% of the radiation is transmitted.

At 3m, 5% is transmitted

At 10m, 0.005% is transmitted, so 99.995% has been absorbed.

(Note for the detail-oriented people, I have used the case where k=1/m).
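The numbers above can be checked with a few lines of Python – just the Beer-Lambert law at a single wavelength, using the example value k = 1 per metre:

```python
import math

# Beer-Lambert law at a single wavelength: I = I0 * exp(-k * d),
# using the article's example absorption coefficient k = 1 per metre.
def transmitted_fraction(depth_m, k_per_m=1.0):
    return math.exp(-k_per_m * depth_m)

for d in (1, 2, 3, 10):
    t = transmitted_fraction(d)
    print(f"{d:2d} m: {t:9.4%} transmitted, {1 - t:9.4%} absorbed")
```

Running this reproduces the 37%, 14%, 5% and ~0.005% figures quoted above.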

Hopefully, this concept is reasonably easy to grasp. Now let’s look at the results of the whole picture using the absorption coefficient vs wavelength from earlier.

Figure 3

The top graph shows the amount of radiation making it to various depths, vs wavelength. As you can see, the longer (and UV) wavelengths drop off very quickly. Wavelengths around 500nm make it the furthest into the ocean depths.

The bottom graph shows the total energy making it through to each depth. You can see that even at 1mm (10⁻³ m) around 13% has been absorbed, and by 1m more than 50% has been absorbed. By 10m, 80% of solar radiation has been absorbed.

The graph was constructed using an idealized scenario – solar radiation less reflection at the top of atmosphere (on average around 30% is reflected), no absorption in the atmosphere, and the sun directly overhead. The reason for using “no atmospheric absorption” is just to make it possible to construct a simple model; it doesn’t have much effect on any of the main results.

If we considered the sun at say 45° from the zenith, it would make some difference because the sun’s rays would now be coming into the ocean at an angle. So a depth of 1m would correspond to the solar radiation travelling through 1.4m of water (1 / cos(45°)).
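This geometric factor can be written down directly (note that, like the text, this simple expression ignores refraction at the air-water interface):

```python
import math

# Path length through water for sunlight arriving at the given zenith
# angle, ignoring refraction at the air-water interface.
def path_length(depth_m, zenith_deg):
    return depth_m / math.cos(math.radians(zenith_deg))

print(path_length(1.0, 45))   # ~1.41 m of water traversed to reach 1 m depth
print(path_length(1.0, 0))    # 1.0 m when the sun is directly overhead
```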

For comparison here is more accurate data:

From "Light Absorption in Sea Water", Wozniak (2007)

Figure 4

On the left, the “surface” line represents the real solar spectrum at the surface – after absorption of the solar radiation by various trace gases (water vapor, CO2, methane, etc). On the right, the amount of energy measured at various depths in one location. Note the log scale on the vertical axis for the right-hand graph. (Note as well that the irradiance in these graphs is in W/m²·nm, whereas the calculated graphs earlier are in W/m²·μm).

From "Light Absorption in Sea Water", Wozniak (2007)

Figure 5

And two more locations measured. Note that the Black Sea is much more absorbing – solar absorption varies with sediment as well as other ocean properties.

DLR or “Back radiation”

The radiation from the atmosphere doesn’t look too much like a “Planck curve”. Different heights in the atmosphere are responsible for radiating at different wavelengths – dependent on the concentration of water vapor, CO2, methane, and other trace gases.

Here is a typical DLR spectrum (note that the horizontal axis needs to be mentally reversed to match the other graphs):

Figure 6

But for interest I took the case of an ideal blackbody at 0°C radiating to the surface and used the absorption coefficients from figure 1 to see how much radiation was transmitted through to different depths:

Figure 7

As you can see, most of the “back radiation” is absorbed in the first 10μm, and 20% is absorbed even in the first 1μm.

I could produce a more accurate calculation by using a spectrum like the Pacific spectrum in fig 6 and running that through the same calculations, but it wouldn’t change the results in any significant way.
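For anyone who wants to reproduce the flavour of this calculation, here is a sketch: a Planck-weighted Beer-Lambert calculation for a blackbody at 0°C. The absorption coefficients below are rough order-of-magnitude stand-ins for water in the thermal IR, not the actual Figure 1 data, so the percentages will differ somewhat from Figure 7:

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(wl_m, t_k):
    """Blackbody spectral radiance at wavelength wl_m (normalisation
    is irrelevant here, since we only use ratios)."""
    return (2 * H * C**2 / wl_m**5) / (math.exp(H * C / (wl_m * KB * t_k)) - 1)

# (wavelength in m, assumed absorption coefficient in 1/m) -- illustrative
# values only, not the measured curve from Figure 1
bands = [(5e-6, 3e5), (7.5e-6, 5e5), (10e-6, 7e5), (15e-6, 5e5), (20e-6, 4e5)]

def absorbed_fraction(depth_m, t_k=273.0):
    """Fraction of blackbody DLR absorbed above depth_m, via Beer-Lambert
    in each band, weighted by the Planck spectrum."""
    weights = [planck(wl, t_k) for wl, _ in bands]
    absorbed = [w * (1 - math.exp(-k * depth_m))
                for w, (_, k) in zip(weights, bands)]
    return sum(absorbed) / sum(weights)

for d in (1e-6, 10e-6, 100e-6):
    print(f"{d * 1e6:4.0f} um: {absorbed_fraction(d):.0%} absorbed")
```

Even with these crude coefficients, the qualitative result is the same: nearly all of the DLR is absorbed within the first few tens of microns.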

So we can see that while around half the solar radiation is absorbed in the first meter and 80% in the first 10 meters, 90% of the DLR is absorbed in the first 10μm.

So now we need to ask what kind of result this implies.

Heating Surfaces and Conduction

When you heat the surface of a body that has a colder bulk temperature (or a colder temperature on the “other side” of the body) then heat flows through the body.

Conduction is driven by temperature differences. Once you establish a temperature difference you inevitably get heat transfer by conduction – for example, see Heat Transfer Basics – Part Zero.

The equation for heat transfer by conduction:

q = kA·ΔT/Δx

where k is the thermal conductivity of the material, A is the area, ΔT is the temperature difference, Δx is the distance between the two temperatures, and q is the rate of heat transfer.

However, conduction is a very inefficient heat transfer mechanism through still water – its thermal conductivity is only about 0.6 W/m·K.

So, as a rough guide, if you had a temperature difference of 20°C across 50m, you would get heat conduction of 0.24 W/m². And with 20°C across 10m of water, you would only get heat conduction of 1.2 W/m².
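These rough-guide numbers follow directly from the conduction equation, taking water’s thermal conductivity as about 0.6 W/m·K (a standard textbook value, assumed here):

```python
K_WATER = 0.6  # thermal conductivity of still water, W/(m*K) (approximate)

def conduction_flux(delta_t_k, thickness_m, k=K_WATER):
    """Heat flux per unit area (A = 1 m²) from q = k*A*dT/dx, in W/m²."""
    return k * delta_t_k / thickness_m

print(conduction_flux(20, 50))  # ~0.24 W/m² for 20°C across 50 m
print(conduction_flux(20, 10))  # ~1.2 W/m² for 20°C across 10 m
```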

However, the ocean surface is also turbulent for a variety of reasons, and in Part Two we will look at how that affects heat transfer via some simulations and a few papers. We will also look at the very important first law of thermodynamics and see what that implies for absorption of back radiation.

Reference

Light Absorption in Sea Water, Wozniak (2007)

Notes

Note 1 – To avoid upsetting the purists, when we say “does back-radiation heat the ocean?” what we mean is, “does back-radiation affect the temperature of the ocean?”

Some people get upset if we use the term heat, and object that heat is the net of the two-way process of energy exchange. It’s not too important for most of us. I only mention it to make it clear that if the colder atmosphere transfers energy to the ocean, then more energy goes in the reverse direction.


37 Responses

The whole notion is misconceived. Heat at the ocean surface is flowing the other way (balanced by SW). The IR flux upward exceeds the downward flux.

Of course, the ocean does radiate thermal IR. Seen from space, etc.

Any notional difficulty about absorbing IR would apply equally to the upward IR. There isn’t any. Yes, IR radiative transfer ceases within a few microns. But conductive transport is effective over longer distances, and beyond that, turbulent advection takes over. Heat flux occurs over the phase boundary with no great temperature gradient.

I think the “argument” as it goes isn’t that the ocean doesn’t have absorptivity or emissivity at these wavelengths – just that the heat absorbed “because it is in the top micron” **therefore** goes into evaporating the top layer.

I’m finding it difficult to make sense of Nick’s statements:
“Any notional difficulty about absorbing IR would apply equally to the upward IR. ”
and:
“The down IR just makes up some of the heat the top layer is losing by radiating. The rest comes by conduction and advection from below, with no great difficulty.”

Firstly I’d like to know what evidence Nick has that the surface of the ocean emits IR rather than the molecules of water vapour just above the surface. If he thinks both the surface and the water vapour are emitting IR then I’d like to know the proportions of each.

Secondly, how does he propose that downward IR goes about “making up the heat”? We know it can’t penetrate the surface much beyond its own wavelength. We also know that the thermal conductivity of water is very poor. We also know that turbulent mixing in wave structures takes place mostly further down inside waves, well beyond the distance IR gets to.

tallbloke,
I’ve put a post on my website which may make it clearer. It shows the temperature gradients in the top surface layers. In the last mm the gradient, day and night, is negative upward, indicating that the heat flow is upward.

So, evidence that the surface emits the IR – wherever the IR is emitted from, heat has to reach it from some source, and this is mostly the lower depths (where the sunlight was absorbed). If the IR was emitted from vapor above, how does the heat get there to keep the vapor from cooling?

Downward IR “makes up the heat”. The length scale for absorption is the same as the scale for emission – comparable to a wavelength. Heat is being removed by emission, and created by thermalisation of downwelling IR. The two processes must add.

I didn’t see the definition of convection in your link – is it in the comments? But I’m referring to turbulent transport by advection in eddies. The eddies are forced by ocean motions generally, and particularly wind stresses near the surface. They scale down towards the surface, but are quite adequate to convey the upflux of heat which is radiated, and would be also capable of conveying a similar downflux, although that rarely arises.

I’ve only recently found your blog and am impressed by its calm tone and attention to detail.
I’m also fairly new to the subject matter; for a newcomer like me a statement like “The energy in wavelengths greater than 4μm is less than 1% of the total solar energy and conventionally, we call solar radiation shortwave.” could give rise to confusion. I would feel less liable to confusion if radiation was classified according to the nature of the waves (infra-red/visible/ultraviolet) rather than according to their source. Is solar infrared radiation negligible (as compared with, say, terrestrial or atmospheric infrared)? Is it, like light, also described as short-wave? Perhaps I need a glossary of these overlapping usages. I spent months of confusion wondering what people meant by ‘top of the atmosphere’. Sometimes they seemed to be referring to the tropopause.

If I’d noticed your post sooner, I would have made a comment very similar to Nick’s first post here, but let me try to rephrase my way which might help communicate the point to a broader group…

From Kirchhoff’s thermal radiation law, the emissivity of any given ocean layer at a given wavelength is the same as absorptivity. So ScienceOfDoom’s discussion above about absorption of downwelling atmospheric radiation of different wavelengths by the ocean applies equally well to *emission* of upwelling radiation to the atmosphere from the ocean.

If the temperature of that given ocean layer is higher than the effective atmospheric temperature associated with the downwelling radiation, then the emitted radiation from that layer will be greater than the absorbed radiation, and the net flow between ocean and atmosphere will be from ocean to atmosphere, cooling off the ocean slightly. The effect of the absorption of downwelling radiation is best viewed as, the same as for land surface, a *reduction in the rate of cooling* via radiation.

If the temperature of the ocean layer is less than the effective atmospheric temperature (likely only for limited wavelength ranges where atmospheric absorption and emission is high enough so the emission comes from low altitudes that really are warmer) then the absorption will be greater than emission, and the downwelling radiation from the atmosphere will warm the ocean.

It appears to me the difference between what Nick and Science of Doom say is in fact related to that dull point made in Note 1 (the definition of heating). The back radiation does exist, but the net radiation balance can be up or down depending on whether the air is cooler or warmer than the water. Generally the water is warmer, so on average the net radiation flux is water to air. Energy is transported either up or down by conduction, driven by convection at the air-water boundary, depending on which is hotter, but energy is always transported up into the air by evaporation. I don’t necessarily agree with Nick that the top surface is always cooler than the immediate layer below. If humidity is near 100%, the net evaporation rate goes to near zero. In addition, the air may be so close to the surface temperature that the net radiation flux is near zero. In that case, absorbed solar radiation may make the top surface the warmer layer.

The reason for expecting a negative upwards temperature gradient at the surface is based on steady-state flow balance. If you think of the top cm layer, an average 235 W/m² passes through as SW. This does not interact with the temperature gradient (as a flux). Almost all of it has to pass back to the atmosphere. In this cm layer, it is conveyed by conduction or turbulent advection. Both cases require a temperature gradient.

It’s true that there will be times when the atmosphere is unusually warm and the surface flow is reversed. My main point is that if the flow can go one way without huge temp differences, it can go the other.

Yes, they are old fuddy-duddies, who can’t get with the free and easy, make-it-up-as-you-go-along climate scientists.
These people are so misguided as when they see the blog title “Science” they assume that a word like heat will be used properly.
The general public don’t know any better than to describe the weight of a bag of sugar as 2 kilograms.
Should we follow their example?
What about sticking to phlogiston, we all know what they mean, don’t we?

scienceofdoom, a very well researched article. And Nick Stokes, your piece is interesting as well.

I approach this as someone who surfed and dived (both day and night) for a couple of decades in the deep tropical ocean.

Now, the human body is a marvelous instrument, capable of detecting temperature differences. So I am very familiar with the thermal structure of the upper ocean, but not as curves on a graph. I know its structure because I have spent a lot of time there.

Nick, you are right that there is a “hook” in the temperature curve. The surface is radiating and evaporating, so perforce it is cooler than the immediate substratum.

However, that “hook” is a slight drop from the higher temperature that is found just below the surface. And that temperature is the temperature that is affected by the DLR.

Yes, some of the energy in the immediate subsurface moves downwards. It does this by turbulence. However, that downwards motion through turbulence is opposed by the thermal stratification of the ocean (daytime) and also by the thermal circulation of the ocean (nighttime).

As a result, a larger percentage of the DLR energy that is absorbed just below the surface gets re-radiated or evaporated than is the case with solar energy which is absorbed at depth.

How much more? Aye, there’s the rub … As you know, the ocean is the opposite of the atmosphere. The atmosphere stratifies at night, and overturns vertically during the day.

The ocean, on the other hand, thermally stratifies during the day, and overturns at night. This is also the time when both radiation sources are active. The solar radiation heats the bulk, and the DLR heats the skin. Both contribute to the increased loss due to convection, conduction, and evaporation.

During the day, the DLR pushes the peak surface temperature way higher than it would be if it were just heated by the sun. It is quite perceptible when swimming in the tropical ocean on a warm day with no wind. Sometimes I’ve been swimming in very warm water, and when my arm goes down vertical the warm water stops at about my forearm. Lie still in that water, and you can feel how warm the very surface is driven by the combination of DLR and solar energy.

At night, the surface cools below the temperature of the underlying layers. This soon starts vertical thermal circulation, and we get an entirely different situation. Slow upwelling begins over large areas, with interspersed smaller areas of faster-descending columns of cooler water.

Of course there is no sun at night. All of the DLR which is entering the ocean is on a one way circulation path. It is absorbed immediately below the surface. It then moves upward (not downward) and radiates, conducts, and evaporates its energy upwards. Thus cooled, the water then sinks in the descending column.

So at night, one effect of the DLR is to slow the vertical circulation of the ocean.

Is there an overall conclusion? My only conclusion is that more of the DLR energy ends up quickly re-emitted upwards (whether by conduction, convection, or evaporation) than happens with the corresponding solar energy. Particularly in clear tropical waters, much of that energy enters the ocean at tens of metres of depth. A much smaller percentage of the solar energy is quickly re-emitted back upwards into the atmosphere or to space.

Willis,
Thanks for your comments here. My interest in the profiles is not so much the temperatures themselves, but what they say about fluxes. The “hook”, or maximum, must be associated with a heat source, which is the absorbed sunlight. That’s why it has to go away at night. We know it’s a source, because heat flows away in both directions.

I think it’s interesting that the temperature variations, though not huge, are in the range of 1–2°C, which I’m sure could be felt, as you say.

There’s always a chicken-egg aspect to reasoning about heat fluxes and temperatures. Temperature gradients induce fluxes, but heat is conserved, so fluxes have to be continuous, and so force temperature gradients.

But at surfaces, you often need to think first about temperatures. A downward IR flux, say, does not of itself force surface evaporation, or indeed upward IR emission. Those depend on surface temperature. The temperature has to settle at a value which will bring the fluxes into balance. That’s why there is that steep gradient drop near the surface. It doesn’t look steep in the diagram, but it is if you allow for the compressed depth scale. The steepness reflects the larger thermal resistance in the top mm and the 235 W/m² flux that must pass through (because of insolation). The gradient would be steeper, and the surface colder, if it were not for the downwelling IR.

I know that’s not all strictly in response to your comment, but you started me thinking about it.

“Heat is defined as any spontaneous flow of energy from one object to another, caused by a difference in temperature between the two objects.”

Correct, but you missed out the part that

Heat always flows from the higher temperature object to the lower temperature object.
……………………………………………………………………
Work and thermal energy.

When various examples of Heat Engines are given in textbooks they ALWAYS show the engine as working between two reservoirs.

Heat is always taken from the higher reservoir, some work is done, and the unused heat is transferred to the lower-temperature reservoir.

Nobody has tried the insane idea of the engine working by taking thermal energy from the lower-temperature reservoir, doing some work, and then transferring the unused heat to the higher-temperature reservoir.

“Heat is always taken from the higher reservoir, some work is done, and the unused heat is transferred to the lower-temperature reservoir.”

Yes… and if you increase the “heat” (pressure, whatever) of the exhaust reservoir, it affects the amount of work your engine can do… If you have an engine exhausting into a perfect vacuum, your engine will run A LOT more efficiently than at 1 bar and 300 K… you seem to be implying that it would make no difference?

Heat always flows from the higher temperature object to the lower temperature object.

It’s true because the net flow of heat is always from the hotter to the colder.

Strictly speaking that is not the definition of heat. That is a characteristic of heat.

Nobody has tried the insane idea of the engine working by taking thermal energy from the lower-temperature reservoir, doing some work, and then transferring the unused heat to the higher-temperature reservoir.

It would be insane to try that because it doesn’t work.

Over in the article Amazing Things we Find in Textbooks – The Real Second Law of Thermodynamics you have the opportunity to state what you think about the specific point of disagreement – rather than random stuff that everyone does agree on. No doubt you will avoid the real question – does a hotter body absorb radiation from a colder body? You will avoid it because the experts in the field say “yes it does” and you say “no it doesn’t”.

I’m probably a bit of a Devil’s advocate here, but according to the definition:

Heat is defined as any spontaneous flow of energy from one object to another, caused by a difference in temperature between the two objects.

Back radiation doesn’t heat the ocean, as the energy transfer is not caused by a difference in temperature.

I tend to agree with what Leonard said under the other post, that it is probably better to avoid the use of “heat” in such a context. While I understand what you mean by it here, it can be somewhat misleading and thus a pointless complication for understanding the subject.

To sum up – I think it would be better to avoid saying that back radiation is “heating” the ocean.

I’ve got a random question… OK, so with water being totally opaque to LW, it is ignored as a transfer mechanism through a body of water – I’m assuming because LTE would make it irrelevant. Now, with turbulent mixing basically working to greatly increase the surface area of, say, a warmer flow within a colder body of water, is the subsequent enhanced energy transfer partly due to radiative transfer as well as conduction in a fluid?

Mike,
you can’t really distinguish between radiative transfer and conduction within a near-opaque medium. With high opacity radiative transfer obeys a similar equation – the Rosseland regime. Any experimental measurement of conductivity will include it.

But if you know the absorption coefficient you can calculate the IR component of transfer. It will probably be very small.

Mike, No, it’s not a silly question – for more transparent substances it would be a real issue. It’s just that for something as opaque as water on a macroscopic scale, the IR component of internal transfer can’t be detected externally.