Ocean heat content revisions

Hot on the heels of last month’s reporting of a discrepancy in the ocean surface temperatures, a new paper in Nature (by Domingues et al, 2008) reports revisions to the ocean heat content (OHC) data – a correction required because of other discrepancies in measuring systems found last year.

Before we get to the punchline though, it’s worth going over the saga of the OHC trends in the literature over the last 8 years. In 2001, Syd Levitus and colleagues first published their collation of ocean heat content trends since 1950, based on archives of millions of profiles taken by oceanographic researchers over the previous 50 years. This showed a long-term upward trend, but with some very significant decadal variability – particularly in the 1970s and 1980s. The long-term trend was in reasonable agreement with model predictions, but the decadal variability was much larger in the observations.

As in all cases where there is a data-model mismatch, people go back to both in order to see what might be wrong. One of the first suggestions was that since the spatial sampling became much coarser in the early part of the record, there might be more noise earlier on that didn’t actually reflect a real ocean-wide signal. Sub-sampling the ocean models at the same sampling density as the real observations did increase the decadal variability in the diagnostic but it didn’t provide a significantly better match (AchutaRao et al, 2006).
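The sub-sampling test described above can be illustrated with a toy calculation (purely illustrative numbers, not the actual method of AchutaRao et al): a "global mean" computed from a sparse sample of a noisy field wobbles far more than one computed from full coverage, which is why coarse early sampling can inflate apparent decadal variability.

```python
import random

random.seed(0)

def mean(xs):
    return sum(xs) / len(xs)

# Toy gridded anomaly field: a small common signal plus cell-level noise.
n_cells, signal = 1000, 0.1
field = [signal + random.gauss(0.0, 1.0) for _ in range(n_cells)]

full_mean = mean(field)

# Re-draw a sparse 50-cell "observing network" many times and measure how
# much the apparent global mean wobbles purely from sampling.
sparse_means = [mean(random.sample(field, 50)) for _ in range(200)]
centre = mean(sparse_means)
spread = mean([(m - centre) ** 2 for m in sparse_means]) ** 0.5
```

With full coverage the mean sits close to the true signal; the sparse estimates scatter around it with a standard deviation an order of magnitude larger, mimicking spurious "variability" in the early record.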

Other problems came up when trying to tally the reasons for sea level rise (SLR) over that 50 year period. Global SLR is the sum of (in rough order of importance) ocean warming, land ice melting, groundwater extraction/dam building, and remnant glacial isostatic adjustment (the ocean basins are still slowly adjusting to the end of the last ice age). The numbers from tide gauges (and later, satellites) were higher than what you got by estimating each of those terms separately. (Note that the difference is mainly due to the early part of the record – more recent trends fit pretty well.) There were enough uncertainties in the various components, though, that it wasn’t obvious where the problem lay.
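The budget check being described is simple arithmetic; a sketch with made-up numbers (the component values and uncertainties here are illustrative, not the published estimates):

```python
from math import sqrt

# Hypothetical SLR budget components, mm/yr, with 1-sigma uncertainties.
components = {
    "thermal expansion":  (0.8, 0.3),
    "land ice melt":      (0.7, 0.2),
    "terrestrial water":  (-0.1, 0.2),
}
observed = (1.6, 0.3)  # illustrative tide-gauge rate and uncertainty

total = sum(v for v, _ in components.values())
# Combine independent uncertainties in quadrature.
err = sqrt(sum(e * e for _, e in components.values()))

# The budget "closes" if the summed components overlap the observed rate
# within the combined uncertainties.
closes = abs(observed[0] - total) <= observed[1] + err
```

The historical difficulty was exactly this: the observed rate sat outside what the summed components could explain, and the individual error bars were too wide to say which term was at fault.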

Since 2003, the Argo program has seeded the oceans with autonomous floats which move up and down the water column and periodically send their data back for analysis. This has at last dealt with the spatial sampling issue (at least for the upper 700 meters of the ocean – greater depths remain relatively obscure). Initial results from the Argo data seemed to indicate that the ocean cooled quite dramatically from 2003 to 2005 (in strong contradiction to the sea level rise, which had continued) (Lyman et al, 2006). But comparisons with other sources of data suggested that this cooling was only seen by the Argo floats themselves. Thus when an error in the instruments was reported in 2007, things seemed to fit again.

In the meantime, however, calibrations of the other sources of data against each other were showing some serious discrepancies as well. Ocean temperatures at depth are traditionally measured with CTDs (a probe lowered on a line that provides a continuous temperature and salinity profile), Nansen bottles (water samples collected from specified depths) or XBTs (eXpendable bathy-thermographs), which are basically just thrown overboard. CTDs are used over and over again and can be calibrated continuously to make sure their pressure and temperature measurements are accurate, but XBTs are free-falling, and the depths from which they are reporting temperatures need to be estimated from the manufacturer’s fall-rate calculations. As the mix of CTDs, bottles, XBTs and floats has changed over time, minor differences in the bias of each methodology can end up influencing the trends.
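The depth inference for an XBT works roughly like this; a minimal sketch using the standard quadratic fall-rate form z = a·t − b·t² (the coefficient values shown are the commonly quoted Hanawa-type numbers, used here purely for illustration):

```python
def xbt_depth(t, a=6.691, b=0.00225):
    """Inferred probe depth (m) after t seconds of free fall, from the
    quadratic fall-rate equation z = a*t - b*t**2. Coefficients are
    illustrative values of the widely used Hanawa-type fall rate."""
    return a * t - b * t * t

# A few percent error in the assumed fall rate maps a measured temperature
# onto the wrong depth, which is how fall-rate errors bias heat content.
z_nominal = xbt_depth(60.0)           # depth assigned after one minute
z_biased = xbt_depth(60.0, a=6.5)     # slightly slow assumed fall rate
depth_error = z_nominal - z_biased    # >10 m offset from a ~3% rate error
```

Because temperature usually decreases with depth, assigning a reading to the wrong depth systematically warms or cools the inferred profile, and the size of that bias depends on probe type and era.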

(If this is all starting to sound very familiar to those who looked into the surface stations or sea surface temperature record issues, it is because it is the same problem. Almost all long historical climate records were not collected with the goal of climate in mind.)

In particular, analysis (or here) of the XBT data showed that it was biased warm compared to the CTDs, that this bias changed over time, and that it depended on the kind of XBT used (deep versus shallow). Issues with the fall-rate calculation were well known, but corrections were not necessarily being applied appropriately or uniformly, and in some cases were not correct themselves. The importance of doing the corrections properly has been subject to some ongoing debate (for instance, contrast the presentations of Levitus and Gouretski at this meeting earlier this year).

So where are we now? The Domingues et al paper that came out yesterday and a companion paper from essentially the same group (in press at Journal of Climate) have updated the XBT corrections and dealt with the Argo issues, and….

… show a significant difference from earlier analyses (the new analysis is the black line). In particular, the difficult-to-explain ‘hump’ in the 1970s has gone (it was due to the increase in warm-biased XBTs at that time). The long-term trend is slightly higher, while the more recent trends are slightly lower. Interestingly, while there is still decadal variability, it is much more obviously tied to volcanic eruptions than was previously the case. Note that this is a 3-year smooth, so the data actually go to the end of 2004.
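Why a 3-year smooth shortens the effective record can be seen from a minimal centered running mean (illustrative implementation; actual analyses may handle endpoints differently):

```python
def running_mean(series, window=3):
    """Centered running mean. The half-window at each end has no valid
    centered average, so a 3-year smooth of annual data drops the first
    and last year - which is why a smoothed curve ends a year early."""
    half = window // 2
    return [sum(series[i - half:i + half + 1]) / window
            for i in range(half, len(series) - half)]

# Five annual values; the smooth returns only the three interior means,
# and the large final value is diluted into its neighbour's average.
smoothed = running_mean([0.0, 1.0, 2.0, 3.0, 10.0])
```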

So what does this all mean? The first issue is tied to sea level rise. The larger long term trend in ocean warming reported here makes it much easier to reconcile the sea level estimates from thermal expansion with the actual rises. Those estimates do now match. But remember that the second big issue with ocean heat content trends is that they largely reflect the planetary radiative imbalance. This imbalance is also diagnosed in climate models and therefore the comparison serves as an independent check on their overall consistency. Domingues et al show some comparisons with the IPCC AR4 models in their paper. Firstly, they note that OHC trends in the models that didn’t use volcanic forcings are consistently higher than the observations. This makes sense of course because each big eruption cools the ocean significantly. For the models that did include volcanic forcings (including the model we used in Hansen et al, 2005, GISS-ER), the match is much better:

(Note that the 3-year smoothed observations are being compared to annual data from the models, the lines have been cut off at 1999, and everything is an anomaly relative to 1961). In particular, the long term (post 1970) observational trends are now a better match to the models, and the response to volcanoes is seen clearly in both. The recent trends are a little lower than reported previously, but are still within the envelope of the model ensemble. One interesting discrepancy is noted however – the models have a slight tendency to mix down the heat more evenly than in the observations.

This isn’t going to be the last word on OHC trends: different groups will be publishing their own versions of this analysis relatively soon, and updates covering the most recent years are still forthcoming. But the big picture is that ocean heat content has indeed been increasing in recent decades, just as the models said it should.

165 Responses to “Ocean heat content revisions”

Gavin – It is interesting – at least from eyeballing the final figure; we’d need to see some statistics to be certain – how well the individual model ensemble runs match not just the magnitude of OHC variability since the 1970s but also the timing of the variability. That seems to suggest the variability is driven more by external forcing (the volcanoes, as you suggest) than internal variability. Otherwise, there’s no reason to expect the ensembles to match the timing of the swings in OHC. If correct, that serves as yet another reminder that one needs more than two or three decades of data to properly estimate decadal climate variability.

Thanks for the very clear discussion and update – it is greatly appreciated.

RE#1 Keep in mind that any quasi-periodic oscillations in the climate system (such as the best understood, the El Nino / Southern Oscillation) will themselves be affected by a warming ocean. Also, even long datasets like the 20th century El Nino / La Nina record are of no use in predicting the onset and strength of the next El Nino.

Similarly, the increase in global ocean heat content is probably going to have some effects on the global ocean circulation – effects that are also very difficult to predict. A good example there is the Southern Ocean, where indications seem to be that the ocean may start warming a bit faster:

ScienceDaily (Dec. 5, 2006) — The Southern Ocean may slow the rate of global warming by absorbing significantly more heat and carbon dioxide than previously thought, according to new research.

The Southern Hemisphere westerly winds have moved southward in the last 30 years. A new climate model predicts that as the winds shift south, they can do a better job of transferring heat and carbon dioxide from the surface waters surrounding Antarctica into the deeper, colder waters.

However, what that means is that land and atmospheric temperatures might be a bit lower, but it also means that the ocean will warm more rapidly, possibly increasing the thinning of ice shelves and sea ice.

Great snakes! In the absence of significant volcanic eruptions the trend line eventually goes up FAST. It seems the models at least are telling us that, once the aerosol lid comes off, the GHGs lurking in the background resume pushing the system hard. Looks really bad for us if we don’t get any significant eruptions in the next decade or two. Yes, I’ve already given up any hope of CO2 reductions. Thus it seems we are now at the mercy of Nature.

“We add our observational estimate of upper-ocean thermal expansion to other contributions to sea-level rise and find that the sum of contributions from 1961 to 2003 is about 1.5 ± 0.4 mm yr-1, in good agreement with our updated estimate of near-global mean sea-level rise.”

Good post, interesting info. Just wanted to let you know, after a long and much-appreciated absence, I am sure, that “polar cities” have been given a second nickname, that is “Lovelock Cities” — in honor of the great man of England, Dr James Lovelock. He has seen the images of these blueprints and approved, saying in an email: “Thanks for showing me those images, Danny. It may very well happen and soon”.

Of course, Dr Lovelock says it might happen in 20 to 40 years, by 2050 at the latest. I am a bit younger than him, so I am still saying it won’t happen for another 500 years, but that it’s time to start thinking about these ideas now.

If ocean temperatures are in fact rising – and what you are seeing is not just natural long-term variability, which is a real possibility – at what point do you think the speed and duration of off-shore winds (before dawn, for example) might be measurably increased? If average off-shore (from land to water) wind intensities were also rising and could be charted, perhaps this would be a way to reinforce the implications inherent in the data and the theory?

Is this new analysis likely to help resolve the ‘missing ocean heat’ puzzle that was reported a few months ago? And if so, what other significant discrepancies between the models and data remain to be solved?

That’s not true if you define “of late” as the last 4 years. As of right now, global sea level has been declining for 2 years. Also noted is the fact the OHC paper discussed appears to stop at 2003. Much interesting stuff has happened post 2003.

I find it quite curious how all of our measuring devices that are discovered to have some kind of bias tend to show cooling compared to model predictions and need corrections towards warming. None has been discovered with a bias towards too much warming that needs correction. It’s as if the devices didn’t want us to believe in global warming. Radiosondes, MSU satellites, inlets and buckets, and now the devices of the Argo project too…

[Response: Your curiosity appears quite limited. The pre-war bucket corrections reduced the trend, UHI corrections reduce the trend, the XBT corrections are basically neutral in the long term, revisions to the various North Atlantic THC time series reduced the trend, the stratospheric trend in the radiosondes was reduced, …. etc – gavin]

I recently read about NASA people being surprised about the accuracy of their satellites for measuring sea-level change. They said it happened to be more than 10 times better than expected, because they showed the “right” sea-level increase. But I fail to understand how a measuring device can get more accurate results than predicted given its physical capabilities and limitations.

Sorry, what NASA says is that TOPEX/POSEIDON happened to give results three times more precise than expected regarding sea-level, not ten. They expected an accuracy of 5.9 inches and have found it to be 1.8 inches.

Is there a hypothalamus in the earth’s biosystem that serves to regulate and reset optimal temperature, comparable to the system found in humans?

Logically, it sounds like the ocean, since people do mention heat releases from the ocean as a way of regulating temperature.

However, to me that sounds more like an auxiliary system that switches on to regulate toward an ideal temperature.

For example, people shiver when the body’s muscles try to equalize the thermal differences it observes in reaction to fevers or the like. People shed excess heat through their heads, hands, feet, sweating, and so on.

In other words, what and where is the order-giver, like the hypothalamus, in our biosystem?

9 Danny Bloom: Lovelock is right, you are wrong. In 500 years, if we haven’t gotten GW under control this century, we will be extinct. Forget about polar cities, we don’t have time to build them before going extinct.
Environmental policy = energy policy
Energy policy = environmental policy
because Global Warming can lead to Hydrogen Sulfide gas coming out of the oceans.

“EARTH SCIENCE: Impact from the Deep. Strangling heat and gases emanating from the earth and sea, not asteroids, most likely caused several ancient mass extinctions. Could the same killer-greenhouse conditions build once again? By Peter D. Ward.
Downloaded from: http://www.sciam.com/article.cfm?articleID=00037A5D-A938-150E-A93883414B7F0000&sc=I100322
[Most of the article omitted]
But with atmospheric carbon climbing at an annual rate of 2 ppm and expected to accelerate to 3 ppm, levels could approach 900 ppm by the end of the next century, and conditions that bring about the beginnings of ocean anoxia may be in place. How soon after that could there be a new greenhouse extinction? That is something our society should never find out.”

Press Release, Pennsylvania State University, FOR IMMEDIATE RELEASE, Monday, Nov. 3, 2003.
Downloaded from: http://www.geosociety.org/meetings/2003/prPennStateKump.htm
“In the end-Permian, as the levels of atmospheric oxygen fell and the levels of hydrogen sulfide and carbon dioxide rose, the upper levels of the oceans could have become rich in hydrogen sulfide catastrophically. This would kill most of the oceanic plants and animals. The hydrogen sulfide dispersing in the atmosphere would kill most terrestrial life.”

“Under a Green Sky” by Peter D. Ward, Ph.D., 2007.
Paleontologist discusses mass extinctions of the past and the one
we are doing to ourselves.

ALL COAL FIRED POWER PLANTS MUST BE CONVERTED TO NUCLEAR IMMEDIATELY TO AVOID THE EXTINCTION OF US HUMANS. 32 countries have nuclear power plants. Only 9 have the bomb. The top 3 producers of CO2 all have nuclear power plants, coal fired power plants and nuclear bombs. They are the USA, China and India. Reducing CO2 production by 90% by 2050 requires drastic action in the USA, China and India. King Coal has to be demoted to a commoner. Coal must be left in the earth. If you own any coal stock, NOW is the time to dump it, regardless of loss, because it will soon be worthless.
I have no financial connection to the nuclear power industry.

I apologize if I didn’t understand, but I thought that there was also an OHC “problem” between 2003 and 2006 or even 2007. In that last period, I read from Willis (NOAA) that there was a difference between altimetric sea level (Jason) and the sum of the mass (GRACE) and steric (ARGO) “sea level” contributions. Please could you set me straight?

Gavin: Haven’t you noted the impacts of ENSO on OHC in prior posts? It seems to me that there would be a better correlation between those two than volcanoes and OHC. I know I’ve seen a graph comparing OHC and ENSO, but right now I can’t find it in the realclimate archives. Could you post one, please?

Regards.

[Response: I’ve speculated on this previously, but I don’t have any hard numbers nor figures. I suspect that the reason why volcanoes show up more strongly in the ocean-wide analysis is because they have a global effect. Impacts of ENSO on OHC are likely to be more regional – implying perhaps a more ambiguous global signal. Perhaps the authors of the latest study could be persuaded to do the ENSO-OHC regression? – gavin]

[Response: I think there is some mixing of time scales here. For the short period data the error bars are larger than for the the full satellite period, and if you want to close the budget for just the most recent 4 years period the uncertainties in each individual term are very significant. It’s not obvious to me that this is well constrained at all. The Domingues et al paper closes the budget over a 40 year period where the uncertainties in trends are much less. – gavin]

1) In pre-history humans survived even Tambora’s impacts and ice ages (Humans are thought to have originated 200k years ago).

2) Humans have lived for generations before modern technology on every continent, apart from Antarctica. And in harsh environments from the Kalahari to the Arctic circle. We are mobile and our intelligence and language give us the ability to adapt on a sub-generational timescale.

Gavin,
Thanks for the write-up.
Get ready for the wave of corrections and retractions from the denialist lobby. ;)

Extinction isn’t the only bad outcome from a calamitous warming. Wars over such elemental things as food and water menace many 3rd world countries. (And we’re mired in endless war in the MidEast over oil.) The hundreds of millions threatened with famine and increased spread of disease won’t feel so cavalier about the costs.

[Response: I imagine it’s probably the actual paper, and I’d point out that this concludes there is a problem with one or more of Argo/Jason/GRACE over a four year period (2003-2007), and isn’t relevant to the Domingues paper which is about the attribution over a much longer period and does not involve any of those observing systems. – gavin]

Interesting that the volcanic influence seems to follow the source by 7 years. I see this data and I get curious about the data sets used in the study when I see TAO/Triton graphs such as these…

As Dr. Schmidt would likely suggest, the NOAA data is probably a regional phenomenon and not global. Though, when I look at the data at this site ( http://www.pmel.noaa.gov/pirata/display.html ) I see a similar lack of an increasing heat trend. If this is true and there is still supposed to be a heat content increase, causing thermal expansion, it makes me wonder about the apparent conflict between the data posted by NOAA and the study.

I hope that the most recent satellite sent up (Jason 2) will help resolve most of the issues. I suspect the observations by GRACE may be regional and am hoping the observations by the OSTM/Jason 2 mission package will provide a deep space global overview.

(Though most of the descriptions I have read of the Jason 2 mission suggest that the observation window is also regional. Too bad; a long-term global/atmospheric measure, coupled with the change in the refraction/deflection of solar light by the atmosphere or the top 100 meters of ocean water at the edge of the terrestrial sphere, might offer an interesting insight when coupled with the observations of SeaWiFS…)

Semi–OT (and as mentioned by l david cooke): I am very thankful that our governments can work together in the form of international science projects. I see this as a hopeful sign. I wish the best for Jason-2 and all the folks that continue to work so hard on it (and all of us that have helped pay for it). More data here is very useful.

I have to say that I agree with Edward. But since it is impossible then we are all doomed :-(

The problem is that no-one will face up to that possibility, so that when it becomes obvious that that is the way we are heading, it will be too late! Everyone is mocking Edward Greisch, but what hope have we of cutting back on our fossil fuel consumption when even a small rise in the price of oil has sparked strikes and riots.

There is no way that we are going to take the action needed to prevent the catastrophe that will end civilisation.

I agree that we urgently need to phase out the burning of coal to generate electricity as quickly as possible.

Nuclear power plants cannot be built “immediately”. It is simply impossible to build nuclear power plants fast enough to significantly reduce CO2 emissions from coal-fired electricity generation within the time frame that this needs to occur.

Conservation and efficiency improvements, on the other hand, can be implemented almost immediately. When electricity prices skyrocketed in California a few years ago, conservation measures effectively reduced electricity consumption on a time scale of weeks.

Solar and wind-generated electricity generation can be brought online much faster than nuclear power, and the USA has sufficient solar and wind energy resources to produce far more electricity than the country currently uses. And there are other technologies available: recovering waste heat from industrial processes and using it to generate electricity could produce more electricity than all the nuclear power plants in the USA.

Alastair McDonald wrote: “… since it is impossible then we are all doomed … There is no way that we are going to take the action needed to prevent the catastrophe that will end civilisation.”

It is not “impossible”. Full exploitation of available wind and solar energy resources, combined with maximizing efficiency, “green building” technologies, electrification of transport (electric rail and electric cars), supplemented with sustainably produced biofuels where appropriate, all using existing technologies, can relatively easily accomplish the transition to a near-zero carbon energy economy within the necessary time frame.

Whether we WILL do what we most certainly CAN do, is another story. The barriers are not technological nor economic, they are political. And in that regard, I am also pessimistic.

But sinking hundreds of billions of dollars into boondoggles like nuclear power plants and “clean coal” is going to accomplish nothing, and worse, will squander and waste both time and financial resources that would be far more effectively spent on clean renewables and efficiency.

Is it now fair to say that the real continuing “problem” with Argo/Jason/GRACE is that there is misplaced or overstated reliance upon altimeter-derived measurements as a proxy for heat in the upper ocean, when there may be a case for thermosteric expansion of the deep ocean producing the same or similar sea level rise? If so, was the use of altimeter data for comparative purposes in finding “spurious cooling” in the Argo data appropriate in the first instance, given such expansion of the deep?

Dr. Willis himself says in
Willis et al. (2008), “In situ data biases and recent ocean heat content variability”, Journal of Atmospheric and Oceanic Technology (in revision),
that ARGO data still show “no significant warming or cooling is observed in upper-ocean heat content between 2004 and 2006”, and in
Willis et al. (2008), “Assessing the globally averaged sea level budget on seasonal to interannual timescales”, he further says:
“First, from 2004 to the present, steric contributions to sea level rise appear to have been negligible… Although the historical record suggests that multiyear periods of little warming (or even cooling) are not unusual, the present analysis confirms this result with unprecedented accuracy.”

Lastly, I am not aware of any comment from Dr. Hansen about the Domingues et al. 2008 paper. I understood from “Earth’s Big Heat Bucket” (at http://earthobservatory.nasa.gov/Study/HeatBucket/ )
that Dr. Hansen looked to the ocean and Willis for the “smoking gun” of earth’s energy imbalance caused by greenhouse gases. Do you know if Dr. Hansen is ready to herald the “smoking gun” based upon Domingues et al. 2008?

Is there a reason why volcano-induced cooling appears to be larger in magnitude in the models than in the observations? (Especially in the GISS models, it seems to me, although it’s hard to pick out individual models in that graph.) Perhaps it’s due to the extra smoothing of the data, but I don’t know if that explains all of it. I think I’ve noticed this in surface temperature data-model comparisons as well. Are the models over-sensitive to volcanic forcing?

[Response: The smoothing is part of it, but there is also some uncertainty in the stratospheric aerosols. The overall fit to radiative perturbations and temperature changes is pretty good though (see Hansen et al, 2007). – gavin]

Re: #32:“It is not “impossible”. Full exploitation of available wind and solar energy resources, combined with maximizing efficiency, “green building” technologies, electrification of transport (electric rail and electric cars), supplemented with sustainably produced biofuels where appropriate, all using existing technologies, can relatively easily accomplish the transition to a near-zero carbon energy economy within the necessary time frame.”

As you indicate, much of the problem is political, not technological. But it is extremely complicated.

China and India, China in particular, are heavily involved in the construction of coal-powered electric plants that will be sending large amounts of CO2 into the atmosphere for the next 40 years.

The efforts of these developing countries may overpower the efforts of the United States even if the U.S. did what it should in the conservation efforts and in the efforts to use solar, wind, nuclear, and other power sources that will not produce CO2.

And I do not see within the U.S. the political climate to allow us to do nearly enough in this direction. I see some movement in this direction, but it will not be sufficient to keep the worst case situations from happening.

I am very pessimistic about the long term future. But fortunately for me my age (67) will likely keep me from seeing the worst of it. It is my grandchildren that I worry about.

The differences are all on the other side of the turbine house, where coal has boiler houses, coal handling / milling, flue gas handling, ash handling and disposal. It’s far from immediately obvious that you couldn’t just slot in a PWR containment building and hook up the existing steam lines to the heat exchanger.

“Impossible”, no. Economic, well, that’s unlikely, but there might be a few cases where it could work.

Uranium is mined and has a “peak uranium” issue, just like “peak oil”.

It can buy 10-20 years of time, but it isn’t a solution. Reactors are also ridiculously expensive and time consuming to construct, even if we decided to drop everything and build them. Without massive subsidies they aren’t going to get built, which is a drag on the economy.

It is political and embedded. China and India are the not so new ditch diggers of corporate America.

Assuming the disaster scenario is as significant as advertised, the only answer is to treat global warming with the same concern as the nuclear threat.

The problem is that everything we eat, use and buy depends on oil and China. There is not a single thing in your supermarket that got there without oil, organic or otherwise. Almost everything in your house – what you wear, what you use, etc. – is made in China and most likely out of oil.

America is a drug addict whose system cannot survive without the drug (oil).

Cut-off the oil and break off trade with China and the US will instantly lose its power on the world stage and some other country will take its place. And then, who would stop them?

The question is no longer can it be stopped, but what is the best strategy to survive it?

Nanosolar just announced that their new printing press for printing solar photovoltaic thin film runs at 100 feet (30 m) per minute and can theoretically be pushed to 2000 feet (610 m) per minute. At the slower speed, this means that a single press is able to pump out 1 gigawatt of generation capacity per year. The total US generation capacity is 1000 GW. So at the slower speed, 100 printers could produce enough panels in 10 years to replace the current total US generation capacity. At the faster speed, 50 printers would do it in a year.
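The arithmetic in the comment above checks out under its stated assumptions (1 GW per printer per year at the slower speed, 1000 GW of US capacity, throughput scaling linearly with press speed):

```python
# Back-of-envelope check of the printer claims, using the commenter's numbers.
gw_per_printer_per_year = 1.0   # capacity output at 100 ft/min
us_capacity_gw = 1000.0         # total US generation capacity

# 100 printers at the slower speed:
years_with_100_printers = us_capacity_gw / (100 * gw_per_printer_per_year)

# 50 printers at the claimed 20x faster speed:
speedup = 2000 / 100
years_with_50_fast = us_capacity_gw / (50 * speedup * gw_per_printer_per_year)
```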

They say the printers will cost $1.6 million each.

They don’t say what it costs to run them, or what the electricity cost will be once you factor in all manufacturing costs, framing and installation. But clearly it will drop the cost of solar generated electricity significantly.

If this reduces the cost of solar below the cost of coal fired electricity, then it seems inevitable that we will see a rapid switch to solar.

(Of course the various energy storage technologies available and in development would need to be ramped up also, possibly along with adjustments to energy usage patterns and the installation of direct current transmission lines. But this is all doable.)

There are many things that technology can’t solve, but our emissions problem clearly can be, in spite of the clear preference of the political classes for dumb, destructive, dirty technologies.

I had the impression that looking at ocean heat content rather than tropospheric temperatures has the advantage that the signal is less variable, because of the ocean’s large heat capacity and resulting lag time in response (and a smoothing of the response). Therefore I wouldn’t have expected volcanic eruptions to leave a temperature signal in the ocean. Am I missing something?

Am I reading the first chart correctly: the heat content of the oceans has risen 10-fold in the last 40 years? How is that possible without a huge rise in either mass or temperature (neither of which seems to have happened)?

[Response: The graph is of the anomaly – not the absolute amount. The total energy in the oceans (compared to water at 0ºC) is vastly greater (mass of the ocean 1.4×10^21 kg x average temperature (maybe 5ºC) x specific heat (~4000 J/kg/C) = ~2.8×10^25 J). – gavin]
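The back-of-envelope figure in the response can be verified directly. Note that the choice of reference temperature only shifts the baseline by a constant, which is exactly why anomalies, not absolute heat content, are the meaningful quantity:

```python
mass_kg = 1.4e21   # approximate mass of the ocean
c_p = 4000.0       # approximate specific heat of seawater, J/(kg C)

def ocean_heat(mean_temp_above_ref):
    """Total ocean heat content relative to a chosen reference temperature."""
    return mass_kg * c_p * mean_temp_above_ref

heat_above_0c = ocean_heat(5.0)          # ~2.8e25 J, the figure quoted above
heat_above_abs_zero = ocean_heat(278.0)  # ~1.6e27 J if referenced to 0 K
```

Either baseline dwarfs the plotted anomaly; subtracting the baseline is what makes the decadal changes visible at all.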

The volcanic eruptions that have this effect were tropical ‘Plinian’ eruptions. These eruptions (named after Pliny the Younger’s account of Vesuvius) eject sulphate aerosols and ash into the stratosphere, where they reflect sunlight back into space. This has a fundamental impact on the global energy budget, causing a cooling due to reduced incoming sunlight. So it would be expected to appear. The ocean integrates out (smooths) much of the weather that impacts surface/atmospheric temperatures.

There are 3 factors governing global planetary temperature at its most basic level,

Forgot to add…
The observed response is mainly damped by the short mixing times of the upper ocean layers (not to be confused with centennial overturning). The oceans are in a two-way flux of energy with the atmosphere; reduce the incoming shortwave incident upon the surface and you get cooling, and the response of the upper wind-mixed layer is quite rapid.
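The rapid mixed-layer response can be sketched with a one-box slab-ocean energy balance, dT/dt = (F − λT)/C, driven by a volcanic forcing pulse (all parameter values here are illustrative, not taken from any particular model):

```python
# Minimal slab (mixed-layer) ocean responding to a volcanic aerosol pulse.
rho, c_p = 1025.0, 4000.0   # seawater density (kg/m^3), specific heat (J/kg/K)
h = 70.0                    # assumed mixed-layer depth, m
lam = 1.5                   # climate feedback parameter, W/m^2/K
C = rho * c_p * h           # heat capacity per unit area, J/m^2/K

seconds_per_year = 3.156e7
dt = seconds_per_year / 365.0

T = 0.0
history = []
for step in range(3 * 365):          # three years, daily Euler steps
    year = step * dt / seconds_per_year
    F = -3.0 if year < 1.0 else 0.0  # ~1-year, -3 W/m^2 stratospheric pulse
    T += dt * (F - lam * T) / C
    history.append(T)

min_T = min(history)   # a few tenths of a degree of mixed-layer cooling
```

The mixed layer cools noticeably within the forced year and then relaxes back, which is why large eruptions leave a visible dip in upper-ocean heat content despite the ocean's overall thermal inertia.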

Willis et al., J. Geophys. Res., 113, C06015, doi:10.1029/2007JC004517, report oceanic enthalpic changes beyond the period covered by Domingues et al. They find little or no change in heat content for the upper ocean from about 2003 to now. I remain unconvinced of the efficacy of the current generation of models.

[Response: Why might that be? The Willis paper clearly states that there are a) unresolved issues with trends in one or more of the data sets they are looking at, and b) that short term variability in trends is to be expected both from past data and as seen in the models. This should be contrasted for the much longer time frame considered in Domingues et al (where issues of short term variability are much less important and trends much more defined). Please explain your reasoning as to why the latter study apparently weighs less in your deliberation. – gavin]

About the storage, I did myself a little calculation for pumped hydro in the Finnish situation.

At 1 km depth, a cubic m of water represents 10^7 J of energy. 1000 m^3/s thus represents 10 GW of power, about what a Finland sized country needs.

Over a day, this requires a storage volume of 10^8 m^3. Compare this to the volume of the projected Helsinki-Tallinn rail tunnel, 10^7 m^3. A facility could even be built in connection with this project. Bedrock excavation with dynamite is relatively inexpensive, and the rail can be used to get the rubble out…
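The pumped-hydro numbers above can be reproduced step by step (a sketch that ignores pump/turbine round-trip efficiency, which would increase the required volume somewhat):

```python
g = 9.81       # gravitational acceleration, m/s^2
rho = 1000.0   # density of water, kg/m^3
head = 1000.0  # 1 km of vertical drop, m

energy_per_m3 = rho * g * head     # ~1e7 J per cubic metre, as stated
flow_m3_s = 1000.0
power_w = energy_per_m3 * flow_m3_s  # ~10 GW at 1000 m^3/s

day_s = 86400.0
# Reservoir volume needed to sustain that power for one day:
volume_per_day_m3 = power_w * day_s / energy_per_m3   # ~8.6e7 m^3
```

This lands very close to the quoted 10^8 m^3, about ten times the stated 10^7 m^3 volume of the projected rail tunnel.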

How do we maintain +3 ppm of CO2 per year for two hundred years, getting to 900 ppm, when we know oil and gas reserves are already approximately half gone, so their extraction rate – and therefore annual CO2 contribution – will be falling soon? Likewise, coal reserves aren’t good for 200 years at today’s extraction rates; a peak within a few decades is more likely. There certainly aren’t the fossil fuel reserves to allow the current mechanism of CO2 accumulation to continue.
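For scale, the timeline the quoted scenario implies at a constant 3 ppm/yr (taking roughly 385 ppm as the then-current concentration, which is an assumption on my part, not a figure from the article):

```python
current_ppm = 385.0   # assumed present-day CO2 concentration
target_ppm = 900.0    # level quoted in the excerpted article
rate_ppm_yr = 3.0     # the article's accelerated accumulation rate

# Linear extrapolation: years until the target is reached.
years_needed = (target_ppm - current_ppm) / rate_ppm_yr   # ~170 years
```

About 170 years at a constant 3 ppm/yr, i.e. roughly "the end of the next century" as the article says; the commenter's point is whether extraction can actually sustain that rate for so long.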

[Response: The graph is of the anomaly – not the absolute amount. The total energy in the oceans (compared to water at 0ºC) is vastly greater (mass of the ocean 1.4×10^21 kg x average temperature (maybe 5ºC) x specific heat (~4000 J/kg/C) = ~2.8×10^25 J). – gavin]

Gavin!!! That 5 C should be 278 K and the final figure should be 1.5×10^27 Joules!!!

[Response: Not really. I gave the baseline I was using which is a common standard. Using absolute zero doesn’t give a good idea of what energy is actually usable. – gavin]

Oil, gas and coal are cheap, have over 100 years of know-how and energy infrastructure invested in them, and hence they entrench the mind. Politically and economically they dominate too, as they are cost-effective and readily available globally. The USA also gets a lot of economic and political leverage from oil in the form of petro-dollars, allowing the USA to print a lot of money to keep this system going and get a free lunch on the back of investments in dollars.

Therefore the whole idea of replacing them with something else, or indeed of just meeting future demand (7 TW) with alternatives, is only gaining credence in regard to oil and gas, as they are expensive at the moment and demand is rising. Coal, on the other hand, is locally available (China, Russia, Europe and the USA have plentiful reserves) and hence offers some energy security in an uncertain world.

Pioneering alternatives and a new economic and political landscape is hard work for politicians and will be a long time in coming. Although the landscape is moving slightly in regard to electricity production and transport, we have not really got a strategy yet, just some ideas.

Even if the USA hybridised every car in the USA today, in 7 years’ time we would still need the same amount of oil as we use today due to economic growth. 2 to 3% per annum is enough to double energy demand in 30 years.
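The doubling-time claim above can be checked with the standard compound-growth formula, doubling time = ln 2 / ln(1 + r):

```python
from math import log

def doubling_time_years(annual_growth_pct):
    """Years for a quantity to double at a constant compound growth rate."""
    return log(2.0) / log(1.0 + annual_growth_pct / 100.0)

t_at_2pct = doubling_time_years(2.0)   # ~35 years
t_at_3pct = doubling_time_years(3.0)   # ~23 years
```

So "2 to 3% per annum doubles demand in 30 years" is a fair round number: the true range is roughly 23 to 35 years.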