
Posted on 6 February 2014 by dana1981

According to the global surface temperature data set compiled by Kevin Cowtan & Robert Way, which achieves the best coverage of the rapidly-warming Arctic by filling in data gaps between temperature stations using a statistical method called kriging, 2013 was the 5th-hottest year on record (since 1850). The top three hottest years (2010, 2005, and 2007) were influenced by El Niño events, which cause short-term warming of the Earth's atmosphere.

Over the past decade, we've seen less warming at the surface and more warming in the oceans. This has been in large part due to a change in Pacific Ocean cycles. We're currently in a cycle that tends to produce more La Niña than El Niño events, which has resulted in the oceans accumulating more heat, leaving less energy than normal to warm the atmosphere. This in turn has led to the widespread myth that the slowed rate of increase of global surface temperatures means we no longer have to worry about global warming, or that its consequences won't be as bad as expected.

The fundamental flaw in this argument is that it neglects a key fact: cycles are cyclical. In the '80s and '90s when the Pacific Ocean was in the previous phase of this cycle, we saw more El Niño events and more warming of global surface temperatures than the average of climate models projected. However, we can separate out the short-term El Niño and La Niña influences from the human-caused global warming component in the simple manner first suggested by Texas state climatologist John Nielsen-Gammon, shown in this animated graphic:

The El Niño/Neutral/La Niña years here are categorized using a slightly modified approach from the one described in this post last year. In essence, a year with a significant (magnitude larger than 0.3 of the average of the 3 ENSO indices described in that post) surface cooling influence from a La Niña event is put in the La Niña category, ditto for El Niño, and a year with no significant influence is put in the Neutral category. The graphed data begin in 1966 to avoid the effects of the 1963 Mount Agung volcanic eruption.
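The categorization rule described above can be sketched in a few lines of Python. This is an illustration only, with made-up index values; the actual three ENSO indices and their processing are as described in the linked post.

```python
# Sketch of the ENSO categorization described above.
# A year is "La Nina" if the mean of its three ENSO index values is below -0.3,
# "El Nino" if above +0.3, and "Neutral" otherwise.

def classify_enso(index_values):
    """Classify a year from the mean of its three ENSO index values."""
    mean_index = sum(index_values) / len(index_values)
    if mean_index >= 0.3:
        return "El Nino"
    if mean_index <= -0.3:
        return "La Nina"
    return "Neutral"

# Illustrative (made-up) index triples for three hypothetical years
print(classify_enso([1.2, 0.9, 1.1]))    # El Nino
print(classify_enso([-0.6, -0.4, -0.5])) # La Nina
print(classify_enso([0.1, -0.1, 0.0]))   # Neutral
```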

For each of these three categories, the linear global surface warming trend for 1966–2013 is 0.16°C per decade. That is our long-term underlying global surface warming trend, caused almost entirely by human influences. Note that the colored data points tend to fall close to each of their respective trend lines. This tells us that, for example, an El Niño year today is about 0.6–0.7°C hotter than an El Niño year in the 1970s, and the same is true of Neutral and La Niña years.
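The per-category trends are ordinary least-squares fits of temperature anomaly against year. A minimal sketch of that calculation, using a synthetic series warming at exactly 0.16°C per decade rather than the actual Cowtan & Way data:

```python
# Minimal least-squares trend estimate, the kind of calculation behind the
# 0.16 C/decade figure (synthetic data here, not the actual C&W series).

def linear_trend(years, anomalies):
    """Ordinary least-squares slope of anomaly vs. year (degrees C per year)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

# Synthetic series warming at exactly 0.016 C/yr (i.e. 0.16 C/decade)
years = list(range(1966, 2014))
anoms = [0.016 * (y - 1966) for y in years]
print(round(linear_trend(years, anoms) * 10, 3))  # trend in C/decade -> 0.16
```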

What's also interesting is that despite being a Neutral year, 2013 was hotter than 1998, which saw one of the strongest El Niño events on record. This tells us that humans have caused as much global warming over the past 15 years as a powerful El Niño event. The difference is that an El Niño is a temporary event, while human-caused global warming is permanent, unless we can quickly pull a lot of carbon dioxide out of the atmosphere.

Cowtan & Way global surface temperature data, with the powerful El Niño year of 1998 in red and the Neutral year of 2013 in blue.

Due to the current phase of the Pacific Ocean cycles, of the past six years, four have been cooled by La Niñas. Seven of the past 15 years have seen La Niñas, compared to just four El Niños. Conversely, in the previous Pacific Ocean phase, the 1990s saw seven El Niño years to just two La Niñas. Thus natural ocean cycles amplified human-caused global warming at the Earth's surface in the 1990s, but have dampened it since 1999.

It's really interesting to compare the reactions to these two cycles. Today, climate contrarians are arguing that the slowed rise in surface temperatures means global warming is nothing to worry about – they're confusing short-term cycles with something meaningful in the long-term. Just a few years ago, mainstream climate scientists looked at the accelerated rise in global surface temperatures. Rahmstorf et al. (2007) concluded,

"The global mean surface temperature increase ... is 0.33°C for the 16 years since 1990, which is in the upper part of the range projected by the IPCC. Given the relatively short 16-year time period considered, it will be difficult to establish the reasons for this relatively rapid warming, although there are only a few likely possibilities. The first candidate reason is intrinsic variability within the climate system."

Climate scientists didn't panic and decide the short-term acceleration in rising surface temperatures meant that climate models were underestimating global warming, or that it would be worse than expected. They correctly suggested that it was probably just due to short-term noise from natural cycles. On the other hand, climate contrarians have overreacted to and misinterpreted the current short-term noise, incorrectly declaring that it means climate models are overestimating global warming and we have nothing to worry about.

2013 also saw an incredible amount of heat accumulate in the oceans – 2.5 x 10²² joules, which is equivalent to 390 million Hiroshima atomic bomb detonations, or over 12 atomic bomb detonations per second. 2013 tied 2006 as the year with the most energy accumulating in the oceans since the Argo buoy network began producing much-improved estimates of ocean temperatures to depths of about 2,000 meters. This is significantly higher than the average of about 4 atomic bomb detonations per second over the past decade, and also much higher than 2009. This helps explain why 2009 had higher surface temperatures than 2013 – less heat went into the oceans and more into the atmosphere that year.
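The arithmetic behind those figures is simple to check. The sketch below assumes the commonly cited Hiroshima yield of roughly 6.3 x 10¹³ joules, a value not stated in the article itself:

```python
# Back-of-envelope check of the Hiroshima comparison above, assuming
# a yield of ~6.3e13 J per detonation (not stated in the article).

ocean_heat_gain_joules = 2.5e22   # heat added to the oceans in 2013
hiroshima_yield_joules = 6.3e13   # assumed energy of one detonation
seconds_per_year = 365.25 * 24 * 3600

bombs = ocean_heat_gain_joules / hiroshima_yield_joules
print(round(bombs / 1e6))                  # ~397 million detonations
print(round(bombs / seconds_per_year, 1))  # ~12.6 per second
```

The result, roughly 400 million detonations and over 12 per second, matches the numbers quoted above to within rounding of the assumed yield.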

I studied kriging in a 500-level geostatistics university course many years ago. It's a technique developed in South Africa to interpolate gold grades, and it's certainly more sophisticated than inverse distance squared. However, as an investor you would be crazy to think that because you have interpolated your grade using kriging, you can drill your exploration holes 1 kilometre apart.
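For contrast with kriging, here is the simpler inverse-distance-squared interpolation the comment mentions, as a minimal sketch (not a production geostatistics tool; kriging would instead derive its weights from a fitted variogram):

```python
# Inverse-distance-weighted interpolation: estimate a value at an unsampled
# location from nearby (location, value) samples. With power=2 this is the
# "inverse distance squared" scheme mentioned above.

def idw_estimate(sample_points, target, power=2):
    """Weight each sample by 1 / distance**power; return a sample's value
    exactly if the target coincides with it."""
    num = den = 0.0
    for (x, y), value in sample_points:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return value
        w = d2 ** (-power / 2)
        num += w * value
        den += w
    return num / den

samples = [((0, 0), 1.0), ((1, 0), 3.0)]
print(idw_estimate(samples, (0.5, 0)))  # midpoint: equal weights -> 2.0
```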

You didn't need to show me the graph for me to guess that the C&W warming trend for the Arctic was pretty extreme. However, I am surprised at the steep trend in the Antarctic for the last 10 or 12 years. Judging by the dark blue line, it looks like it warmed about 0.6C in the last 10 years. That seems wrong.

You do have a good cross-check against it, however. Due to the elevation of the continent, the TLT trend should be very close to the SAT trend since the TLT sampling zone overlaps the SAT zone. If the C&W warming trend is substantially above the TLT trend for either or both datasets (RSS or UAH) then it is probably wrong. Which raises the question: why would you use C&W as opposed to the TLT sets for the last 34 years?

After all, having actual data should trump filling in gaps with interpolation techniques every time.

I haven't followed this closely, but I thought Loeb 2012 showed a slight increase in cloudiness with increased TOA (and hence increased albedo), but that doesn't necessarily mean negative feedback. Dessler and Loeb 2013 show a slight positive feedback from cloudiness. Either way, the effect appears to be small.

Klapper@52: That assumes temporal homogeneity in the AMSU data, an assumption we are unwilling to make on the basis of the divergence between UAH, RSS and STAR (also, RSS doesn't cover Antarctica). Also, you cannot assume that TLT temperatures reflect SATs at altitude, because of surface contamination issues.

So I have grave doubts about the validity of your test. However, for what it is worth, here are the trends on 1997/01-2012/12 for 90S-70S:

UAH v5.6 0.581C/decade

CW v2 krig 0.467C/decade

CW v2 hybrid 0.699C/decade

UAH falls almost exactly between these two reconstructions.

Validation against independent data sources provides a far more challenging and informative test of the reconstructions - this is one of the things we are working on at the moment. I'm afraid the backlog of results to write up is rather long though.

You might be right that the 0.6C figure for Antarctica is wrong. According to UAH, Antarctica was on average 1.2C warmer than normal during 2013, so 0.6C for the past 10 years could thus be too low. Good observation.

I can see I've been trumped on the Antarctic. I did rolling 10-, 15- and 20-year trends, which is a way either to optimize cherry-picking or to avoid it, depending on how you use the results, but didn't get your number of 0.58C/decade from UAH. Nor did using your exact time frame give the same number. However, since I'm extracting data using the KNMI Climate Explorer, I have to use v5.5, not v5.6.

Using v5.5, the numbers are not so close. UAH TLT 70S v5.5 gives a warming rate of only 0.45C/decade for 1998 to 2012 inclusive, which is below both of your C&W trends, and significantly below the "hybrid" version. The rolling 15-year trends do give some high warming rates in the range of 0.6C/decade, but those were trends ending in 2007, and the warming rate south of 70S has been declining since.

Without access to either C&W dataset (hopefully they will show up on the KNMI Climate Explorer) I can't comment on shorter or longer trends comparing TLT to the C&W surface data; however, I certainly can comment on how sensitive 15-year trends are for this area. If I had picked a 15-year trend ending in September 2012 rather than December, the trend would be 0.27C/decade, so I think 15 years is probably not enough for these kinds of data.
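That endpoint sensitivity is easy to demonstrate with synthetic data. The sketch below uses a seeded random series with a known underlying trend, not the actual UAH record, and shows how the fitted 15-year trend shifts when the window end moves by only a few months:

```python
# Demonstration of endpoint sensitivity: short trends in a noisy monthly
# series change when the window end shifts slightly. Synthetic, seeded data.
import random

random.seed(0)
months = 20 * 12
true_trend = 0.02 / 12  # 0.02 C/yr expressed per month
series = [true_trend * m + random.gauss(0, 0.3) for m in range(months)]

def trend_per_decade(values):
    """Least-squares slope of equally spaced monthly values, in C/decade."""
    n = len(values)
    mx = (n - 1) / 2
    my = sum(values) / n
    num = sum((i - mx) * (v - my) for i, v in enumerate(values))
    den = sum((i - mx) ** 2 for i in range(n))
    return num / den * 120  # per month -> per decade

window = 15 * 12
for end in (months, months - 3):  # window ending now vs. 3 months earlier
    print(round(trend_per_decade(series[end - window:end]), 2))
```

With noise of this size, the two 15-year fits typically differ noticeably even though the underlying trend is identical, which is the point being made about the UAH data.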

Which brings up the question of why you chose a 16-year period. Looking at your graph, it appears the C&W data have been smoothed, but the inflection point for very steep warming in Antarctica appears to be around 2000.

As for your comment on the contamination of the TLT data by the surface, Roy Spencer recently did a post on the record low Antarctic temperature and noted that the microwave emissivity of the surface is actually pretty constant in Antarctica (and much less of a challenge than areas with sea ice, like the Arctic).

In summary, you're not really answering the question of why a dataset which fills big holes is better than a dataset with actual data. Using the UAH v5.5 dataset I find at least the "hybrid" version of C&W appears to run hot for Antarctica.

To further check my trend calculations, I found a link to UAH TLT v5.6 data at Roy Spencer's site. However, that only allows me to analyze what Roy has prepackaged in the ASCII file, which is a column called SoPol covering 60S, not 70S. Comparing the 5.6 and 5.5 versions of TLT, though, I get no significant difference, so I don't think there is a big difference between my use of 5.5 and your use of 5.6.

I think a better test comparison for these noisy 70S data might be a period longer than 16 years, with the latitude closer to 75S to capture the place where I think TLT correlates with SAT best: over the high plateau. I think you're going to find that the C&W dataset is still running too hot, at least the hybrid version. I know it already runs hot compared to the 30-year global trend for UAH, and I suspect it runs really hot in the places where it purports to fill in SAT data holes.

Looks like I was wrong: the 2013 anomaly for Antarctica wasn't 1.2C as I wrote above, it was actually more than 1.4C.

The 2013 Antarctic temp info comes straight from Dr. Spencer: "The warmest areas during the year were over the North Pacific and the Antarctic, where temperatures for the year averaged more than 1.4 C (more than 2.5 degrees Fahrenheit) warmer than normal." http://nsstc.uah.edu/climate/2013/december/dec2013GTR.pdf

I sense a bit of trolling from topal, but I'll assume he is sincere. A lot of people think the way you do: that somehow ENSO is creating the global warming, that the heat is coming from the planet's core through underground volcanism or something. But this is certainly not the case. As many have pointed out, the oceans absorb more than 90% of the incoming radiation, including whatever is re-radiated by CO2's insulating effect. Oceans have an immense capacity to store heat, and the best way to see this is a simple experiment (you can find several YouTube videos of it) where you put a flame under a balloon with some water inside it. Many people still think they are watching a magic trick, while it's just physics in action - physics that our brains often have difficulty grasping, just like climate science really.

The planet acts just like any other physical body absorbing and re-radiating heat. If the heat escapes the body completely (into space) then it's gone. As long as the body receives the same amount of heat as it radiates out, the system is at equilibrium. If you modify the composition of greenhouse gases around this body, then some of that heat will be re-radiated back into the body, and its temperature will rise gradually until it radiates out the same amount that it is getting from both the source (the sun) and the greenhouse gas re-radiation. This simple fact is really all you need to know in order to understand where the heat is coming from, where it's stored and where it disappears. Any variation on top of this is really just noise. One mechanism of the Earth's ability to expel heat is the equatorial currents that give us El Niños.

This is often referred to as the energy imbalance, and considering that the atmosphere contains some 40% more CO2 than in pre-industrial times, there will be an energy imbalance for a very long time, and we can expect continued warming no matter what ENSO does, as the CO2 persists and will keep doing "its job" according to the laws of physics. If anything, we have managed to mask some of the warming by adding aerosols through increased use of coal this past decade, which reflect some of the incoming radiation in the atmosphere.

The past decade has certainly seen a lot of the heat going into the Arctic region, as we can see from rapid sea ice decline (even in the middle of winter it is now close to its lowest, according to Cryosphere Today) and massive temperature anomalies. The C&W study certainly shows that it's wrong to ignore this when looking at global temperatures, and that the warming is still stronger than ever, perfectly following the trend we know to be true from the physical facts I just described.
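The equilibrium described above can be checked with the classic zero-layer energy-balance calculation (a textbook sketch, not part of the original comment; the constants are standard published values). At equilibrium, absorbed solar flux equals emitted thermal flux: S/4 × (1 − albedo) = σT⁴.

```python
# Equilibrium temperature of a simple radiating body (idealized zero-layer
# model, not a climate model). Absorbed sunlight = emitted thermal radiation.

SIGMA = 5.670e-8         # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR_CONSTANT = 1361.0  # solar flux at Earth's orbit, W m^-2
ALBEDO = 0.30            # fraction of sunlight reflected away

absorbed = SOLAR_CONSTANT / 4 * (1 - ALBEDO)  # averaged over the sphere
t_eff = (absorbed / SIGMA) ** 0.25
print(round(t_eff))  # ~255 K, about 33 K colder than the observed mean
                     # surface temperature; the gap is the greenhouse effect
```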

"In summary, you're not really answering the question of why a dataset which fills big holes is better than a dataset with actual data. Using the UAH v5.5 dataset I find at least the "hybrid" version of C&W appears to run hot for Antarctica."

It is 'likely' that determining a trend from a sampling of the surface temperature that excludes a large percentage of the surface in some regions will be inaccurate. And I am using the term 'likely' in a colloquial manner, not in the 'defined manner' it is used in the IPCC Reports.

Are you trying to find a way to say that human burning of fossil fuels is not a proven concern needing to be acted upon urgently to dramatically reduce such activity by the already fortunate?

Fine-tuning aspects of the investigation into better understanding a larger issue like "the acceptability of continuing the fundamentally unsustainable and clearly damaging pursuit of benefit from the burning of fossil fuels" is important. But many appear to seek the opportunity to claim 'uncertainty about that clearly certain matter' by finding a way to raise a question about minutiae, to create the impression of 'significant uncertainty about something there is no uncertainty about'.

The extraction and burning of fossil fuels cannot be continued for very much longer, and humanity has billions of years to look forward to on this amazing planet. And there are many damaging impacts of the activity, including the impacts of the accumulation of excess CO2 (in the atmosphere and the oceans). There is also major harm caused by the conflict between powerful people fighting to get more of the potential benefit for themselves. Burning fossil fuels is an incredibly damaging activity 'all things considered'.

An acceptable use of an unsustainable and damaging activity would be to address an ‘emergency’. I would accept that ‘emerging’ economies should be allowed to use the burning of fossil fuels to more rapidly transition their entire population into sustainable economic activity. However, this would have to be a brief transient phase. After all, activity relying on burning fossil fuels is ultimately a dead end. Those economic activities simply cannot have sustained growth. And since the objective is to ‘lift the least fortunate into a sustainable better way of living’ the only ones benefiting from the burning of fossil fuels should be those who are the least fortunate. The same goes for any other unsustainable and damaging activity like the use of harmful chemicals or using up (consuming) other non-renewable resources. Everyone already ‘more fortunate’ should be ‘getting by with sustainable virtually damage free ways of living’. That is the only viable future for humanity. Anything else would be unsustainable and unacceptable.

This ‘required development to sustainable activity model’ is challenged by the fact that sustainable activities will always be less profitable and less desired than the more damaging or less sustainable activities that ‘can be gotten away with because of popular support’. The ‘profit motive’ and ‘potential popularity’ clearly cannot be allowed to determine what is acceptable…because they clearly haven’t and won’t.

So I hope you are ‘not of the opinion’ that fine-tuning the data on this small aspect of the larger issue of global warming and climate change alters any of the facts of the larger issue of the unacceptability of burning fossil fuels, or reduces the urgency to develop the most fortunate beyond the ‘popular and profitable in the moment’ unsustainable and damaging activity they have ‘grown fond of getting away with benefiting from’.

The increased understanding among the global population of the unacceptable and significant impacts of excess CO2 is just one of the ways to help raise awareness of the fundamentally unsustainable and damaging ways that many among the most fortunate 'strive to get away with for as long as they can get away with'. Discussing and debating details needs to be clearly understood not to reduce the urgency of 'changing the minds, attitudes and actions' of the population so that humanity actually develops a sustainable better future for all life on this amazing planet.

I can't check UAH TLT v5.6, but I can confirm that 70S UAH TLT v5.5, with or without the land mask, is nowhere near that kind of anomaly. Against a baseline of 1981 to 2010, the KNMI Climate Explorer export shows 2013 to have an anomaly of +0.6, or half of Spencer's number.

There is a huge amount of noise in the monthly records, however, which means one year is not a useful metric. The standard deviation of the monthly anomaly data over the last three years is a rather large +/-1.3. Note that in 2013 an all-time global cold temperature record was also set in Antarctica, but how meaningful is that?

The longer-term picture for Antarctica is no warming. The TLT record (v5.5) shows 0.02C/decade since the start of the record. Since the standard export of UAH data to text has a column called "SoPol", I can compare the trends between v5.5 and v5.6 for that zone: v5.5 gives 0.00C/decade over the last 35 years and v5.6 gives -0.01C/decade, essentially the same thing.

The data are extremely noisy, but keep in mind that the above trends are certainly long enough to be statistically significant even so, and both versions of UAH TLT show zero warming south of 60S over the last 35 years.

"Are you trying to find a way to say that human burning of fossil fuels is not a proven concern...."

You're deviating off topic. I'll reiterate an earlier point: people don't plan mines on widely spaced drillholes on the grounds that they don't need more drillholes because they have used a sophisticated geostatistical technique called "kriging". Neither should public policy be formed from data with large holes filled by the same technique, on the grounds that we now "know" how fast it is really warming.

This is a technical issue. It's not like the social issues that divide us. So yes the technical details are important and you're getting way off topic discussing the profit motive, at least on this post, probably on this blog.

Actually, they use kriging methods to determine how closely spaced the drillholes should be. These days, you would of course test for spatial dependence rather than assume it, especially with ore grades.

One Planet: You state that it is acceptable for emerging economies to use fossil fuels and then transition to renewables. However, once the costly infrastructure to burn fossil fuels is built, there is a very long period to pay it back, usually a minimum of 20 years. We already see with China that the installed capacity of coal-fired power stations still increases year on year. The turning point has not been reached, and China is now a major polluter.


Moderator Response:

[PS] Okay, this is starting to wander off-topic. Perhaps continue the discussion here.

You created the 'opening for my response' with your 'concluding statement about your question regarding filling gaps'.

How you might have deemed such a question to be worth repeating needed to be questioned.

Though this is indeed a site for discussing the details of the science, there seem to be times when the motivation behind persistent questions that have plausibly already been answered needs to be considered.

More 'on topic': are you looking exclusively at the trend of values in the 'gaps' in the HadCRUT4 coverage of the planet, for all of the available ways of looking at trends in those 'gaps'?

If all you are doing is looking at data for a complete region, not just the clear gaps in the HadCRUT4 evaluation (and choosing one data set as the basis for making a point), any 'conclusion' would be unproven. That would be similar to reviewing the global average surface temperature trend since 1998 without considering the 'noise' of things like ENSO and volcanic events (which is exactly the point of this article). Only after another ENSO event as powerful as the 1997/98 one, with as little volcanic dimming as occurred in that period, would it be possible to come to a reasonable 'conclusion' about a 'warming trend rate'. Until then, any claims based just on the values are unfounded.

The point of my larger comment was a reasoned explanation of the persistence of unfounded questions being raised, even after they have been answered. Some people do not want this issue to be better understood. Some people just want to 'raise doubts' any way they can get away with for as long as they can get away with.

Here's a suggestion: stop pontificating about people's motives. That will not and should not win a science debate. What counts is the hypotheses they put forward and the facts they use to support those hypotheses and your countering hypothesis and facts.

If you think the question is unfounded, then address it with facts, not discussion of or innuendo about the morality of people who don't agree with you.

Klapper... Perhaps you actually missed that One Planet did address you with facts and support. You're merely finding a way to wave off One Planet's comments without addressing them.

Clearly, one very important element of "facts" is the capacity to see them when they're presented. A consistent theme with you seems to be a willingness to ignore some facts and elevate other facts beyond their relative importance.

"Reality" is the balance of all the data and facts, not just the one's you prefer. This is a theme eloquently presented by Tamino in a recent post.