Last month, we summarized evidence for the long-term stability of Greenland’s ice cap, even in the face of dramatically warmed summer temperatures. We drew particular attention to the heat in northwest Greenland at the beginning of the previous (as opposed to the current) interglacial. A detailed ice core shows around 6,000 years of summer temperatures averaging 6–8°C (11–14°F) warmer than the 20th-century average, beginning around 118,000 years ago. Despite six millennia of temperatures likely warmer than anything we could produce for a mere 500 years, Greenland lost only about 30% of its ice. That translates to only about five inches of sea level rise per century from meltwater.

We also cited evidence that after the beginning of the current interglacial (nominally 10,800 years ago), it was also several degrees warmer than the 20th century, though not as warm as at the beginning of the previous interglacial.

Not so fast. Work just published online in the Proceedings of the National Academy of Sciences by Jamie McFarlin (Northwestern University) and several coauthors now shows that July temperatures averaged 4–7°C (7–13°F) warmer than the 1952–2014 average over northwestern Greenland from 8,000 to 10,000 years ago. She also had some less precise data for maximum temperatures in the last interglacial, and they are in agreement with (maybe even a tad warmer than) what was found in the ice core data mentioned in the first paragraph.

Award McFarlin some serious hard-duty points. Her paleoclimate indicator was the assemblage of midges buried in the annual sediments under Wax Lips Lake (we don’t make this stuff up), a small freshwater body in northwest Greenland between the ice cap and Thule Air Base, on the shore of the channel between Greenland and Ellesmere Island. Midges are horrifically irritating, tiny biting flies that infest most high-latitude summer locations. They’re also known as no-see-ums, and they are just as nasty now as they were thousands of years ago.

Getting the core samples from Wax Lips Lake means being out there during the height of midge season.

She acknowledges the seeming paradox of the ice core data: how could it have been so warm even as Greenland retained so much of its ice? Her (reasonable) hypothesis is that it must have snowed more over the ice cap—recently demonstrated to be occurring for the last 200 years in Antarctica as the surrounding ocean warmed a tad.

The major moisture source for snow in northwesternmost Greenland is the Arctic Ocean and the broad passage between Greenland and Ellesmere. The only way it could have snowed enough to compensate for the two massive warmings that have now been detected is for the water to have been warmer, increasing the amount of moisture in the air. As we noted in our last Greenland piece, the Arctic Ocean was periodically ice-free for millennia after the ice age.

McFarlin’s results are further consistent, at least in spirit, with other research showing northern Eurasia to have been much warmer than previously thought at the beginning of the current interglacial.

Global warming apocalypse scenarios are driven largely by the rapid loss of massive amounts of Greenland ice, but the evidence keeps coming in that, in toto, it’s remarkably immune to extreme changes in temperature, and that an ice-free Arctic Ocean has been common in both the current and the last interglacial period.

In a 2012 dissent from a U.S. Court of Appeals for the D.C. Circuit opinion, Supreme Court nominee Brett Kavanaugh acknowledged that “dealing with global warming is urgent and important,” but maintained that any sweeping regulatory program would require an act of Congress:

But as in so many cases, the question here is: Who Decides? The short answer is that Congress (with the President) sets the policy through statutes, agencies implement that policy within statutory limits, and courts in justiciable cases ensure that agencies stay within the statutory limits set by Congress.

When an agency claims to discover in a long-extant statute an unheralded power to regulate “a significant portion of the American economy” we [the Court] typically greet its announcement with a measure of skepticism. We expect Congress to speak clearly if it wishes to assign to an agency decisions of vast “economic and political significance.”

Justice Scalia held this view so strongly that, in his last public judicial act, he wrote the order (passed 5-4) staying the Obama Administration’s sweeping “Clean Power Plan.” Such stays are granted when it appears the Court is likely to rule the same way in a related case.

This all traces back to the landmark 2007 ruling, 5-4, in Massachusetts v. EPA, that the EPA was indeed empowered by the 1990 Clean Air Act Amendments to regulate emissions of carbon dioxide if the agency found that they endangered human health and welfare (a finding it duly made in 2009). Justice Kennedy, Kavanaugh’s predecessor, voted with the majority.

Will Kavanaugh have a chance to reverse that vote? That depends on what the new Acting Administrator of the EPA plans to do about carbon dioxide emissions. If the agency simply stops any regulation of carbon dioxide, there will surely be some type of petition to compel the agency to continue regulation because of the 2009 endangerment finding. Alternatively, those already opposed to it might petition based upon the notion that the science has changed markedly since 2009, with increasing evidence that the computer models that were the sole basis for the finding have demonstrably overestimated warming in the current era. It’s also possible that Congress could compel EPA to reconsider its finding, and that a watered-down version might find itself at the center of a court-adjudicated policy fight.

Whatever happens, though, it is clear that Brett Kavanaugh prefers congressional statutes to agency fiat. Assuming he is confirmed, he will surely bring his preferences to bear on the Court: global warming may be “urgent and important,” but it is the job of Congress to write the regulatory statutes.

Hot off the press in yesterday’s Journal of Climate, Nic Lewis and Judith Curry have recalculated the equilibrium climate sensitivity (ECS) based upon the historical uptake of heat into the ocean and human emissions of greenhouse gases and aerosols. ECS is the net warming one expects for doubled atmospheric carbon dioxide. Their ECS ranges from 1.50 to 1.56 degrees Celsius.
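The general energy-budget approach behind estimates like this can be sketched in a few lines. The formula divides the forcing from doubled CO2, scaled by observed warming, by the forcing change net of ocean heat uptake. The input values below are round, illustrative numbers of my choosing, not the figures Lewis and Curry actually used:

```python
# Illustrative energy-budget estimate of equilibrium climate sensitivity:
#   ECS = F_2x * dT / (dF - dQ)
# All inputs below are rough, assumed values for illustration only.

F_2x = 3.7   # W/m^2, canonical forcing from a doubling of CO2
dT = 0.8     # deg C, observed surface warming, base period to recent period (assumed)
dF = 2.5     # W/m^2, change in total anthropogenic forcing (assumed)
dQ = 0.5     # W/m^2, change in ocean heat uptake (assumed)

ecs = F_2x * dT / (dF - dQ)
print(f"ECS ~ {ecs:.2f} deg C")  # with these inputs, about 1.48
```

With plausible round numbers, the method lands in the same neighborhood as the published result, which is the point: the estimate is driven by observed warming and heat uptake, not by model behavior.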

Nic has kindly made the manuscript available here, so you don’t have to shell out $35 to the American Meteorological Society for a one-day view.

The paper is a follow-on to their 2015 publication that had a median ECS of 1.65°C. It was criticized for not using the latest-greatest “infilled” temperature history (in which less-than-global coverage becomes global using the same data) in order to derive the sensitivity. According to Lewis, writing yesterday on Curry’s blog, the new paper “addresses a range of concerns that have been raised about climate sensitivity estimates” like those in their 2015 paper.

The average ECS from the UN’s Intergovernmental Panel on Climate Change (IPCC) is 3.4°C, roughly twice the Lewis and Curry values. It somehow doesn’t seem surprising that the observed rate of warming is now running at about half of the rate in the UN’s models, does it?

Lewis and Curry’s paper appeared seven days after Andrew Dessler and colleagues showed that the mid-atmospheric temperature in the tropics is the best indicator of the earth’s energy balance. This means that any differences between observed and forecast mid-atmospheric temperatures there can be used to adjust the ECS.

Late last year, the University of Alabama’s John Christy and Richard McNider showed that the observed rate of warming in the tropical mid-atmosphere has been around 0.13°C/decade since 1979, while the model-average forecast is 0.30°C/decade. This adjusts the IPCC’s average ECS down to the range of 1.5°C (actually about 1.46°C).
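The adjustment is simple proportionality: scale the model-average ECS by the ratio of observed to forecast warming rates. A sketch, using the figures quoted above (the rounded inputs give about 1.47, close to the 1.46 cited):

```python
# Scale the IPCC model-average ECS by the ratio of observed to modeled
# tropical mid-atmospheric warming rates (figures quoted in the text).
ipcc_ecs = 3.4   # deg C, IPCC model-average ECS
observed = 0.13  # deg C/decade, observed trend (Christy & McNider)
modeled = 0.30   # deg C/decade, model-average forecast trend

adjusted_ecs = ipcc_ecs * observed / modeled
print(f"adjusted ECS ~ {adjusted_ecs:.2f} deg C")  # about 1.47 with these rounded inputs
```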

That’s three estimates of ECS all in the same range, and all approximately half of the UN’s average.

It seems the long-range temperature forecast most consistent with these findings would be about half of what the IPCC is forecasting. That would put total human-caused warming by 2100 right around the top-end goal of the Paris Accord, 2.0°C.

Stay tuned on this one, because that might be in the net benefit zone.

Last August, the Federal Register announced a period of public commentary on information germane to a new set of Corporate Average Fuel Economy (CAFE) standards for the 2022–2025 period. The extant standard, roughly 50 miles per gallon (MPG) for passenger cars and other light vehicles, was put in place in January 2017, at the very end of the Obama Administration.

It is not surprising that EPA Administrator Scott Pruitt announced that the Obama standards will not stand; we hope our extensive public comments, submitted on September 27, exerted some influence on this decision.

We noted:

There is a paradigm shift occurring in global warming that is highly relevant… It began with the revelation of remarkable and increasing discrepancies between the climate models…in the most recent report of the U.N.’s Intergovernmental Panel on Climate Change (IPCC), and observations in the bulk atmosphere over vast swaths of the planet.

Figure 1 (below) shows this discrepancy.

Figure 1. Average of the IPCC computer model projections for the tropical mid-troposphere versus three standard sets of observations: weather balloons, temperatures sensed from satellites, and the “reanalysis” data used to initialize the daily weather map. The growing discrepancy is obvious, even with the 2016 El Niño warm spike at the end.

Figure 2 (above) shows the problem in the vertical dimension in the tropics, where as much as seven times the observed warming has been predicted at high altitudes since the satellites became operational in 1979.

Figure 2 has enormous consequences for weather. It is the vertical distribution of temperature that determines how much moisture is wafted skyward in the fetid tropics. Much of the extratropical temperate zone depends upon this juice.

It is noteworthy that models in general predict the greatest amounts of future warming, while observationally based studies, often of interglacial-glacial transitions or of differences between geological eras, tend to come up with less warming. Given data or a model, most folks will pick the former.

We detail our reasons for re-examining the 2022–25 CAFE standards here. Have a look, and we think you’ll agree that Administrator Pruitt has pretty sound science behind the need to revisit standards that were generated absent some of this very important data.

Scientific American, which is quite reliably alarmed by the prospects of climate change, showed signs of moderation this week in an article highlighting the work of the ecomodernists. The ecomodernists acknowledge that man-made climate change is occurring, but believe humans are already decoupling (and will continue to decouple) their well-being from environmental destruction; that is, with every passing day, human flourishing requires less pollution and fewer resources. Though not libertarians, they are spot-on in regarding climate change as a minor overlay in a world increasingly insulated from the vagaries of nature by market forces. The question posed in the piece’s title, “Should We Chill Out about Global Warming?”, gets an unqualified YES! from those of us at the Center for the Study of Science.

One of their ecomodernist peers, the journalist Will Boisvert, recently asked in a piece of his own, “How bad will climate change be?” His voluminous response is worth a read, and he summarizes it quickly: “Not very.” He went on to note what many of us have been saying for years: as long as there has been capital for innovation and civil order, we have been adapting to climate change, and we will continue to do so. Boisvert neatly skewers horseman after horseman of the apocalypse (drought, hunger, heat) and notes our increasingly clean and efficient energy technology.

Public comments on the draft fourth “National Assessment” of present and future climate change impacts on the U.S. are due at 11:59 PM tonight and will be embargoed from public release until after then. As soon as it is made public, we’ll link to our comments. Until then, just think about the previous three Assessments.

Reviewing the first one in 2000, Chip Knappenberger and I discovered that the science team just happened to choose the two most extreme models (for temperature and precipitation) of the 14 they considered. Then we discovered that those models were worse than bad: applied to a really simple temperature record, they performed worse than a table of random numbers. Really, it was the same as taking a multiple-choice test with four possible answers and somehow managing to get less than 25% right. That’s the highly sought-after “negative knowledge,” something you might think impossible!

The second one (2009) was so bad that we covered it with a 211-page palimpsest, a document that looked exactly like the federal original in both design and content. Except that ours contained all the missing science, along with corrections of as many half-truths and incomplete statements as we could find. Like we said, that took 211 pages of beautifully typeset and illustrated prose.

The National Oceanic and Atmospheric Administration was instrumental in producing the third (2014) Assessment, and in their press release at its debut, gushed that “it is a key deliverable in President Obama’s Climate Action Plan.” That has been recently undelivered.

So what did we say in our review of the upcoming fourth one? Well, you’ll have to wait until tomorrow.

UPDATE: comments by Ryan Maue and myself are now available on the Cato website.

Yesterday Jim Hansen, now with Columbia University, and several of his colleagues released their summary of 2017 global temperatures. Their history, published by the NASA Goddard Institute for Space Studies, has been constantly evolving in ways that make the early years colder and the later years warmer. I recently posted on how this can happen, and on the differences between these modified datasets and those determined objectively (i.e., without human meddling).

For a couple of years I have been pointing out (along with Judith Curry and others) that the latest fad, which puts a lot of warming in the recent data, is to extend high-latitude land weather station data far out over the Arctic Ocean. Hansen’s crew takes stations north of 64° latitude and extends them an astounding 1,200 kilometers into the ocean.

This, plainly speaking, violates one of the most fundamental principles of thermodynamics: while matter is changing its state (from, say, solid to liquid), a stirred fluid will remain at the freezing point until it is all liquid, whereupon warming will commence.

This also applies in the Arctic, where the fluid is often stirred by strong winds. So if, say, Resolute, one of the northernmost land stations, reads 50°F, and the Arctic Ocean is a water-ice mixture (it always is), that 50 degrees will be extended out 1,200 kilometers to where the air-sea boundary temperature has to be around 30°F, the approximate freezing point of seawater up there.
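The physical constraint described above can be made concrete in a few lines: however far a land station’s reading is extrapolated over mixed ice-and-water ocean, the air-sea boundary there cannot rise much above the freezing point of seawater. This is a hypothetical sketch of the constraint, not GISS’s actual interpolation procedure, and the 30°F cap is the rough figure from the text:

```python
# Sketch: cap an extrapolated land-station temperature at roughly the
# freezing point of seawater when the target ocean cell is a mixed
# ice-and-water surface. Illustrative only; not the GISTEMP algorithm.

SEAWATER_FREEZING_F = 30.0  # approximate air-sea boundary limit, per the text

def extrapolated_temp(station_temp_f: float, over_mixed_ice_water: bool) -> float:
    """Return a physically capped estimate for an extrapolated ocean cell."""
    if over_mixed_ice_water:
        # A stirred ice-water mixture holds the boundary near freezing.
        return min(station_temp_f, SEAWATER_FREEZING_F)
    return station_temp_f

# A 50 F reading at Resolute, carried 1,200 km over the mixed-phase ocean:
print(extrapolated_temp(50.0, True))   # capped at 30.0
print(extrapolated_temp(50.0, False))  # 50.0, no cap over land
```

Any scheme that simply carries the 50°F reading outward uncapped will overstate the ocean-cell temperature by the difference, which is the complaint being made here.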

Hansen et al. did pay some attention to this, noting that the extension, which they normally apply to their data, was responsible for making 2017 the second-warmest year in their record. If they extended “only” 250 km (still dicey), their “global” temperature would drop by a tenth of a degree, sending the year down a rank. The result of all of this is that the big “spike” at the end of their record is in no small part due to a 1,200-km extension that turns thermodynamics on its head.

There’s another interesting pronouncement in the NASA announcement: many people have noted that the sun has been a bit cool in recent years, and that its output continues to trend slightly downward. The changes in its radiance are probably good for a tenth of a degree Celsius of surface temperature or so. Hansen et al. use this to provide covering fire should warming stall out yet again:

Therefore, because of the combination of the strong 2016 El Niño and the phase of the solar cycle, it is plausible, if not likely, that the next 10 years of global temperature change will leave an impression of a ‘global warming hiatus’.

The significance of this will all fall out in the next year or so. If temperatures head back down all the way to their pre-El Niño levels, that will ultimately bring back the post-1996 “pause.” We’re going to guess they will instead remain a couple of tenths of a degree above that, based on what happened after the big one in 1998, when they settled a small amount above the pre-El Niño levels of the earlier 1990s.

If the recent warming rate (adjusting for El Niño) continues, we’ll hear that it is doing so “despite” the sun. Given that one year (2018) can have little influence on a recent trendline, that copy may already have been written!

All of this raises a question. Hansen notes in his release that the warming rate since 1970 has been fairly constant, about 0.17°C per decade, but he does not note that the average of the UN’s climate models says it should now be about twice that. More lukewarming.