18 December 2014

As I've occasionally mentioned on Twitter, we here at the University of Colorado's Center for Science and Technology Policy Research are proposing to turn our graduate certificate program into a full-fledged professional master's program.

16 December 2014

Thanks much to David Appell in the comments, who motivated me to take a second look at this analysis. I had been using a subset of total gasoline consumption in an earlier analysis, and he correctly points out that a more comprehensive measure is better. So this post has subsequently been revised.

Today the US EIA announced that household gasoline consumption in 2015 is projected to be the lowest in 11 years. That motivated me to update a graph I did a few years ago on the gasoline intensity of the US economy, defined as the total expenditures at the pump by US consumers as a proportion of overall GDP. (Note: I use annual values here and ignore higher frequency variations.)

The graph above covers the period 1976-2015 and shows gasoline expenditures as a percentage of overall GDP. The data come from the US EIA (gasoline product supplied and prices) and the White House (GDP). Data for 2015 are obviously projections.

The data shows that as recently as 2011 (and really, much of the past decade) spending on gasoline, as a proportion of GDP, was similar to what it was in the mid-1980s. In 2015 that proportion is expected to be 40% of that in 1980 and more than a third less than what it was in 2010.

The bottom line here is that the gasoline intensity of the US economy is lower, by a long shot, than at any time in recent history. That is good news.

This week I'll be focusing on issues of science advice to governments, as I attend the "Science Advice to Governments" conference in Auckland, New Zealand. It is being characterized as a science advice "summit." I'll be participating on a panel focused on "Science advice in the context of opposing political / ideological positions" along with the chief scientific advisor to the Australian government, Ian Chubb, and the chief scientific advisor to Defra, Ian Boyd, among others.

The conference is being convened by ICSU and hosted by Sir Peter Gluckman, chief science advisor to the New Zealand government. Not long ago James Wilsdon previewed the conference:

The summit will take place in a year when we've seen important debates in scientific advisory systems worldwide. In the UK, Sir Mark Walport is about to mark his first year as Government Chief Scientific Adviser, during which he has had to tread a careful path through controversies over bees, badgers, fracking and flooding. In Brussels, Anne Glover, Chief Scientific Adviser to the President of the European Commission, has been working tirelessly to persuade more EU member states to appoint national scientific advisers, with a view to establishing an EU-wide network. In Japan, three years on from the Great East Japan earthquake and nuclear meltdown at Fukushima, arguments continue about how to reform structures for scientific advice and risk management. And at the United Nations, a new Scientific Advisory Board, hosted by UNESCO, held its inaugural meeting at the end of January 2014.

All of this suggests that the Auckland meeting couldn't be happening at a better time. Sir Mark Walport, Anne Glover, and their equivalents from India, Malaysia, Japan, Germany, Australia and the Philippines, will be among those attending. Scientific advisers, policymakers, academics – and anyone with an interest in these debates – are invited to register, or to follow developments online.

24 August 2014

With news of a 6.0 magnitude earthquake today in San Francisco, I thought I'd provide some perspective on historical damage. The data in the table below are estimates of normalized damage for the top 15 events in our dataset -- from Vranes and Pielke 2009 (PDF), which I have quickly updated to 2014 values. A normalization seeks to estimate how much damage would occur if a past event occurred with today's level of wealth and development.

There are a lot of uncertainties in earthquake normalization methods, and those interested in digging deeper should have a look at our paper for the gory details. The top event is the 1906 San Francisco earthquake, which reminds us that while big earthquakes are rare, they can do lots of damage. For perspective, a repeat of the 1906 San Francisco earthquake could cause more than twice the damage caused by all US tornadoes since 1950.

20 August 2014

There are lots of interesting new studies emerging on trends in disasters and extreme events, and their possible relation to changes in climate, human-caused or otherwise. A new paper in the Hydrological Sciences Journal (Stevens et al. here in PDF) finds no evidence for an increase in UK flooding, once the data is normalized for exposure.

The authors conclude:

Consequences are the combined results of high river flows, pluvial flooding and coastal flooding, the numbers of people and property exposed to flooding and the effects of flood defence construction and floodplain management policies. The increase in the total number of reported flood events in the 20th century in the UK appears to be a function of the gradual increase in exposure due to urban expansion and population growth. However there is also greater capacity to report flood events. The number of reported ‘Class 3’ flooding events has remained static or decreased slightly over the 20th Century. This is despite the UK population almost doubling and the number of dwelling houses tripling over the same time period.

There is no clear underlying trend in flood reports present in the UK flood data when it is normalised for exposure. Pielke Jr. and Landsea (1998) studied damage caused by hurricanes in the USA. They also found that normalising damage reports to take account of exposure removed the upward trend of losses over time and only left a large decade to decade variation in losses. The lack of a systematic trend in the normalised UK total flood count mirrors these findings. It is also in agreement with studies of trends in river flows (Robson 2002).

As frequent readers here will appreciate, the best way to evaluate the fidelity of any normalization approach is to compare trends in the normalization with trends in the geophysical events. That checks out in this study.

16 August 2014

The Belgian think tank Bruegel points to data showing that the United Kingdom's GDP has returned to pre-economic crisis levels, as shown above. This allows us to do a quick and intuitive examination of how much the UK economy has decarbonized over that time period, and how that rate of decarbonization compares to that implied by the UK Climate Change Act.

As a refresher, decarbonization refers to the rate of decline in the ratio of carbon dioxide emissions to GDP. In order for the UK to hit the targets prescribed in the UK Climate Change Act for 2022, it will need to consistently achieve an annual rate of decarbonization of more than 3%, for any GDP growth rate greater than 1% per year. For more detail, and a full exploration of the quantitative implications of the UK Climate Change Act for decarbonization of the British economy, see my 2009 paper in ERL (open access).

With UK GDP in 2014 at the same level as in 2008, we can calculate a simple rate of decarbonization: it is exactly equal to the annual rate of emissions decline.

The trailing 12-month carbon dioxide emissions for the UK, ending the second quarter of 2008, were 536.1 million metric tonnes (data here in XLS). The trailing 12-month emissions ending the first quarter of 2014 were 507.9 million metric tonnes (data here in XLS).

These data imply a decarbonization rate of about 0.9% per year (with GDP flat, this equals the rate of emissions decline). This is far less than would be needed to hit the targets of the UK Climate Change Act. Last year I calculated an update of the UK decarbonization rate through 2012, which arrived at a similar result. That calculation is shown below.
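The arithmetic is straightforward; a minimal sketch, using the emissions figures above and an assumed six-year span between the two reporting windows:

```python
# Sketch: UK decarbonization rate, 2008-2014. With GDP flat over the
# period, the decarbonization rate equals the rate of emissions decline.
co2_2008 = 536.1  # Mt CO2, trailing 12 months ending Q2 2008
co2_2014 = 507.9  # Mt CO2, trailing 12 months ending Q1 2014
years = 6         # approximate span between the two windows (assumption)

annual_rate = (co2_2014 / co2_2008) ** (1 / years) - 1
print(f"{annual_rate:.1%}")  # -0.9%
```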

It is also possible to express the magnitude of the challenge of meeting the targets of the UK Climate Change Act in more intuitive terms. The graph below shows how much carbon-free energy (not electricity) would need to be deployed by 2022, assuming constant demand.

In my 2009 paper, which was written upon passage of the UK Climate Change Act in 2008, I concluded:

The approach to emissions reduction embodied by the Climate Change Act is exactly backwards. It begins with setting a target and then only later do policy makers ask how that target might be achieved, with no consideration for whether the target implies realistic or feasible rates of decarbonization. The uncomfortable reality is that no one knows how fast a major economy can decarbonize. Both the 2022 interim and 2050 targets require rates of decarbonization far in excess of what has been observed in large economies at anytime in the past. Simply making progress to the targets requires steps of a magnitude that seem practically impossible, e.g., such as the need for the UK to achieve a carbon efficiency of its economy equal to that of France in 2006 in a time period considerably less than a decade.

Further, the focus on emissions rather than on decarbonization means that it would be very easy for policy makers to confuse emissions reductions resulting from an economic downturn with some sort of policy success (cf, McGee 2009). However, as implicit in the Kaya identity, a lower GDP does very little to change the role of energy technology in the economy. So during a downturn emissions may level off or even decrease as policy makers of course seek to preserve (and even accelerate) economic growth. Consequently, a more directly useful metric for policy success for efforts to stabilize carbon dioxide concentrations in the atmosphere is the decarbonization of the economy, represented in terms of carbon dioxide emissions per unit GDP.

A focus on decarbonization as the central goal of carbon policy rather than emissions reductions means that to achieve specific stabilization targets the rate of decarbonization of the UK economy must not only exceed the rate of economic growth, but it must exceed rates of decarbonization observed historically in the UK and in other developed countries. Because no one knows how fast a large economy can decarbonize, any policy (or policies) focused on decarbonization will have to proceed incrementally, with constant adjustment based on the proven ability to accelerate decarbonization (cf Anderson et al 2008). Setting targets and timetables for emissions reductions absent knowledge of the ability to decarbonize is thus just political fiction. . .

The failure of the UK Climate Change Act is yet to be broadly recognized, but when it is, it will provide an opportunity to recast carbon policies in a more effective manner.

How do we know if the normalization is any good? We compare it to trends in hurricane frequency and intensity, and when we do so we find no evidence for a residual bias in our methods. Specifically, US hurricanes have not become more frequent or intense, so there is simply no basis to expect an increase in normalized losses. Of course, this analysis has been replicated several times as well, using different methods and loss data.

Here is that data on trends in US hurricane landfall frequency and intensity:

29 July 2014

The White House released a report today on "The Costs of Delaying Action to Stem Climate Change" (here in PDF). The report concludes (based on a summary by William Nordhaus in his book, The Climate Casino):

Based on a leading aggregate damage estimate in the climate economics literature, a delay that results in warming of 3° Celsius above preindustrial levels, instead of 2°, could increase economic damages by approximately 0.9 percent of global output.

The report seeks to place 0.9% of output into context by presenting it in terms of the US economy in 2014:

To put this percentage in perspective, 0.9 percent of estimated 2014 U.S. Gross Domestic Product (GDP) is approximately $150 billion.

The New York Times mistakenly assumes from this that the future impacts of climate change will be $150 billion:

Failing to adequately reduce the carbon pollution that contributes to climate change could cost the United States economy $150 billion a year, according to an analysis by the White House Council of Economic Advisers released on Tuesday.

To be fair, the $150 billion figure was repeated as the cost of climate change by many media outlets. Does anyone read the report?

The actual impacts would be much larger, since the US (and global) economy will be larger in the future. But neither the White House nor the media mention this. Here is why.

Let's assume that the US economy grows at 2% per year to 2100 (both 2% and 2100 are round numbers, feel free to pick others if you like them better). That means that the US GDP will be $82.4 trillion in 2100. From that perspective, the cost of climate change will be an astounding $741 billion!

But that is not right either, as the cost reported by the White House was the marginal cost of going from a global temperature increase of 2 degrees Celsius to 3 degrees. According to Nordhaus (Figure 22 in the Climate Casino), the total damage cost of a 3 degree C increase is more like 3.2% of GDP. That equates to a cost of climate change of $2,635 billion! Now we are talking.

So why isn't that huge number presented?

Well, informing people that US GDP will increase from $15 trillion today to only $79.7 trillion in 2100, rather than $82.4 trillion, due to the effects of a 3 degree C climate change doesn't sound so scary. This is why William Nordhaus wrote of these estimates in The Climate Casino:
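The compounding behind these numbers is easy to check; a minimal sketch, using the round figures above ($15 trillion GDP today, 2% annual growth to 2100):

```python
# Sketch of the arithmetic: US GDP compounded at 2% from 2014 to 2100,
# then climate damages as a share of that larger economy.
gdp_2014 = 15.0  # trillion dollars (round number from the post)
growth = 0.02    # assumed annual growth rate
gdp_2100 = gdp_2014 * (1 + growth) ** (2100 - 2014)

marginal_cost = 0.009 * gdp_2100  # 0.9% of output (delay from 2C to 3C)
total_cost = 0.032 * gdp_2100     # 3.2% of output (Nordhaus, 3C total)

print(f"GDP in 2100:   ${gdp_2100:.1f} trillion")        # ~$82.4 trillion
print(f"marginal cost: ${marginal_cost * 1000:.0f}B")    # ~$741 billion
print(f"total cost:    ${total_cost * 1000:.0f}B")       # ~$2,635 billion
print(f"GDP less cost: ${gdp_2100 - total_cost:.1f}T")   # ~$79.7 trillion
```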

The first surprise is that, for the range of changes that have been calculated, the estimated impacts of climate change are relatively small.

Postscript: For any troublemakers looking to misrepresent my views, I presented a more in-depth analysis along these lines before the House Science Committee in 2007 (here in PDF). In that testimony I concluded:

Mitigation provides benefits under all scenarios discussed here, and almost all scenarios presented by the IPCC. According to the IPCC these benefits increase as the time horizon extends further into the future... nothing in this testimony should be interpreted as being opposed to or contrary to the mitigation of greenhouse gases. To the contrary, under all scenarios discussed here the benefits of mitigation exceed its costs. Mitigation is good policy, and many decision makers are now coming to understand that it is good politics, as well.

28 July 2014

If you are curious about my views on no longer writing for FiveThirtyEight and my experiences in the climate debate, Keith Kloor interviews me here. If there is something that you wish Keith had asked but didn't, feel free to use the comments here.

25 July 2014

So you don't like the advice your expert advisor is giving? Then how about sacking the advisor? Better yet, how about getting rid of the entire advisory mechanism?

That is the advice that Greenpeace and other NGOs are giving to European Commission president-elect Jean-Claude Juncker. If it sounds a bit like something out of the Richard Nixon playbook, well, it is.

I defend the EU science advisor and the broader structure over at the Guardian. Read it here.

For a deeper dive into science advice in government, see this volume by James Wilsdon and Rob Doubleday. My chapter in that volume on the role of the science advisor can also be found here in PDF.

22 July 2014

A new paper by Visser et al. appeared in Climatic Change this week, looking at disasters and climate change (open access here). Like other studies and the IPCC assessment, Visser et al. find no trends in normalized disaster losses, across several metrics of economic and human losses.

They conclude:

The absence of trends in normalized disaster burden indicators appears to be largely consistent with the absence of trends in extreme weather events. This conclusion is more qualitative for the number of people killed. As a consequence, vulnerability is also largely stable over the period of analysis.

The top line conclusion here is not surprising, though it is interesting because it uses independent methods on largely independent data. It is consistent with previous data and analyses (e.g., Bouwer 2011, Neumayer and Bartel 2011, Mohleji and Pielke 2014) as well as with the conclusions of the recent IPCC assessments (SREX and AR5).

What is perhaps most interesting about this new paper is their discussion of vulnerability. Some have argued that our methodological inability to fully account for possible changes in vulnerability to losses over time may mask a climate change signal in the data. (It's gotta be there somewhere!) This line of argument has always been suspect, because there are no relevant trends in phenomena such as floods and hurricanes that would lead to an expectation of increasing normalized losses.

Visser et al. take this issue on and offer several explanations as to why vulnerability does not mask any hidden signals:

Firstly, global disaster management initiatives have only recently been put in place. The Hyogo Framework for Action (HFA) was adopted by 168 Member States of the United Nations in 2005 to take action to reduce vulnerabilities and risks to disasters (UNISDR, 2011). Although these highly important efforts will certainly pay off in the near future, it is unclear whether they are reflected in the sample period chosen for this study. Similar conclusions are drawn in IPCC (2014). . .

Secondly, it is unclear to what extent adaptation measures work in practice. Heffernan (2012) argues that many countries, and even the richest, are ill-prepared for weather extremes. As an example, he names Hurricane Sandy, which wreaked a loss of 50 billion USD along the northeast coast of the US in 2012. As for early warning systems, Heffernan states that not all systems are functioning well. For example, in 2000, Mozambique was hit by a flood worse than any in its history, and the event was not at all anticipated. Warnings of above-average rainfall came too late and failed to convey the magnitude of the coming flood.

Thirdly, a positive trend in vulnerability may be offset by the increasing number of people moving from rural to urban environments, often situated in at-risk areas (UN 2012). Since many large cities lie along coastlines, these movements will make people more vulnerable to land-falling hurricanes (Pielke et al. 2008), coastal flooding and heatwaves (due to the urban heat island effect). With regard to economic losses, Hallegatte (2011) argues that these migration movements may have caused disaster losses to grow faster than wealth.

Fourthly, it is unclear how political tensions and violent conflicts have evolved over large regional scales since 1980. On the one hand, Theisen et al. (2013) show that the number of armed conflicts and the number of battle deaths have decreased slightly at the global scale since 1980. On the other hand, these methods are rather crude as far as covering all aspects of political tensions are concerned (Leaning and Guha-Sapir et al. 2013).

We conclude that quantitative information on time-varying vulnerability patterns is lacking. More qualitatively, we judge that a stable vulnerability V_t, as derived in this study, is not in contrast with estimates in the literature.

In short, those who claim that a signal of human-caused climate change is somehow hidden in the disaster loss record are engaging in a bit of unjustified wishful thinking. The data and evidence say otherwise.

The bottom line? Once again, we see further reinforcement for the conclusion that there is no detectable evidence of a role for human-caused climate change in increasing disaster losses. In plain English: Disaster losses have been increasing, but it is not due to climate change.

Several weeks ago, I had several phone conversations with Mr. Wines about the work of John Christy. In those conversations, I emphasized the value of skepticism in science and also said that I agreed with some elements of John's point of view, in particular, that projections are still highly uncertain, that climate models leave a great deal to be desired, and that some of the decisions that have to be made about how to deal with climate change are very tough indeed. Wines asked me to explain where I differ from John. I told him that we differ primarily in our assessment of the magnitude of climate tail risk. Wines asked me to explain what I meant by "tail risk", and I offered the metaphor of advising a small girl whether she should cross a busy street to catch her bus (a metaphor I have used before).

Unfortunately, the positioning of the quotation within the article makes it seem as though I am suggesting that John is the kind of person who would let the girl take the risk. I state here that I have absolutely no reason to question John's motives; indeed, he strikes me as the sort of person who would risk his own life to save a child who wandered into a busy street. My metaphor was intended only to illustrate the nature of tail risk.

16 July 2014

Munich Re has just released their tabulation of disaster losses for the first half of 2014. I thought I'd use the occasion to update the dataset shown above. The graph above shows global weather disasters as a proportion of global GDP. Note that 2014 represents January-June. I assume that the first half of 2014 global GDP is 2.5% higher than 2013. I also assume that total 2014 losses to date are all due to weather. Both assumptions err on the conservative side of things. Enjoy!

15 July 2014

The graph above is the most recent update of the normalized disaster loss database for Australia, sent to me by Ryan Crompton of Risk Frontiers at Macquarie University. Ryan also sends this explanatory text:

Crompton and McAneney (2008) normalised Australian weather-related insured losses over the period 1967-2006 to 2006 values. Their methodology adjusted for changes in dwelling numbers and values (excluding land value) and in a marked point of departure from previous normalisation studies, they applied an additional adjustment for tropical cyclone losses to account for improvements in construction standards mandated for new construction in tropical cyclone-prone parts of the country. These were introduced around the early 1980s following lessons learnt from the destruction of Darwin by Tropical Cyclone Tracy in 1974 (Mason et al. 2013).

Crompton and McAneney (2008) emphasise the success of improved building standards in reducing building vulnerability and thus tropical cyclone wind-induced losses. Figures 1a and b show the annual aggregate losses and the annual aggregate normalised losses (2011/12 values) for weather-related disasters. These figures are updated from Crompton and McAneney (2008) using a refined methodology described in Crompton (2011).

Note that the 2012 and 2013 values (shown in yellow in the bottom graph) have not been normalized back to 2011/12 values. They are shown as reported. Once normalized they will be a bit lower, so as presented they overestimate the 2012 and 2013 losses, but not by a large amount.

This is actually kind of wonderful, in a bang-your-head-on-the-table sort of way. Pielke isn’t claiming that it’s hard in practice to limit emissions without halting economic growth, he’s arguing that it’s logically impossible. So let’s talk about why this is stupid.

On Twitter @FabiusMaximus01 sums it up perfectly:

@RogerPielkeJr So people from Left & Right can find common ground! Unfortunately by disagreeing with arithmetic.
— Fabius Maximus (Ed.) (@FabiusMaximus01) July 10, 2014

07 July 2014

The graphs above come from this post at The Breakthrough Institute by Jon Fisher of The Nature Conservancy. The graphs show something profound in global agriculture: the world is producing more food per person on less land. While there are caveats and details that are important, overall these twin trends are good news. Do head over to @TheBTI and read Fisher's excellent discussion.

The graph above shows the carbon intensity of global energy consumption from 1965 to 2013. Specifically, it shows the amount of carbon emissions (in tons) for every "ton of oil equivalent" consumed in the global economy. Thus, the consumption data includes both carbon intensive sources of energy (coal, gas, oil) and also the less carbon intensive sources (hydro, wind, solar, nuclear, etc.).

The graph shows that global energy consumption decarbonized at a remarkably steady rate from 1965 to the late 1990s. Since then, global energy consumption has become slightly more carbon intensive. In 2013 the carbon intensity of global energy consumption was just about the same as it was in 1991. Since 1999, this metric of carbon intensity has increased by 1.5%. The graph indicates that in the 21st century, whatever gains are being made by low carbon energy technologies, they continue to be equaled or even outpaced by continuing gains in fossil fuels.

To place this analysis in perspective: Cutting global carbon dioxide emissions by 50% (just to pick a round number) while increasing global energy consumption by 50% (another round number) implies a carbon intensity of 0.25 tons carbon per ton of oil equivalent.
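For a rough sense of the arithmetic: cutting emissions in half while energy consumption grows by half means intensity must fall to one third of its current level. A minimal sketch, assuming a current global intensity of roughly 0.75 tons of carbon per tonne of oil equivalent (my approximation of the graph's recent level, not a figure stated in the post):

```python
# Sketch: implied carbon intensity target from the round numbers above.
# The 0.75 starting value is an assumption, not a figure from the post.
current_intensity = 0.75  # tons carbon per tonne of oil equivalent
emissions_factor = 0.5    # cut emissions by 50%
energy_factor = 1.5       # increase energy consumption by 50%

implied_intensity = current_intensity * emissions_factor / energy_factor
print(implied_intensity)  # 0.25 tons carbon per toe
```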

For those wanting to explore a little deeper into why this analysis matters for how we might think about climate policies, have a look at this paper in PDF.

The proportion of carbon-free energy consumption is a far more important metric of progress toward stabilizing carbon dioxide levels in the atmosphere than carbon dioxide emissions themselves. The reason is that emissions are a consequence of energy consumption, and the way that we influence emissions is through energy technologies and their use in the economy. So looking directly at energy consumption is a much more direct and relevant way to understand the technological challenge of emissions reductions. From a policy perspective, looking solely at emissions can easily deceive.

In 2013 the proportion of carbon-free energy consumption was just about 13%, extending a flat trend that has persisted for more than 20 years. The measure did tick up from 2012 -- from 13.1% to 13.3% -- to just about equal to what it was in 1999.

To stabilize atmospheric concentrations of carbon dioxide requires that this proportion exceed 90%, independent of how much energy the world ultimately consumes. But don't take my word for it, do the math yourself. The timing of exceeding that 90% threshold will determine the atmospheric concentration level at which stabilization ultimately occurs.

If the increase in the carbon-free proportion from 2012 to 2013 (of 0.17%) is taken as a trend going forward, then the 90% threshold will be exceeded in the year 2465. Fortunately, linear projections of most anything related to future energy are wrong.
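That projection is simple to reproduce; a minimal sketch of the linear extrapolation:

```python
import math

# Sketch: naive linear extrapolation of the carbon-free share, taking
# the 2012-2013 uptick (0.17 percentage points) as an annual trend.
share_2013 = 13.3   # percent carbon-free energy consumption in 2013
target = 90.0       # percent needed for stabilization
annual_gain = 0.17  # percentage points per year (2012 -> 2013)

years_needed = (target - share_2013) / annual_gain
year_crossed = 2013 + math.ceil(years_needed)
print(year_crossed)  # 2465
```

As the post says, linear projections of most anything related to future energy are wrong; the point is only to convey the scale of the gap.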

What you should take from this, however, is that there remains no evidence of an increase in the proportion of carbon-free energy consumption even remotely consistent with the challenge of stabilizing atmospheric carbon dioxide. Those who claim that the world has turned a corner, soon will, or that they know what steps will get us around that corner are dreamers or fools. We don't know. The sooner we accept that, the sooner we can design policies more compatible with policy learning and muddling through.

13 June 2014

Last week I had a letter in the Financial Times in which I explained the simple but powerful logic of the Kaya Identity for understanding efforts to reduce carbon dioxide emissions (for some nice discussions see here, here and here). The letter was motivated by a proposal floated by a Chinese academic that China should "cap" its emissions in the near term. I used the logic of the Kaya Identity to conclude:

It should thus not come as a surprise that carbon caps have not led to emissions reductions or even limitations anywhere. China will be no different.

In response, Paul Krugman of the New York Times took the opportunity to show great outrage at my letter, calling me a "concern troll" and "stupid." Powerful argumentation I know. Where there was some substance, Krugman made arguments against claims I did not make.

Any near-term regulation of China's greenhouse gas emissions would likely allow for future emissions growth, a senior government official said on Monday, discounting any suggestion of imminent carbon cuts by the biggest-emitting nation.

Sun Cuihua, deputy director of the climate change office at the National Development and Reform Commission, said it would be a simplification to suggest China would impose an absolute cap on greenhouse gas emissions from 2016.

No decision had yet been taken on a cap and the timing of such a measure was under discussion, she said. Several options were being considered and China would choose policies in accordance with its conditions and stage of development.

"Our understanding of the word 'cap' is different from developed countries," Sun told a conference.

09 June 2014

In 1933, Richard Gray, a U.S. government weather forecaster, noted that Florida had been hit by at least 37 hurricanes over the 45 years ending in 1930. During this period, the longest stretch with no tropical storms was only two years.

When the 2014 hurricane season officially began on June 1, the Sunshine State had gone more than eight years without being struck by a hurricane. It was back on Oct. 24, 2005, when Hurricane Wilma emerged from the Gulf of Mexico and caused billions of dollars in damage in South Florida. In fact, Wilma was the last Category 3 or stronger storm to hit the USA.

The 3,151 days and counting with no Florida hurricane and no major U.S. hurricane shatters the previous records for hurricane "droughts," at least back to the turn of the previous century. In fact, from 1900 through 2013, the United States experienced a decrease in hurricane landfalls of more than 20%, and the strength of each year's landfalling storms has also decreased by more than 20%.

The figures at the top of this post show the data from 1900 to 2013 on landfalls (data from NOAA here) and the intensity of storms at landfall (data from NOAA updated from here, courtesy of C. Landsea, NHC). Since 1900 US hurricane seasons have seen more than 20% fewer landfalls, and landfalling storms have become more than 20% less intense. In my piece I defer to the IPCC on the emotive topic of hurricanes and climate change.

The main point of the piece is that we shouldn't let the past 9 years of abnormally low hurricane activity lull us into a sense of complacency. It is only a matter of time before the long streak with no US Cat 3+ and Florida hurricanes is broken.

05 June 2014

Paul Krugman, the Princeton professor, NY Times columnist and Nobel Prize winner, has put up a post today at the NYT in response to my letter in today's Financial Times. For years, Krugman has called me names and hidden his critique behind vague allusions to my moral turpitude. In typical Krugman fashion he does that today too -- now I'm accused of pretending to hold views that I don't. Um, OK.

But let's set the drama aside and look at some data and analyses, because for the first time Krugman has engaged the substance of one of my arguments. And what he displays does not reflect well on his understanding of the nature of carbon dioxide emissions. So let's look at these issues in some detail.

The Kaya Identity is the centerpiece of the analyses found in The Climate Fix and a lot of my work. It is a very powerful tool for understanding the challenge of emissions reductions. It holds that carbon dioxide emissions are influenced by four factors:

population

GDP per capita

energy intensity of the economy

carbon intensity of energy

As an identity, it is expressed as: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)

(where P is population and E is energy consumption).

Now, the combination of population and per capita GDP is just GDP. Energy intensity (EI) reflects technologies of energy consumption (like cars and buses) and carbon intensity (CI) reflects technologies of energy production (like power plants and solar panels). Often it is useful to combine EI and CI into a single metric of CO2/GDP, the "carbon intensity of the economy."

The math here is simple. Increases in GDP, all else equal, mean that CO2 emissions go up. Improvements in technology (that is decreases in EI or CI) mean (all else equal) that CO2 emissions go down. Thus, we have two big levers with which to affect emissions - (a) GDP and (b) technologies of energy consumption and production.

Since my letter to the FT was about proposed hard caps on total CO2 emissions in China, let's illustrate the Kaya Identity with data from China (from 2011, the most recent year for which I find data).

Using the Kaya Identity in crude fashion tells us that China's CO2 emissions should have increased by ~8.1% in 2011: GDP growth of 11.1%, minus combined declines in energy and carbon intensity of 0.6% and 2.4% (that is, 11.1 - [0.6 + 2.4]). Data from the EIA show an increase of just under 9%. So very close.
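The back-of-envelope arithmetic above can be sketched in a few lines of Python (the rates are the ones quoted in the text; the function name is my own, for illustration only):

```python
# Crude Kaya Identity arithmetic: emissions growth is approximately GDP
# growth minus the combined rates of decline in energy intensity (EI)
# and carbon intensity (CI). All rates are annual percentages.

def kaya_emissions_growth(gdp_growth, ei_decline, ci_decline):
    """Approximate annual CO2 emissions growth (%) under the Kaya Identity."""
    return gdp_growth - (ei_decline + ci_decline)

# China, 2011: ~11.1% GDP growth, ~0.6% EI decline, ~2.4% CI decline
print(kaya_emissions_growth(11.1, 0.6, 2.4))  # ~8.1, vs. just under 9% reported
```

This is a linear approximation to the growth rates of the identity's terms, which is why it lands close to, but not exactly on, the reported figure.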

In order for China to "cap" CO2 emissions would require that the GDP growth rate equal the combined reductions in EI and CI. This can be achieved in two ways.

by bringing GDP down

by accelerating decreases in EI and/or CI

I have long argued that policies designed to purposely reduce economic growth by any meaningful amount are just not going to happen. On this point Krugman would seem to agree. But he also seems to think that increasing the combined rates of decline in EI and CI to 9% per year can be achieved by pricing or some sort of voodoo. It can't, because the price needed to achieve such drastic rates of decarbonization would itself impact economic growth. Hence, the iron law.

To emphasize, placing a "cap" on emissions means that the combined rates of decline in EI and CI must equal the GDP growth rate. For emissions to be reduced, that combined rate has to exceed the GDP growth rate. The math here is as simple as it is inevitable.
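That cap condition can be written as a one-line check (a sketch; the function name and the second set of rates are illustrative, not drawn from any official source):

```python
# A "cap" holds only when combined EI+CI decline matches or exceeds GDP growth.

def cap_holds(gdp_growth, ei_decline, ci_decline):
    """True if annual emissions are flat or falling at these rates (%/yr)."""
    return (ei_decline + ci_decline) >= gdp_growth

print(cap_holds(9.0, 0.6, 2.4))  # False: ~9% growth swamps ~3% decarbonization
print(cap_holds(9.0, 4.5, 4.5))  # True, but only at drastic technology rates
```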

I explained this in the FT today:

Thus, by definition, a “carbon cap” necessarily means that a government is committing to either a cessation of economic growth or to the systematic advancement of technological innovation in energy systems on a predictable schedule, such that economic growth is not constrained. Because halting economic growth is not an option, in China or anywhere else, and because technological innovation does not occur via fiat, there is in practice no such thing as a carbon cap.

In his response, Krugman displays both a lack of reading comprehension, and utter ignorance of the Kaya Identity, when he translates this as:

Wrong. I clearly explained the logic of the Kaya Identity, as I have many, many times, and it has levers beyond economic growth. Strike one.

Krugman also seems to think that my letter has something to do with Obama's power plant regulations:

Still, the power plant policy is what’s in the news and motivate Pielke’s letter.

Wrong again, Paul. The letter was motivated by China's mythical "carbon caps" and in fact was written well before the EPA proposal was even on the table. Here is what I wrote earlier this week about the EPA regulations:

First, lest there be any confusion, I support the regulations and hope that they are implemented.

Wrong again. Strike two.

Krugman keeps it up:

Even more important, there are many ways to generate electricity: coal, gas, nuclear, hydro, wind, solar — and the alternatives to coal are more competitive than ever before. That doesn’t mean that reducing emissions has no cost — but again, the idea that, say, a 30 percent fall in emissions requires a 30 percent fall in GDP is ludicrous.

Indeed, it would be ludicrous had anyone actually said that. I certainly didn't. Strike three.

What we can do is use the Kaya Identity to quantify, for various assumptions about improvements in the energy intensity of GDP (e.g., via efficiency gains), how much carbon-free energy needs to be deployed in order to stabilize atmospheric CO2 at a specific concentration target.

To achieve stabilization at a 2°C warming, we would need to install ~900 ± 500 MW [mega-watts] of carbon emissions-free power generating capacity each day over the next 50 years. This is roughly the equivalent of a large carbon emissions-free power plant becoming functional somewhere in the world every day. In many scenarios, this pace accelerates after mid-century. . . even stabilization at a 4°C warming would require installation of 410 MW of carbon emissions-free energy capacity each day.

Get that? A nuclear power plant's worth of carbon-free energy per day, every day until 2050. How's that rate of deployment coming, Paul?
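For a sense of scale, a quick check of the arithmetic, taking the ~900 MW/day central figure from the quote above at face value and ignoring the post-mid-century acceleration:

```python
# Cumulative carbon-free capacity implied by ~900 MW/day sustained for 50 years.
rate_mw_per_day = 900   # central figure from the quote; the range is +/- 500
years = 50
total_gw = rate_mw_per_day * 365 * years / 1000  # MW -> GW
print(total_gw)  # 16425.0 GW, i.e. roughly 16.4 TW of new carbon-free capacity
```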

I'd love to see Krugman's alternative math here -- for hitting a China emissions "cap" or for stabilizing carbon dioxide at a low level. Krugman is very good at calling me names. His analysis of carbon emissions not so much.

03 June 2014

The graph above shows the mix of US electricity sources for 2012 and for 2020 and 2030 under the EPA carbon regulations which were proposed yesterday by the Obama Administration. (You can click on the graph for a bigger version). Sources can be found at the bottom.

Some points:

First, lest there be any confusion, I support the regulations and hope that they are implemented. My general views on using technology standards to stimulate innovation can be found, for example, here. I'd also encourage a close look at Japan's "Top Runner" programs.

The rhetoric surrounding the regulations is tempered by this data, on both sides.

If implemented, they would represent a significant reduction in coal generation from about 39% of the mix to about 33%, a drop of about 15% from total 2012 coal generation (and under different scenarios it could be a bit more or less). The US economy has already seen a larger reduction in coal electricity generation -- a 25% drop from 2005 to 2012 -- and the economy appears to have survived intact.
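The relationship between the share change and the "about 15%" figure is easy to verify (this assumes total generation stays roughly constant, so that the change in coal's share tracks the change in coal generation):

```python
# Coal's share of US electricity generation: ~39% in 2012 vs ~33% projected 2030.
share_2012 = 0.39
share_2030 = 0.33
relative_drop = (share_2012 - share_2030) / share_2012
print(round(relative_drop * 100, 1))  # ~15.4 -- the "about 15%" in the text
```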

However, despite this reduction, the overall change to the US electricity mix is best characterized as marginal, rather than revolutionary. This is especially the case from 2020 to 2030 where there is very little projected change in the mix.

The so-called climate benefits of the regulations are thus essentially nil, though I suppose one could gin some up via creative but implausible cost-benefit analyses. Atmospheric carbon dioxide is a stock and flow problem and these proposed regulations make only a very tiny contribution to the flow side of the equation. That is just math.
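A toy stock-and-flow calculation illustrates the point; every number here is a round illustrative value of my own, not an official projection:

```python
# Atmospheric CO2 is a stock fed by an annual flow of emissions. Trimming the
# flow slightly barely changes the stock's trajectory.
stock_gt = 3200.0   # illustrative atmospheric stock, GtCO2
flow_gt = 36.0      # illustrative global annual emissions flow, GtCO2/yr
cut_gt = 0.5        # illustrative flow reduction from the regulations, GtCO2/yr
years = 16          # roughly 2014 through 2030

stock_without_cut = stock_gt + flow_gt * years
stock_with_cut = stock_gt + (flow_gt - cut_gt) * years
print(stock_without_cut - stock_with_cut)  # 8.0 GtCO2 -- well under 1% of the stock
```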

The US carbon regulations won't influence future extreme weather or its impacts in any detectable way. Hard to believe I felt compelled to write that.

The non-carbon public health benefits of decreased reliance on dirty coal are the most compelling reasons for the regulations, and they are considerable.

The bottom line is that the regulations are an important step to help motivate the electricity generating sector to move closer to the technological frontier. There will of course be winners and losers, and thus heavy politics. In addition, people love to fight about climate, and to use the climate debate as a political wedge as well as a sledgehammer for all sorts of interests. The proposed EPA regulations will be no different. The reality, at least with respect to the effect of the regulations on the energy sector, will be far more prosaic than the rhetoric.

Sources: EPA, 2014. Regulatory Impact Analysis for the Proposed Carbon Pollution Guidelines for Existing Power Plants and Emission Standards for Modified and Reconstructed Power Plants. Specifically, Panel 1, Table 2-2 here (PDF). Panels 2 and 3 are from the Option 1 (regional) scenario in Table 3-11 in the same source.

22 May 2014

The graph above shows the proportion of the planet in drought, by intensity, 1982-2012. The graph comes from a paper in a new Nature publication called Scientific Data and is open access. You can see it here.

25 April 2014

What we can say with some certainty is that the number of years with very large tornado losses has actually decreased. Consider that from 1950 to 1970 the U.S. saw 15 years with tornado damage in excess of $5 billion a year. From 1993 to 2013 there were only four such years, with three since 2008.

Read the whole thing here. The peer-reviewed paper (cited by IPCC AR5) and data on which the op-ed is based can be found here.

06 March 2014

I have an essay in The New Republic here following up on the "debate" with John Holdren. This post is so that readers can engage with me and ask questions. All are welcome but do note that this blog has a comment policy.

03 March 2014

I have a new piece out today in the Earth Island Journal. It is part of an exchange with John de Graaf, who argues that economic growth must end. My piece explores what it actually means to be anti-growth.

It has become fashionable in some circles to come out against economic growth. Bill McKibben, the author and climate change activist, asserts that “growth may be the one big habit we finally must break.” He adds that this is “a dark thing to say, and un-American.” Such calls for an end to growth are typically advanced in environmental debates and those about economic globalization. But what does it actually mean to be against economic growth? I argue that to be anti-growth actually implies keeping poor people poor.

Shali Mohleji, of the AMS Policy Program, and I have a new paper out, which builds on her dissertation work. In the paper we take a close look at the dataset on global disaster losses maintained by Munich Re. The folks at Munich Re were extremely helpful in this work.

Our paper focused on disaggregating the Munich Re global dataset into its regional and phenomenon-specific components, and then comparing trends in the disaggregated data with the peer-reviewed literature on disaster losses in regions and for specific phenomena.

The conclusion of the analysis is:

In recent years claims have been made in venues including the authoritative reports of the Intergovernmental Panel on Climate Change and in testimony before the US Congress that economic losses from weather events have been increasing beyond that which can be explained by societal change, based on loss data from the reinsurance industry and aggregated since 1980 at the global level. Such claims imply a contradiction with a large set of peer-reviewed studies focused on regional losses, typically over a much longer time period, which concludes that loss trends are explained entirely by societal change. To address this implied mismatch, we disaggregate global losses from a widely utilized reinsurance dataset into regional components and compare this disaggregation directly to the findings from the literature at the regional scale, most of which reach back much further in time. We find that global losses increased at a rate of $3.1 billion/year (2008 USD) from 1980–2008 and losses from North American, Asian, European, and Australian storms and floods account for 97% of the increase. In particular, North American storms, of which U.S. hurricane losses compose the bulk, account for 57% of global economic losses. Longer-term loss trends in these regions can be explained entirely by socioeconomic factors in each region such as increasing wealth, population growth, and increasing development in vulnerable areas. The remaining 3% of the global increase 1980 to 2008 is the result of losses for which regionally based studies have not yet been completed. On climate time scales, societal change is sufficient to explain the increasing costs of disasters at the global level and claims to the contrary are not supported by aggregate loss data from the reinsurance industry.

The bottom line here is that a signal of greenhouse gas emissions cannot be found in the aggregate loss data from Munich Re. Those making claims to the contrary should take note.

If you would like a pre-publication copy of the full manuscript, please send me an email request.

To accuse an academic of holding views that lie outside the scientific mainstream is the sort of delegitimizing talk that is of course common on blogs in the climate wars. But it is rare for a political appointee in any capacity -- the president's science advisor no less -- to accuse an individual academic of holding views that are not simply wrong, but in fact scientifically illegitimate. Very strong stuff.

Given the seriousness of Holdren's charges and the possibility of negative professional repercussions, via email I asked him to substantiate or correct his characterization, to which he replied quite quickly that he would do so in the form of a promised follow-up to the Senate subcommittee.

Here is what I sent him:

Dear John-

I hope this note finds you well. I am writing in response to your characterization of me before the Senate Environment and Public Works Committee's Subcommittee on Oversight yesterday, in which you said that my views lie "outside the scientific mainstream."

This is a very serious charge to make in Congressional testimony about a colleague's work, even more so when it comes from the science advisor to the president.

The context of your comments about me was an exchange that you had with Senator Sessions over my recent testimony to the full EPW Committee on the subject of extreme events. You no doubt have seen my testimony (having characterized it yesterday) and which is available here:
http://sciencepolicy.colorado.edu/admin/publication_files/2013.20.pdf

Your characterization of my views as lying "outside the scientific mainstream" is odd because the views that I expressed in my testimony are entirely consonant with those of the IPCC (2012, 2013) and those of the US government's USGCRP. Indeed, much of my testimony involved reviewing the recent findings of IPCC SREX and AR5 WG1. My scientific views are also supported by dozens of peer reviewed papers which I have authored and which have been cited thousands of times, including by all three working groups of the IPCC. My views are thus nothing if not at the center of the "scientific mainstream."

I am writing to request from you the professional courtesy of clarifying your statement. If you do indeed believe that my views are "outside the scientific mainstream" could you substantiate that claim with evidence related specifically to my testimony which you characterized pejoratively? Alternatively, if you misspoke, I'd request that you set the record straight to the committee.

I welcome your response at your earliest opportunity.

Today he has shared with me a six-page, single-spaced response which he provided to the Senate subcommittee titled "Critique of Pielke Jr. Statements on Drought." Here I take a look at Holdren's response.

In a nutshell, Holdren's response is sloppy and reflects extremely poorly on him. Far from showing that I am outside the scientific mainstream, Holdren's follow-up casts doubt on whether he has even read my Senate testimony. Holdren's justification for seeking to use his position as a political appointee to delegitimize me personally reflects poorly on his position and office, and his response simply reinforces that view.

His response (which you can see here in full in PDF) focuses entirely on drought -- whereas my testimony focused on hurricanes, floods, tornadoes and drought. But before he gets to drought, Holdren gets off to a bad start when he shifts the focus away from my testimony to an article on a website called "The Daily Caller" (apparently a minor conservative or Tea Party site; the article appears to be this one).

Holdren writes:

Dr. Pielke also commented directly, in a number of tweets on February 14 and thereafter, on my February 13 statements to reporters about the California drought, and he elaborated on the tweets for a blog post on The Daily Caller site (also on February 14). In what follows, I will address the relevant statements in those venues, as well. He argued there, specifically, that my statements on drought “directly contradicted scientific reports”, and in support of that assertion, he offered the same statements from his July testimony that were quoted by Senator Sessions.

Let me be quite clear -- I did not write anything for "The Daily Caller" nor did I speak or otherwise communicate to anyone there. The quote that Holdren attributes to me - “directly contradicted scientific reports” -- is actually written by "The Daily Caller." Why that blog has any relevance to my standing in the "scientific mainstream" eludes me, but whatever. This sort of sloppiness is inexcusable.

Leaving the silly misdirection aside -- common on blogs but unbecoming of the science advisor to the most powerful man on the planet -- let's next take a look at Holdren's substantive complaints about my recent Senate testimony.

As a starting point, let me reproduce in its entirety the section of my Senate testimony (here in PDF) which discussed drought.

Drought

What the IPCC SREX (2012) says:

“There is medium confidence that since the 1950s some regions of the world have experienced a trend to more intense and longer droughts, in particular in southern Europe and West Africa, but in some regions droughts have become less frequent, less intense, or shorter, for example, in central North America and northwestern Australia.”

For the US the CCSP (2008)20 says: “droughts have, for the most part, become shorter, less frequent, and cover a smaller portion of the U. S. over the last century.”21

What the data says:

8. Drought has “for the most part, become shorter, less frequent, and cover a smaller portion of the U. S. over the last century.”22

Figure 8. Figure 2.6 from CCSP (2008) has this caption: “The area (in percent) of area in severe to extreme drought as measured by the Palmer Drought Severity Index for the United States (red) from 1900 to present and for North America (blue) from 1950 to present.”

Note: Writing in Nature, Seneviratne (2012) argues with respect to global trends that, “there is no necessary correlation between temperature changes and long-term drought variations, which should warn us against using any simplifications regarding their relationship.”23

21 CCSP (2008) notes that “the main exception is the Southwest and parts of the interior of the West, where increased temperature has led to rising drought trends.”

22 This quote comes from the US Climate Change Science Program’s 2008 report on extremes in North America.

23 http://www.nature.com/nature/journal/v491/n7424/full/491338a.htm

Let's now look at Holdren's critique which he claims places me "outside the scientific mainstream."

Holdren Complaint #1: "I will show, first, that the indicated quote [RP: This one: "“droughts have, for the most part, become shorter, less frequent, and cover a smaller portion of the U. S. over the last century.”21"] from the US Climate Change Science Program (CCSP) about U.S. droughts is missing a crucial adjacent sentence in the CCSP report, which supports my position about drought in the American West. . . That being so, any reference to the CCSP 2008 report in this context should include not just the sentence highlighted in Dr. Pielke’s testimony but also the sentence that follows immediately in the relevant passage from that document and which relates specifically to the American West."

What is that sentence in question from the CCSP 2008 report that Holdren thinks I should have included in my testimony? He says it is this one:

"The main exception is the Southwest and parts of the interior of the West, where increased temperature has led to rising drought trends."

Readers (not even careful readers) can easily see Footnote 21 from my testimony, which states:

CCSP (2008) notes that “the main exception is the Southwest and parts of the interior of the West, where increased temperature has led to rising drought trends.”

Um, hello? Is this really coming from the president's science advisor?

Holdren's reply next includes a section on drought and climate change which offers no critique of my testimony, and which needs no response from me.

Holdren Complaint #2: Holdren implies that I neglected to note the IPCC's reference to the fact that drought is a regional phenomenon: "Any careful reading of the 2013 IPCC report and other recent scientific literature about on the subject reveals that droughts have been worsening in some regions in recent decades while lessening in other regions."

Again, even a cursory reading of what I quoted from the IPCC shows that Holdren's complaint does not stand up. Here is the full quote that I included in my testimony from the IPCC on drought:

“There is medium confidence that since the 1950s some regions of the world have experienced a trend to more intense and longer droughts, in particular in southern Europe and West Africa, but in some regions droughts have become less frequent, less intense, or shorter, for example, in central North America and northwestern Australia.”

Again, hello? Seriously?

Holdren Complaint #3: Near as I can tell Holdren is upset that I cited a paper from Nature that he does not like, writing, "Dr. Pielke’s citation of a 2012 paper from Nature by Sheffield et al., entitled “Little change in global drought over the past 60 years”, is likewise misleading."

He points to a January 2014 paper in Nature Climate Change as offering a rebuttal to Sheffield et al. (2012).

The first point to note in response is that my citing a paper which appears in Nature does not provide evidence of my being "outside the scientific mainstream," no matter how much Holdren disagrees with the paper. Academics in the "scientific mainstream" cite peer-reviewed papers, sometimes even those in Nature. Second, my testimony was delivered in July 2013, and the paper he cites as a rebuttal was submitted in August 2013 and only published in early 2014. I can hardly be faulted for not citing a paper which had not yet appeared. Third, the 2014 paper that Holdren likes better actually supports the IPCC conclusions on drought and my characterization of them in my Senate testimony. The authors write (PDF):

How is drought changing as the climate changes? Several recent papers in the scientific literature have focused on this question but the answer remains blurred.

The bottom line here is that this is an extremely poor showing by the president's science advisor. It is fine for experts to openly disagree. But when a political appointee uses his position not just to disagree on science or policy but to seek to delegitimize a colleague, he has gone too far.