This week I'll be focusing on issues of science advice to governments, as I attend the "Science Advice to Governments" conference in Auckland, New Zealand. It is being characterized as a science advice "summit." I'll be participating on a panel focused on "Science advice in the context of opposing political / ideological positions" along with the chief scientific advisor to the Australian government, Ian Chubb, and the chief scientific advisor to Defra, Ian Boyd, among others.

The conference is being convened by ICSU and hosted by Sir Peter Gluckman, chief science advisor to the New Zealand government. Not long ago James Wilsdon previewed the conference:

The summit will take place in a year when we've seen important debates in scientific advisory systems worldwide. In the UK, Sir Mark Walport is about to mark his first year as Government Chief Scientific Adviser, during which he has had to tread a careful path through controversies over bees, badgers, fracking and flooding. In Brussels, Anne Glover, Chief Scientific Adviser to the President of the European Commission, has been working tirelessly to persuade more EU member states to appoint national scientific advisers, with a view to establishing an EU-wide network. In Japan, three years on from the Great East Japan earthquake and nuclear meltdown at Fukushima, arguments continue about how to reform structures for scientific advice and risk management. And at the United Nations, a new Scientific Advisory Board, hosted by UNESCO, held its inaugural meeting at the end of January 2014.

All of this suggests that the Auckland meeting couldn't be happening at a better time. Sir Mark Walport, Anne Glover, and their equivalents from India, Malaysia, Japan, Germany, Australia and the Philippines, will be among those attending. Scientific advisers, policymakers, academics – and anyone with an interest in these debates – are invited to register, or to follow developments online.

24 August 2014

With news of a 6.0 magnitude earthquake today in San Francisco, I thought I'd provide a perspective on historical damage. The data in the table below are estimates of normalized damage for the top 15 events in our dataset -- from Vranes and Pielke 2009 (PDF), which I have quickly updated to 2014 values. A normalization seeks to estimate how much damage would occur if a past event occurred with today's level of wealth and development.

There are a lot of uncertainties in earthquake normalization methods, and those interested in digging deeper should have a look at our paper for the gory details. The top event is the 1906 San Francisco earthquake, which reminds us that while big earthquakes are rare, they can do lots of damage. For perspective, a repeat of the 1906 San Francisco earthquake could cause more than twice the damage caused by all US tornadoes since 1950.
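The basic logic of normalization can be sketched in a few lines. This is a simplified illustration, not the exact Vranes and Pielke (2009) method, and every factor value below is hypothetical:

```python
# Simplified sketch of loss normalization: scale a historical loss for
# inflation, population growth, and per-capita wealth growth so it is
# comparable to today's exposure. All factor values are illustrative.

def normalize_loss(loss_then, inflation_factor, population_factor, wealth_factor):
    """Estimate the damage a past event would cause at today's exposure."""
    return loss_then * inflation_factor * population_factor * wealth_factor

# Hypothetical example: a $500M historical loss with illustrative multipliers
normalized = normalize_loss(500e6, inflation_factor=25.0,
                            population_factor=4.0, wealth_factor=3.0)
print(f"Normalized loss: ${normalized / 1e9:.0f}B")  # prints "Normalized loss: $150B"
```

The real method works at a finer geographic scale (exposure in the affected counties, not national totals), but the multiplicative structure is the same.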

20 August 2014

There are lots of interesting new studies emerging on trends in disasters and extreme events, and their possible relation to changes in climate, human-caused or otherwise. A new paper in the Hydrological Sciences Journal (Stevens et al. here in PDF) finds no evidence for an increase in UK flooding, once the data is normalized for exposure.

The authors conclude:

Consequences are the combined results of high river flows, pluvial flooding and coastal flooding, the numbers of people and property exposed to flooding and the effects of flood defence construction and floodplain management policies. The increase in the total number of reported flood events in the 20th century in the UK appears to be a function of the gradual increase in exposure due to urban expansion and population growth. However there is also greater capacity to report flood events. The number of reported ‘Class 3’ flooding events has remained static or decreased slightly over the 20th Century. This is despite the UK population almost doubling and the number of dwelling houses tripling over the same time period.

There is no clear underlying trend in flood reports present in the UK flood data when it is normalised for exposure. Pielke Jr. and Landsea (1998) studied damage caused by hurricanes in the USA. They also found that normalising damage reports to take account of exposure removed the upward trend of losses over time and only left a large decade to decade variation in losses. The lack of a systematic trend in the normalised UK total flood count mirrors these findings. It is also in agreement with studies of trends in river flows (Robson 2002).

As frequent readers here will appreciate, the best way to evaluate the fidelity of any normalization approach is to compare trends in the normalized losses with trends in the underlying geophysical events. That check is passed in this study.

16 August 2014

The Belgian think tank Bruegel points to data showing that the United Kingdom's GDP has returned to pre-economic crisis levels, as shown above. This allows us to do a quick and intuitive examination of how much the UK economy has decarbonized over that time period, and how that rate of decarbonization compares to that implied by the UK Climate Change Act.

As a refresher, decarbonization refers to the rate of decline in the ratio of carbon dioxide emissions to GDP. In order for the UK to hit the targets prescribed in the UK Climate Change Act for 2022, it will need to consistently achieve an annual rate of decarbonization of more than 3%, for any GDP growth rate greater than 1% per year. For more detail, and a full exploration of the quantitative implications of the UK Climate Change Act for decarbonization of the British economy, see my 2009 paper in ERL (open access).

With UK GDP in 2014 at the same level as in 2008, we can calculate a simple rate of decarbonization: it will be exactly equal to the annual rate of emissions decline.

UK carbon dioxide emissions for the 12 months ending the second quarter of 2008 were 536.1 million metric tonnes (data here in XLS). Emissions for the trailing 12 months ending the first quarter of 2014 were 507.9 million metric tonnes (data here in XLS).

These data imply a rate of decarbonization of -0.9% per year. This is far less than would be needed to hit the targets of the UK Climate Change Act. Last year I calculated an update of the UK decarbonization rate through 2012, which arrived at a similar result. That calculation is shown below.
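The arithmetic behind that figure is straightforward. With GDP flat over the period, the decarbonization rate equals the annualized rate of emissions decline:

```python
# Annualized decarbonization rate implied by the emissions figures above.
# Because GDP was flat over the period, the decarbonization rate is just
# the compound annual rate of emissions decline.

e_2008 = 536.1  # Mt CO2, 12 months ending Q2 2008
e_2014 = 507.9  # Mt CO2, 12 months ending Q1 2014
years = 5.75    # Q2 2008 through Q1 2014

rate = (e_2014 / e_2008) ** (1 / years) - 1
print(f"{rate:.1%} per year")  # prints "-0.9% per year"
```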

It is also possible to express the magnitude of the challenge of meeting the targets of the UK Climate Change Act in more intuitive terms. The graph below shows how much carbon-free energy (not electricity) would need to be deployed by 2022 assuming constant demand to 2022.

In my 2009 paper, which was written upon passage of the UK Climate Change Act in 2008, I concluded:

The approach to emissions reduction embodied by the Climate Change Act is exactly backwards. It begins with setting a target and then only later do policy makers ask how that target might be achieved, with no consideration for whether the target implies realistic or feasible rates of decarbonization. The uncomfortable reality is that no one knows how fast a major economy can decarbonize. Both the 2022 interim and 2050 targets require rates of decarbonization far in excess of what has been observed in large economies at any time in the past. Simply making progress to the targets requires steps of a magnitude that seem practically impossible, such as the need for the UK to achieve a carbon efficiency of its economy equal to that of France in 2006 in a time period considerably less than a decade.

Further, the focus on emissions rather than on decarbonization means that it would be very easy for policy makers to confuse emissions reductions resulting from an economic downturn with some sort of policy success (cf. McGee 2009). However, as implicit in the Kaya identity, a lower GDP does very little to change the role of energy technology in the economy. So during a downturn emissions may level off or even decrease, even as policy makers of course seek to preserve (and even accelerate) economic growth. Consequently, a more directly useful metric for policy success for efforts to stabilize carbon dioxide concentrations in the atmosphere is the decarbonization of the economy, represented in terms of carbon dioxide emissions per unit GDP.
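The Kaya identity point can be made concrete with a toy calculation. All numbers below are purely illustrative, not actual UK figures: a recession lowers the GDP term, and therefore emissions, while leaving the two technology terms (energy intensity and carbon intensity) untouched.

```python
# Kaya identity:
#   CO2 = population * (GDP / population) * (energy / GDP) * (CO2 / energy)
# A downturn shrinks the GDP-per-capita term, and hence emissions, with no
# change whatsoever to the technology terms. Values are illustrative only.

def kaya_emissions(population, gdp_per_capita, energy_intensity, carbon_intensity):
    return population * gdp_per_capita * energy_intensity * carbon_intensity

base = kaya_emissions(64e6, 40_000, 5.0, 0.06)       # baseline economy
recession = kaya_emissions(64e6, 38_000, 5.0, 0.06)  # 5% GDP drop, same technology

print(f"Emissions fall {1 - recession / base:.0%} with zero decarbonization")
```

This is why emissions per unit GDP, rather than absolute emissions, is the more informative metric of policy success: it is insensitive to the business cycle.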

A focus on decarbonization as the central goal of carbon policy rather than emissions reductions means that to achieve specific stabilization targets the rate of decarbonization of the UK economy must not only exceed the rate of economic growth, but it must exceed rates of decarbonization observed historically in the UK and in other developed countries. Because no one knows how fast a large economy can decarbonize, any policy (or policies) focused on decarbonization will have to proceed incrementally, with constant adjustment based on the proven ability to accelerate decarbonization (cf. Anderson et al. 2008). Setting targets and timetables for emissions reductions absent knowledge of the ability to decarbonize is thus just political fiction...

The failure of the UK Climate Change Act is yet to be broadly recognized, but when it is, it will provide an opportunity to recast carbon policies in a more effective manner.

How do we know if the normalization is any good? We compare it to trends in hurricane frequency and intensity, and when we do so we find no evidence for a residual bias in our methods. Specifically, US hurricanes have not become more frequent or intense, so there is simply no basis to expect an increase in normalized losses. Of course, this analysis has been replicated several times as well, using different methods and loss data.

Here are the data on trends in US hurricane landfall frequency and intensity: