
Hurricane risk at high resolution

10 December 2008

We are using high resolution climate models and supercomputers to assess future hurricane risk to the United States and the Caribbean at a precision never before seen. By Greg Holland with James Done, Jim Hurrell, David Hosansky and Asuka Suzuki

Assessing hurricane risk has traditionally been accomplished by applying advanced statistical techniques to existing hurricane archives. Usually, the archive serves as the basis for developing a synthetic hurricane data set extending over thousands of years, enabling stable statistics on return periods and risk assessment. This approach has been remarkably successful in establishing a benchmark for insurance risk assessment, but it has some major inherent limitations, particularly the quality and length of the available records and the inability to detect a changing climate.
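The synthetic-catalogue idea can be sketched in a few lines. This is an illustrative toy, not any vendor's actual model: the annual wind maxima below are randomly generated (real catalogues are built from physically and statistically constrained storm sets), and `return_period` simply inverts the empirical exceedance frequency.

```python
import random

random.seed(0)
YEARS = 10_000  # length of the synthetic record, far longer than the ~50-year archive

# Toy annual maximum wind speeds in m/s (invented numbers for illustration only)
annual_max = [random.gauss(45, 10) for _ in range(YEARS)]

def return_period(threshold, series):
    """Average number of years between exceedances of `threshold`."""
    exceedances = sum(1 for v in series if v >= threshold)
    return len(series) / exceedances if exceedances else float("inf")

print(f"~{return_period(65, annual_max):.0f}-year return period for 65 m/s winds")
```

The long synthetic record is what makes the exceedance count stable; estimating the same quantity directly from a 50-year archive would rest on only a handful of events.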

For example, the reliable hurricane data record is rarely more than 50 years long and, for many regions of the globe, it is often much shorter. As a result, the statistical characteristics of extremes, such as the most intense hurricanes or extreme rainfall events, may not be reliably sampled. Changing observing systems and analysis methodologies may mask real short-term climate changes or even inter-annual variations, and the data cannot provide an adequate assessment of real trends associated with long-period climate variability or change.

An alternative approach is to use computer climate models to assess future hurricane statistics. Such climate models have formed the basis of the Intergovernmental Panel on Climate Change (IPCC) assessments. However, the current generation of models is inadequate for hurricane assessments; these models generate climate projections by calculating the relevant dynamic and physical laws on a discrete grid over the globe, the spacing of which is dictated by available computing power.

For the last IPCC assessment, the horizontal grid used by the National Center for Atmospheric Research (NCAR) Community Climate System Model had approximately 180km between grid points. This is incapable of resolving the high-intensity inner core of a hurricane, where the maximum wind belt typically lies at a radius of 20-50km from the centre. It can neither simulate the important small-scale interactions, at the level of tropical thunderstorms, that influence cyclone formation, nor account for the impact of tropical cyclones on the larger-scale climate. Indeed, our weather forecasting experience has shown that grid spacing of less than 4km is required to resolve adequately the critical internal hurricane characteristics.

Computing power has improved remarkably and continues to follow Moore's Law of doubling in capacity every two years. Unfortunately, for each halving of the grid spacing, the computing requirements increase roughly 10-fold, so that improving from the 180km to a 4km grid requires a roughly 100,000-fold increase in computing capacity. This will not be possible for several decades. The next IPCC assessment will include decadal climate predictions with grid spacing of around 50km, but even this is incapable of resolving hurricane intensity and structural characteristics.
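The arithmetic behind these figures can be checked with a common back-of-envelope rule (an assumption here, not a statement of the authors' exact accounting): cost grows with the cube of the refinement ratio, from two horizontal dimensions plus a proportionally shorter time step.

```python
import math

def compute_cost_factor(coarse_km, fine_km):
    # Cubic scaling: doubling resolution in x and y quadruples the grid points,
    # and the time step must shrink in proportion, doubling the step count.
    ratio = coarse_km / fine_km
    return ratio ** 3

factor = compute_cost_factor(180, 4)
print(f"180 km -> 4 km grid: ~{factor:,.0f}x more computing")  # ~91,125x, of order 100,000

# Moore's-Law doublings (one every two years) needed to close that gap
years = 2 * math.log2(factor)
print(f"~{years:.0f} years of hardware growth")  # several decades, as the text says
```

The cube of 45 is about 91,000, matching the article's order-of-magnitude figure of 100,000, and closing that gap by hardware growth alone would indeed take on the order of three decades.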

An alternative approach

We are trialling an alternative approach by combining the NCAR climate system and Advanced Research Weather Research and Forecasting computer models to take advantage of the best features of each. The climate system model is being used to project global climate into the future under different greenhouse gas scenarios, and our advanced weather research model is being incorporated into it to zoom in to high resolution over the North Atlantic and North America, a process called nesting from the manner in which the detailed grids fit successively within the coarse global grid, as Figure 1 shows. The combination of climate and weather models makes up a new computer modelling system, which we have designated the NCAR nested regional climate model (NRCM).

The nested model is being used to assess issues of concern to the insurance and energy industries, with specific focus on the major reports by the US Climate Change Science Program and the IPCC that have found evidence for a link between global warming and increased hurricane activity. But many questions remain about future hurricane activity. For example, the US programme report concluded that future changes in frequency were uncertain, and that rainfall and wind intensity were likely to increase, but with unknown consequences.

An example of the importance of grid resolution is shown in Figure 2. During the two-year testing period for the nested model, we simulated the 2005 hurricane season at 36km and 12km grid spacing. The 36km simulations indicated a very active year, with 18 tropical storms and hurricanes. But this was nowhere near the record number of 27 storms that occurred, and the formation locations were displaced well poleward and westward of the actual developments. Simply adding a further nest at 12km grid spacing increased the number of storms to 28, many of which also developed in the eastern North Atlantic, as observed.

Based on this experience and testing, we are now simulating hurricane climate statistics for the North Atlantic and North America in three independent ways:

First, we have made a continuous simulation of global climate with the climate system model over 1950-2055 using three greenhouse gas scenarios designated by the IPCC. Details of North Atlantic hurricanes are then simulated using the nested regional model in the series of embedded nests shown in Figure 1.

An intermediate grid with 36km grid spacing simulates the broad aspects of the climate of Northern Africa, the Atlantic and North America, with special care given to resolve African easterly waves, the trade wind flow and the vertical structure of the winds, all of which are known to be important modulators of hurricane development. A further nesting to 12km grid spacing enables development of a hurricane climatology within the broader scale climate. Finally, special nests at 4km grid spacing will be used to focus on specific important aspects, such as investigating Category 5 hurricanes.
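The reason the nests must telescope from 36km through 12km down to 4km, rather than jumping straight to the finest grid, follows from the cost scaling discussed earlier. The sketch below assumes a 3:1 parent-to-child refinement ratio (the conventional choice in WRF nesting, stated here as an assumption) and the cubic cost rule.

```python
# Telescoping nests described above: 36 km -> 12 km -> 4 km, each a 3:1 refinement.
# Relative computing cost per unit area scales roughly with (coarse/fine)**3.
nests = [36, 12, 4]  # grid spacing in km, coarse to fine

for spacing in nests:
    rel_cost = (nests[0] / spacing) ** 3
    print(f"{spacing:>2} km nest: ~{rel_cost:,.0f}x the per-area cost of the 36 km grid")
```

Each 4km nest is roughly 729 times as expensive per unit area as the 36km grid, which is why the finest nests are reserved for specific targets such as Category 5 hurricanes rather than run over the whole domain.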

Because of the enormous computing requirements, the nested simulations can only be undertaken for three time periods: 1995-2005 for current climate, and 2020-2030 and 2045-2055 for future climate. A snapshot of a hurricane in the Gulf of Mexico from October 2046 is shown in Figure 3. To complement these direct simulations and fill in the intermediate periods we are applying advanced statistical downscaling methods to the global climate simulation from 1950-2055. By using three separate sets of simulations using different greenhouse gas scenarios, we will also be able to provide some basic statistics on overall uncertainty in the hurricane assessments.
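One simple flavour of statistical downscaling is to fit a relationship between a large-scale predictor available from the continuous global simulation and a quantity only the nests resolve, then apply it to the in-between years. The sketch below is a deliberately minimal illustration of that idea, not the project's actual method; the predictor choice (a tropical Atlantic sea surface temperature anomaly) and every number are invented.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a single predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Toy training data from the directly simulated decades:
# (SST anomaly in degrees C, seasonal storm count) -- invented values.
sst = [-0.2, 0.0, 0.3, 0.5, 0.8]
storms = [9, 10, 13, 14, 17]

a, b = fit_line(sst, storms)
print(f"storms ~= {a:.1f} * SST_anomaly + {b:.1f}")
print(f"downscaled count for a +0.6 C year: {a * 0.6 + b:.1f}")
```

Real downscaling schemes use many predictors and more careful statistics, but the structure is the same: calibrate on the periods the nests cover, predict the periods they cannot afford to.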

This is a massive logistics and computing exercise, involving a full year of preparation and simulation and six months of dedicated 24/7 computer facility time. It is placing intense demands on bluefire, NCAR's flagship supercomputer. Manufactured by IBM, it is ranked as one of the 50 most powerful computers on earth and can perform up to 76 trillion calculations every second. The data generated by the NRCM simulations will exceed 350 terabytes, equivalent to four times the entire world-wide web data captured and archived by the US Library of Congress.

To enable this massive simulation, the hurricane project is part of a larger effort examining regional climate change between 1995 and 2055 across North America. The simulations are being run on the bluefire supercomputer with support from the National Science Foundation, NCAR's sponsor, and a long-term partnership with the insurance industry through the Willis Research Network, a collaboration between insurance and academia.

Additional backing is provided by the Research Partnership to Secure Energy for America, a non-profit consortium that includes the US Department of Energy and several energy companies. The simulations were still underway at the time of writing and a preliminary analysis of the outcomes will be presented in a future issue of Catastrophe Risk Management.

These computer simulations provide a potentially valuable additional tool in support of catastrophe modelling. Our longer term goal in the Willis Research Network is to combine the traditional catastrophe modelling approach of using statistical analysis of archived hurricane data, with this new generation of high resolution climate models and sophisticated statistical downscaling techniques. Our expectation is that by combining these independent approaches we can arrive at a much improved assessment of future hurricane return periods and an unprecedented degree of detail on associated structural features such as rainfall, size and geographical location.

Postscript

Greg Holland is a senior scientist at the National Center for Atmospheric Research (NCAR) and a member of the Willis Research Network.

James Done, who is a Willis Research Fellow, is a project scientist at NCAR. Jim Hurrell is a senior scientist at UCAR and David Hosansky is a member of the communications team. Asuka Suzuki is a graduate student at Georgia Institute of Technology, Atlanta, Georgia.

gholland@ucar.edu

www.mmm.ucar.edu

www.willisresearchnetwork.com
