Tag: statistics

Above: Junkins Fire, southwest of Pueblo, Colorado. Photo provided by the Incident Management Team on October 19, 2016.

There have been many discussions recently on this website, in the scientific community, and in the more public arena about why the number of acres burned in wildfires has been increasing rapidly over the last several decades. From the mid-1980s through 2015 the average number of acres burned annually grew from about 2 million to around 8 million. Some people like to claim that this was caused by climate change, environmentalists preventing timber from being harvested, or other factors. However, to complicate the issue, some data appears to indicate that between 1920 and 1950, 10 million to 50 million acres burned each year.

It is very difficult to say that any one factor caused fire occurrence to change. When comparing acres burned in the early part of the 20th century with what we are seeing in recent decades, many variables need to be considered:

–Weather and climate trends. This has been vigorously discussed in many venues.

–The capacity to suppress wildfires. In the first two-thirds of the 20th century the ability of land managers to suppress wildfires was very different from what we have today. Fire engines now carry many times more water, and transportation systems enable quicker initial attack response. Helicopters and air tankers were brought into the equation. Heavy equipment became more prolific and capable. More efficient communication and dispatch systems were created. Fires are detected more quickly. Firefighters are routinely brought in from hundreds or even thousands of miles away.

–The wildland-urban interface is growing. More people are living and recreating where previously there was less human activity. This can increase the number of fire starts, and the endangered structures often have an effect on the priorities of firefighters and where they are deployed, as opposed to concentrating forces where they are most likely to contain the fire.

–Changes in how timber is managed and harvested.

–Fire suppression can result in longer fire return intervals and increases in the amount of fuel available for the next fire.

–Fuel treatments: mechanical and prescribed fire.

–Changes in the vegetation: non-native species, insects, disease.

–Accuracy of the fire occurrence data. Can the data for 1920 be compared with the data from 2016? The fire occurrence data at NIFC for 1962 through 2015 shows a huge swing between 1983 and 1984, when the number of fires reported dropped virtually overnight by about 50 percent, a level that persisted through 2015. This leads one to lose confidence in the data. One would think that the more modern era, post-1984, would have more accurate information than previous decades.

–Changes in wildfire management policy: full, limited, or no suppression.

Our readers will probably suggest even more factors that affect the number of fires and acres burned.

Considering all of these variables, I am skeptical of reports saying that just one is responsible for changes in the number of fires and acres burned.

Above: Acres burned in the United States, 1986 through 2015. Data from NIFC, compiled by Bill Gabbert.

A new study released yesterday concludes that human-caused climate change is responsible for nearly doubling the number of acres burned in western United States wildfires during the last 30 years.

Fires in western forests began increasing abruptly in the 1980s, as measured by area burned, the number of large fires, and length of the fire season. The increases have continued, and recently scientists and public officials have in part blamed human-influenced climate change. The new study is perhaps the first to quantify that assertion. “A lot of people are throwing around the words climate change and fire–specifically, last year fire chiefs and the governor of California started calling this the ‘new normal,’ ” said lead author John Abatzoglou, a professor of geography at the University of Idaho. “We wanted to put some numbers on it.”

Warm air can hold more moisture, so as the temperature rises the relative humidity decreases. Low humidity draws more moisture out of live and dead plants as well as the soil. Plants are the fuel for wildfires, and lower fuel moisture means fires can burn more rapidly, with increased intensity and resistance to control. Average temperatures in forested parts of the U.S. West have risen about 2.5 degrees F since 1970 and are expected to keep rising. The resulting drying effect is evident in the increase in fires.
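The temperature-humidity relationship described above can be sketched numerically. This illustration uses the Tetens approximation for saturation vapor pressure; the 40 percent starting humidity and 25 °C starting temperature are arbitrary values chosen only for the example:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Tetens approximation for saturation vapor pressure, in kPa."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def relative_humidity(vapor_pressure_kpa, temp_c):
    """Relative humidity (%) for a fixed amount of water vapor in the air."""
    return 100.0 * vapor_pressure_kpa / saturation_vapor_pressure(temp_c)

# Hold the actual moisture content constant and warm the air by
# about 2.5 degrees F (roughly 1.4 degrees C).
e = 0.4 * saturation_vapor_pressure(25.0)   # moisture giving 40% RH at 25 C
rh_after_warming = relative_humidity(e, 26.4)
# rh_after_warming lands a few percentage points below 40:
# same moisture in the air, but drier fuels on the ground.
```

The same amount of water vapor yields a lower relative humidity at the higher temperature, which is the drying mechanism the paragraph above describes.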

Jasper Fire pyrocumulus, about two hours after the fire started, August 24, 2000 west of Custer, South Dakota. NPS photo by Bill Gabbert.

The overall increase in fire since the 1980s is about twice what the researchers attribute to climate change; the rest is due to other factors, they say. One has been a long-term natural climate oscillation over the Pacific Ocean that has steered storms away from the western United States. Another: firefighting itself. By constantly putting out fires, authorities have allowed areas they “saved” to build up more dry fuel, which later ignites, causing ever more catastrophic blazes, the researchers say. The costs of fire fighting have risen sharply in step; last year the federal government alone spent more than $2.1 billion. “We’re seeing the consequence of very successful fire suppression, except now it’s not that successful anymore,” said Abatzoglou.

The authors isolated the effects of climate warming from other factors by looking at eight different systems for rating forest aridity; these included the Palmer Drought Severity Index, the MacArthur Forest Fire Danger Index and the Canadian Forest Fire Danger Rating System. They then compared such measurements with observations of actual fires and large-scale climate models that estimate manmade warming. The crunched data showed that 55 percent of the increase in fuel aridity expected to lead to fires could be attributed to human-influenced climate change. Climate’s role in increasing such aridity has grown since 2000, the researchers say, and will continue to do so.

(The graphic below is from the study.)

The researchers found that anthropogenic climate change accounted for about 55% of observed increases in fuel aridity from 1979 to 2015 across western US forests, highlighting both anthropogenic climate change and natural climate variability as important contributors to increased wildfire potential in recent decades.

Mr. Abatzoglou and coauthor Park Williams, a bioclimatologist at Columbia University’s Lamont-Doherty Earth Observatory, say they do not account for some factors that could be offshoots of climate warming, and thus they may be understating the effect. These include millions of trees killed in recent years by beetles that prefer warmer weather, and declines in spring soil moisture brought on by earlier snowmelt. There is also evidence that lightning may increase with warming.

The study does not cover western grasslands. These have seen more fires too, but there is little evidence that climate plays a role there, said Mr. Abatzoglou; rather, the spread of highly flammable invasive grasses appears to be the main driver.

Mike Flannigan, a fire researcher at the University of Alberta, said that previous studies have tried to understand the effects of climate on fires in parts of Canada, but that nothing had been done for the United States on this scale. “What’s great about this paper is that it quantifies this effect, and it does it on a national scale,” he said.

Worldwide, wildfires of all kinds have been increasing, often with a suspected climate connection. Many see a huge fire that leveled part of the northern city of Fort McMurray, Alberta, this May as the result of a warming trend that is drying out northern forests. Fires have even been spreading beyond, into the tundra, in places where blazes have not been seen for thousands of years. That said, fires are not expected to increase everywhere. “Increased fire in a lot of places agrees with the projections,” said Jeremy Littell, a research ecologist with the U.S. Geological Survey in Anchorage, Alaska. “But in many woodlands, the relationship between climate and fire is not as tidy.”

Many scientists studying the issue believe the growth in U.S. western fires will continue for many years. Mr. Williams and others say that eventually, so many western forests will burn, they will become too fragmented for fires to spread easily, and the growth in fire will cease. But, he says, “there’s no hint we’re even getting close to that yet. I’d expect increases to proceed exponentially for at least the next few decades.” In the meantime, he said, “It means getting out of fire’s way. I’d definitely be worried about living in a forested area with only one road in and one road out.”

The number of hectares burned in Canada this year is far ahead of average.

In the United States the acres burned to date are 46 percent above normal. Statistics from May 20, 2016, the last time the National Interagency Fire Center issued a daily Situation Report, show that 1,551,474 acres had burned, compared to the 10-year to-date average of 1,063,835. Almost two-thirds of that was in the Southern Geographical Area — 955,242. The Rocky Mountain Geographical Area also had a surprisingly high figure — 374,846 acres; but that area includes Kansas, where most of the 397,000-acre Anderson Creek Fire occurred in March.

If you disregard that one huge fire, the total in the U.S. is a lot closer to average. But I’m guessing that some politician somewhere is going to take that 46-percent-above-normal figure and run with it.
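For readers who want to check the arithmetic, here is the calculation using the figures above. As a rough illustration, the Anderson Creek Fire is simply subtracted from the 2016 total while the 10-year average is left unchanged, which overstates the remaining excess slightly:

```python
total_2016 = 1_551_474       # acres burned to date, May 20, 2016
avg_10yr = 1_063_835         # 10-year to-date average
anderson_creek = 397_000     # approximate size of the March fire in Kansas

# Percent above the 10-year average, as reported.
pct_above = 100 * (total_2016 - avg_10yr) / avg_10yr

# Percent above average if the one huge fire is disregarded.
pct_above_excl = 100 * (total_2016 - anderson_creek - avg_10yr) / avg_10yr
```

The first figure comes out to about 46 percent; with the Anderson Creek Fire removed, the excess drops to roughly 8.5 percent, which is what "a lot closer to average" means here.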

These numbers do not include prescribed fires. As of May 20 almost 2 million acres in the U.S. have been visited by prescribed fire.

In February we posted some statistics showing that historically there is a large spike in wildfire activity in March and April in Kansas. The spring is also a time when many, many ranchers conduct prescribed fires in the Flint Hills of Kansas and Oklahoma. This year between February 27 and May 5, 2.7 million acres were treated with prescribed fire.

Referring to the bar graph below, and throwing out the two busiest and the two slowest data points, in a typical year land managers in the Flint Hills burn between 1.1 million and 2.8 million acres.
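The throw-out-the-extremes calculation above can be sketched as follows. The yearly totals here are hypothetical placeholders, not the actual Flint Hills data from the bar graph:

```python
def typical_range(acres_by_year, trim=2):
    """Drop the `trim` busiest and `trim` slowest years, return (low, high)."""
    trimmed = sorted(acres_by_year)[trim:-trim]
    return trimmed[0], trimmed[-1]

# Hypothetical yearly totals in millions of acres -- illustration only.
acres = [0.4, 0.8, 1.1, 1.5, 2.0, 2.3, 2.8, 3.3, 3.9]
low, high = typical_range(acres)
# low and high bracket the "typical year" after excluding the extremes.
```

Trimming the extremes this way is a simple guard against a single exceptional year skewing the notion of a typical one.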

We thank Eric Ward of the Kansas Forest Service for providing these graphics compiled by the Kansas Department of Health and Environment.

The Texas Forest Service released this graphic showing that more acres have burned this year compared to the same period in 2015. The brown (or orange) bars and line are acres burned in 2016, while blue represents 2015.

The University of Wyoming has issued a publication about the patterns, influences, and effects of wildland fire in the state.

The University of Wyoming paper covers basic facts about fire, weather, intensity, severity, prescribed burning, as well as fire effects and interactions with soils, plants, livestock, wildlife, and bark beetle outbreaks. The document is 16 pages long with an additional 8 pages of references and a glossary. It was written by Derek Scasta, Assistant Professor and Extension Rangeland Specialist.

A couple of items attracted my attention. One is the graphic at the top of this article, the mean fire return interval for Wyoming. If you’re familiar with the geography of an area, data like this can absorb your interest for a while. The map appears to be a section taken out of the whole country map.

Another topic covered in the publication is the relationship between precipitation and acres burned.

The chart above from the paper compares total Wyoming statewide annual precipitation with the total number of acres burned in wildfires each year. We have been thinking that the weather in the summer has a greater effect on acres burned than weather throughout the entire year. Those weather factors include temperature, relative humidity, wind, precipitation, and a few others used by the National Fire Danger Rating System. It’s beyond our capacity to analyze all of them, unless we use an index that takes multiple parameters into account, such as the Burning Index or the Energy Release Component.

But what we did (immediately above) was to take one weather parameter from the summer and plot it on a chart similar to the UofW chart: the average monthly precipitation for June, July, and August of each year. The weather data came from NOAA, and the acres burned were extracted from the University of Wyoming chart.
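One simple way to quantify this kind of relationship, given paired yearly values, is a correlation coefficient. The precipitation and acres-burned figures below are hypothetical placeholders, not the actual Wyoming or NOAA data:

```python
import math
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical example: summer precipitation (inches/month) vs. acres burned.
precip = [1.2, 0.8, 1.5, 0.6, 1.1]
acres_burned = [120_000, 310_000, 60_000, 450_000, 150_000]
r = pearson_r(precip, acres_burned)
# A value near -1 would indicate that wetter summers line up with fewer
# acres burned; a value near 0 would indicate little linear relationship.
```

If wetter summers really do suppress acres burned, the coefficient should come out strongly negative, which is the pattern the charts above are meant to reveal visually.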

Included among the disclaimers is that average precipitation across the state does not apply to every square mile. Thunderstorms in the summer could be hammering one area, while a major fire is burning somewhere else. And, using only precipitation does not take into account temperature, relative humidity, and wind, which are all very important.

If anyone is interested in analyzing the Wyoming fire occurrence data using another weather factor or NFDRS index (from the summer months), below are the numbers I used. Or, if you’d like to look at another state or geographic area, that would be fine. It’s important to analyze the acres burned and the weather observations for a large area in order to get a sample of sufficient size to make it statistically significant. For example, use 15 to 20 years of information from a large national forest with multiple weather stations to reduce the data-skewing impact of a gully-washer thunderstorm at one location.