Precipitation Time Scales

This maproom presents an approximate decomposition, by time scale, of precipitation
variations during the second half of the twentieth century.

Three scales are defined, denoted "trend", "decadal" and "interannual". These correspond
loosely to secular variation due to anthropogenic influence and the low- and high-frequency
components of natural variability (variability intrinsic to the climate system), respectively.

The divide between decadal and interannual scales corresponds to a period of 10 years,
so that variability due to the El Niño-Southern Oscillation (ENSO) falls into the
interannual category, while variability on time scales of 10 years or longer is classified
as decadal. The procedures used to separate these signal components, as well as some
cautionary notes regarding their interpretation, are discussed in an accompanying
EOS article and in a more detailed reference document.

A range of analysis and display options is available: the user may define a season
and a seasonal variable of interest, in which case the decomposition will be performed
on that variable for the corresponding season. Results may be displayed either as a
map, or as time series, in the latter case at an individual gridpoint or averaged
over a user-selectable area. Maps may display either the standard deviation or the
percent of variance in the raw data explained by variability on the selected time
scale.

Seasonal Options

Years and Season:
Specify the range of years over which to perform the analysis, and choose the start
and end dates of the season over which the diagnostics are to be performed.

Wet/Dry Day/Spell Definitions:
These define the amount in millimeters (non-inclusive) above which a day is considered
wet (as opposed to dry), and the minimum number (inclusive) of consecutive wet (dry)
days that constitutes a wet (dry) spell.

Seasonal daily statistics: Choose the seasonal diagnostic quantity (i.e., the statistic
of the daily data) to be computed for each season, from the following choices.

Total Rainfall: total cumulative precipitation (in mm) falling over the season.

Number of Wet Days: the number of wet days (as defined above) during the season.

Rainfall Intensity:
the average daily precipitation over the season, considering wet days only.

Number of Wet (Dry) Spells:
the number of wet (dry) spells during the season, according to the definitions in the
Options section. For example, if a wet spell is defined as at least 5 contiguous wet
days, a run of 10 contiguous wet days is counted as 1 wet spell. Note that a spell
overlapping the start or end of the season is counted only if the part of the spell
that falls within the season reaches the minimum number of consecutive days.
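
These counting rules are straightforward to express in code. The following is a minimal
sketch, not the maproom's actual implementation; the function name, default threshold
and minimum spell length are illustrative only, and one season of daily rainfall is
assumed to be supplied as a one-dimensional NumPy array.

    import numpy as np

    def seasonal_stats(rain_mm, wet_threshold=1.0, min_spell_days=5):
        # A day is wet if rainfall strictly exceeds the threshold
        # (the threshold amount itself is non-inclusive).
        wet = rain_mm > wet_threshold
        total = float(rain_mm.sum())        # Total Rainfall (mm)
        n_wet = int(wet.sum())              # Number of Wet Days
        # Rainfall Intensity: mean rainfall over wet days only.
        intensity = float(rain_mm[wet].mean()) if n_wet else 0.0
        # Number of Wet Spells: runs of at least min_spell_days
        # consecutive wet days; a longer run still counts as one spell.
        padded = np.concatenate(([False], wet, [False]))
        starts = np.flatnonzero(padded[1:] & ~padded[:-1])
        ends = np.flatnonzero(~padded[1:] & padded[:-1])
        n_spells = int(np.sum((ends - starts) >= min_spell_days))
        return {"total": total, "wet_days": n_wet,
                "intensity": intensity, "wet_spells": n_spells}

Because only the days within the season are passed in, a spell straddling a season
boundary is counted only if its within-season portion reaches the minimum length,
consistent with the rule above.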

Documentation

The Time Scales maproom

Although the decomposition of a signal into trend, low- and high-frequency components
may seem straightforward, the analysis presented involves a number of subtleties.
This document provides a more detailed look at the analytical procedures utilized
than does the overview presented in Greene et al. (2011), and offers a number of caveats
regarding the interpretation of maproom displays.

Method

Data processing consists of two steps: detrending, to extract slow, trend-like changes,
and filtering, to separate the high- and low-frequency components of the detrended data.
Each of these steps is described below. Data are processed gridbox by gridbox, meaning
that results in adjacent gridboxes are not compared or combined, except when the user
requests that the analysis be performed on area-averaged data. Averaging over gridboxes
is then performed prior to the time scale decomposition.
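
When area-averaged analysis is requested, the averaging itself is simple in principle.
The maproom's exact weighting scheme is not described here; the sketch below assumes
the common convention of weighting each gridbox by the cosine of its latitude, with
NaNs marking missing gridboxes.

    import numpy as np

    def area_average(field, lats_deg):
        # field: array of shape (time, lat, lon); lats_deg: 1-D latitudes.
        # Weight by cos(latitude) to account for shrinking gridbox area
        # toward the poles.
        w = np.cos(np.deg2rad(lats_deg))[None, :, None]
        w = np.broadcast_to(w, field.shape)
        valid = ~np.isnan(field)
        return (np.nansum(field * w, axis=(1, 2))
                / np.sum(np.where(valid, w, 0.0), axis=(1, 2)))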

The trend component

Trends are often computed in the time domain, in which case they might be expressed,
for example, as a change of so many millimeters per month occurring per decade. The
common procedure of fitting a linear trend assumes that such a rate of change is constant
with time.

The maproom takes a different approach, based on a simple conceptual model: rather
than expressing local or regional trends as functions of time, we relate them instead
to global temperature change. The assumption is not that precipitation (or temperature)
changes simply as a result of the passage of time, but rather, because of the warming
of the planet. It is in this sense that the trend component, as computed in the maproom,
can be identified with "climate change." Such a trend has a functional, rather than
simply a numerical, significance.

Computation of the global temperature record to be used as a regressand is less simple
than it sounds. Fluctuations in the Earth's climate have many sources, including "natural"
variability — intrinsic variations that are not associated with anthropogenically-induced
climate change. Such variations, if large enough in scale, can significantly influence,
or "project" onto the global mean temperature. If we take the latter to represent
in some sense the signature of climate change, there is a risk that we will unintentionally
include some component of natural variability, which will then mistakenly be identified
with this signature.

To circumvent this problem the global temperature signal is computed using an ensemble
of general circulation models (GCMs). These models, which constitute a comprehensive
representation of our current understanding of the mechanisms of climate variability
and change, underlie much of the Fourth Assessment Report of the Intergovernmental
Panel on Climate Change (IPCC, 2007). Simulations from the "Twentieth Century Climate
in Coupled Models" (20C3M) experiment are used.

As with the real Earth, climate in each of these simulations includes both a "forced"
response (what we think of as climate change) and natural, "unforced" variations.
However, the unforced variability is incoherent from model to model: there is no
synchronization or phase relationship among the models, and indeed, the character of
each model's unforced variability differs to a greater or lesser degree from that of
the others. To obtain an estimate of the forced response, we average together the
20th-century global mean temperature records from the members of this ensemble, which
here includes 23 (nearly all) of the IPCC GCMs. Averaging attenuates the incoherent
(i.e., uncorrelated) unforced variability while enhancing the part of the response
that the models have in common, namely the climate change signal. Multimodel averaging
can thus be said to increase the signal-to-noise ratio relative to that of the
individual simulations, where "signal" refers to the common climate change response
and "noise" to the unforced natural variability. Most of the models provide multiple
20C3M simulations; to put the models on an even footing, a single simulation from
each model is used to create the multimodel average.
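
A toy calculation illustrates the signal-to-noise argument; the numbers below are
invented purely for illustration and have no connection to the actual GCM output.

    import numpy as np

    rng = np.random.default_rng(0)
    n_models, n_years = 23, 100
    t = np.arange(n_years)
    forced = 0.008 * t                # a common warming trend (deg C)
    # Each "model" adds its own, mutually uncorrelated unforced noise.
    runs = forced + 0.15 * rng.standard_normal((n_models, n_years))

    ensemble_mean = runs.mean(axis=0)
    print(np.std(runs[0] - forced))        # noise in a single run: ~0.15
    print(np.std(ensemble_mean - forced))  # ~0.15 / sqrt(23), about 0.03

The common signal survives the averaging unchanged, while the uncorrelated noise is
attenuated by roughly the square root of the number of models.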

The multimodel mean signal is further processed, by lowpass filtering. This has the
effect of removing most of the residual year-to-year and decade-to-decade variability
that has not been averaged away in the formation of the multimodel mean. The resulting
smoothed global temperature signal, which serves as the signature of the forced climate
change response, is shown in Fig. 1. Downward "bumps" in this signal in the 1900s,
1960s and early 1990s can probably be attributed, at least in part, to major volcanic
eruptions, which have a short-term cooling effect; to the extent that these variations
are expressed in regional signals they will be recognized as part of the forced response.
Although the forcing is not anthropogenic in this case, it is nevertheless considered
"external" as far as the maproom is concerned: Volcanic eruptions are not believed
to be associated, at least in any easily demonstrable way, with natural climate variability,
so it was deemed incorrect to treat them as such.

Figure 1: The global mean "climate change" temperature record used for detrending.

The trend component of a local temperature or precipitation signal is extracted by
regressing the local series on the global temperature signal of Fig 1. Fitted values
from the regression represent, by construction, that part of the regional signal which
is linearly dependent on global mean temperature. It is in this sense that the trend,
as here computed, may be thought of as the climate change component of the regional
signal.
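
In code, this step is an ordinary least-squares regression of the local series on the
global signal. A minimal sketch, assuming both series are NumPy arrays on the same
yearly grid; the function name is illustrative.

    import numpy as np

    def detrend_on_global_T(local, global_T):
        # Fit local ~ a * global_T + b by least squares.
        a, b = np.polyfit(global_T, local, deg=1)
        trend = a * global_T + b      # the "climate change" component
        residual = local - trend      # natural, unforced variability
        return trend, residual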

It is worth noting that for a local or regional signal that is being
analyzed in the maproom, the entanglement of forced and natural
components is still possible. This is because, while the signal of
Fig. 1 has been effectively stripped of natural internal variability,
a real-world signal may still contain natural components that are
"masquerading" as trend. This might happen, for example, if some
natural mode of variability were to be increasing over a relatively
long time period, say the last 30 years of the 20th
century. In such a case this mode might motivate a similar increase in
values of the regional series being analyzed, which then "maps" onto
the global mean temperature increase shown in Fig. 1. The Atlantic
Multidecadal Oscillation (AMO, see, e.g., Enfield, 2001) exhibits a
signal something like this; to the extent that the AMO influences
local climate, there may be some possibility for this sort of
misidentification to occur (see, e.g., DelSole et al., 2011). In
general, and for the more approximative type of assessment for which
the maproom is designed, we do not believe that such entanglement will
pose a major problem with interpretation.

Figure 2a illustrates the detrending step, as applied to a typical
precipitation record obtained from the maproom. Note that the inferred
trend is negative, and appears as a shifted, scaled inversion of the
signal shown in Fig. 1. The inverse characteristic results from the
fitting of a downward-trending regional signal; the fact that the
inferred trend is a scaled, shifted version of the signal of Fig. 1 is
a characteristic of the linear regression. Recall, finally, that the
inferred trend represents a regression on global
mean temperature; this explains its nonlinearity in the time
domain.

Figure 2: Stages of the
maproom decomposition process. (a) Trend component, represented by
the fitted values in a regression of the local signal onto the
multimodel mean temperature record of Fig. 1; (b) Residual signal
from this regression and its lowpass-filtered counterpart, the latter
identified with the decadal component of variability; (c) Interannual
component, which is the residual signal in (b), from which the
decadal component has been subtracted.

Decadal and interannual components

If the fitted values from the regression onto the global multimodel
mean temperature record of Fig. 1 are taken as the "climate change"
trend, the residuals from this regression then represent the natural,
unforced component of variability. The next step in the analysis aims
to decompose this residual signal into "decadal" and "interannual"
signals, representing respectively the low- and high-frequency
components of natural variability.

To do this, the residuals are lowpass filtered, using an
order-five Butterworth filter with half-power at a period of 10
years. Although the Butterworth design has some desirable properties
that make it well suited to this task, any number of alternative
filtering procedures could also have been used; testing indicates that
results are not sensitive to the filter details. Filter parameters
were chosen (a) so as to effect a clean separation between low- and
high-frequency components without introducing instability in the
filter response (this refers to the filter order) and (b) to
effectively classify variability due to El Niño-Southern
Oscillation (ENSO) as "interannual." With the order-five filter,
covariation between the two components generally amounts to no more
than a few percent of the variance of the initial "raw" series. (In
the real world a "perfect" separation of time scales is not
achievable; all practical filter designs represent compromises in this
regard.)
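
A sketch of this filtering step using SciPy follows. The maproom's exact
implementation, in particular whether the filter is applied in a single pass or in
zero-phase (forward-backward) fashion, and how the series endpoints are handled, is
not specified here, so the details below are one plausible choice rather than a
definitive recipe.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def split_decadal_interannual(residual, cutoff_years=10.0, order=5):
        # Annual sampling: the Nyquist frequency is 0.5 cycles/year, so a
        # 10-year cutoff is 0.1 cycles/year, i.e., 0.2 as a fraction of
        # Nyquist.
        wn = (1.0 / cutoff_years) / 0.5
        b, a = butter(order, wn, btype="low")
        # filtfilt runs the filter forward and backward for zero phase
        # shift; this squares the magnitude response, so the effective
        # half-power period differs slightly from the nominal design value.
        decadal = filtfilt(b, a, residual)
        interannual = residual - decadal
        return decadal, interannual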

ENSO exhibits a broad spectral peak in the 2-8 year band. Phenomena
responsible for variability on longer time scales belong to a class of
processes that are less well understood, and whose predictability is
currently the subject of active research (see, e.g., Meehl et
al. 2009). This "low-frequency" class includes large-scale modes such
as the Pacific Decadal Oscillation (PDO) and Atlantic Multidecadal
Oscillation (AMO), as well as low-frequency stochastic
variations. Thus the filtering effectively partitions variability by
process class, not simply by nominal time scale.

This second stage in the decomposition is illustrated in Fig. 2b,
which shows in black the "natural" residual from the detrending
operation of Fig. 2a (i.e., the raw initial signal minus the trend
component). Superimposed on this is the turquoise "decadal" signal, which
represents the output of the lowpass filter, applied to the natural
residual.

Finally, the interannual component is computed as the difference
between the black and turquoise traces in Fig. 2b, i.e., the residual from
the detrending step minus its lowpassed incarnation. Shown in Fig. 2c,
this signal represents the part of natural variability expressed at
periods shorter than ten years. The trend (red), decadal (turquoise)
and interannual (blue) signals are displayed in the maproom when the
user either clicks at a point or chooses "area average," the latter in
order to display the time scale decomposition as applied to an
area-averaged signal.

A few things to be aware of in using the maproom

As noted above, detrending via regression on the multimodel temperature record represents only an approximate
separation of forced and natural variability; rigorously performed, such separation
is not a simple exercise. Depending on timing and period, natural fluctuations in
the data could project onto the multimodel mean temperature signal and be incorrectly
identified with the forced response. The potential for such entanglement decreases
as the length of record increases, and vice versa.

We recommend the analysis of area-averaged data if possible, for a number of reasons.
First, data at an individual gridpoint are likely to be noisy. For example, a couple
of wet years near one terminus of the series could create the impression of a trend,
even though there is no overall trend in the series. Even if the gridpoint passes
the intermediate screening criteria, a number of values may
have been filled, with potentially misleading effects. These problems will generally
be ameliorated to some degree by area-averaging, for the same reason that multimodel
averaging increases the signal-to-noise ratio of the resultant series: Anomalous events
will be less likely to occur in many locations at once, so their effects will be reduced.

The maproom shows the percent of variance in the raw data that is explained by each
of the three components. The alert user will notice that these percentages often sum
to less than one hundred percent. This is because the filtering procedure is necessarily
imperfect, and does not completely separate the decadal and interannual components
of the detrended series, leading to a degree of covariance between them. (This is
the filtering problem known as "leakage.") The value for this covariance is also provided.
Twice the covariance value, added to the trend, decadal and interannual variance
percentages, should sum to one hundred percent, give or take a percent for rounding error.
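
This budget can be checked directly from the three component series. A minimal sketch;
the variances and the covariance are computed with the same population normalization
so that the identity holds up to numerical precision.

    import numpy as np

    def variance_budget(trend, decadal, interannual):
        # Because OLS fitted values are orthogonal to their residuals, the
        # trend's cross-terms vanish and only the decadal/interannual
        # covariance survives; the four fractions then sum to ~1.
        raw = trend + decadal + interannual
        v = np.var(raw)
        cross = 2.0 * np.cov(decadal, interannual, bias=True)[0, 1]
        return {"trend": np.var(trend) / v,
                "decadal": np.var(decadal) / v,
                "interannual": np.var(interannual) / v,
                "2*cov(dec,int)": cross / v}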

After detrending, about 20% of the variance of annually-resolved white noise would be expected to
accrue to the decadal component, as here defined. White noise is a random process
having no "memory," in the sense that its value at a particular time does not exhibit
any dependence on its values at other previous times. This differs from processes
having memory or "persistence," in which the process level is dependent on previous
values (such processes tend to vary more slowly than white noise). Thus, a decadal
variance fraction of as much as 20% (meaning the decadal fraction divided by the sum
of decadal and interannual fractions) should not be mistaken for the signature of
a systematic decadal "oscillation," or even a slow random process, that differs from
white noise.
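
The white-noise expectation is easy to verify by Monte Carlo simulation, filtering
synthetic noise with the same order-five Butterworth design described above; the
record length and number of trials below are arbitrary.

    import numpy as np
    from scipy.signal import butter, filtfilt

    rng = np.random.default_rng(1)
    b, a = butter(5, (1.0 / 10.0) / 0.5, btype="low")

    fracs = []
    for _ in range(1000):
        x = rng.standard_normal(50)      # 50 "years" of white noise
        decadal = filtfilt(b, a, x)
        interannual = x - decadal
        fracs.append(np.var(decadal)
                     / (np.var(decadal) + np.var(interannual)))
    print(np.mean(fracs))                # on the order of 0.2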

Although the variance decomposition, or other aspects of the maproom plots, may provide
some sense of future expectations (in the statistical sense), the maproom is primarily
a means of deconstructing past variations, not a predictive tool. In particular, it
cannot tell us how the character of variability may change in response to global warming
and thus how the decomposition of variance might evolve with the passage of time.

Dataset Documentation

Asian precipitation products created by the Research Institute for Humanity and Nature
and the Meteorological Research Institute of the Japan Meteorological Agency.
Data Source:
Asian precipitation from APHRODITE