The World Data Center (WDC)-A for Glaciology in Boulder, Colorado, is one of three international data centers serving the discipline. WDC-A activities are under the general guidance of the National Academy of Sciences and the Panel on World Data Centers of the International Council of Scientific Unions (ICSU). The other two centers are WDC-B in Moscow, USSR, and WDC-C in Cambridge, England. The major purpose of these data centers is to facilitate the international exchange of data on all forms of snow and ice. The topics covered include: Avalanche Research, Freshwater Ice, Glacier Fluctuations and Mass Balance, Ground Ice or Permafrost, Paleoglaciology and Ice Core Data, Polar Ice Sheets, Sea Ice, and Seasonal Snow Cover. The WDC is co-located with the National Snow and Ice Data Center (NSIDC), which was established by NOAA in 1982. WDC/NSIDC is operated under a contractual agreement through CIRES, University of Colorado, in association with the National Geophysical Data Center of the National Oceanic and Atmospheric Administration (NOAA).

WDC/NSIDC serves as an archive for published material on snow and ice topics as well as for digital and hardcopy data sets. It has an automated system for the storage and retrieval of bibliographic data and can provide copies of research papers and data on request. Computerized literature searches on topics relating to snow and ice are also prepared. The Center publishes Glaciological Data, a report series, generally twice a year. Issues typically focus on a single topic and include specialized bibliographies, inventories, and survey reports; invited articles on glaciological data sets, data collection and storage, methodology, and terminology; and proceedings of meetings and workshops. For example, Glaciological Data Reports No. 1 (1977) and No. 16 (1984) contained avalanche bibliographies.

Snow Cover Data

Snow cover in the temperate regions of the world is one of the most important renewable resources. The use of snow for winter recreation, for irrigation and municipal water supplies, for the generation of hydroelectric power, and for soil moisture replenishment is well established. Snow cover is an important component of the surface hydrologic cycle and of the climate system. The presence or absence of snow cover has great potential to modify atmospheric heating through wide variations in radiative, conductive, and latent heat fluxes. Due to its high albedo, a snow-covered region absorbs only about one-sixth as much solar energy as a snow-free surface, and it re-radiates at terrestrial wavelengths at a rate which typically exceeds that of snow-free areas. Any decrease of snow cover resulting from a warming trend leads to increased absorption of solar radiation and thus additional heat available to melt more snow. This is the classic albedo feedback mechanism, which is included in nearly all global circulation and climate change models. In addition, because of the large amount of latent heat required for phase change, the mass of the snow cover is the primary factor controlling how heat is consumed at the surface during spring and summer. Thus, snow cover exerts a significant influence on the heat balance of the earth and acts as a major source of thermal inertia within the total climate system, taking in and releasing large amounts of energy with little or no significant fluctuation in temperature.

Snow cover data are required over a wide variety of regions, time periods, and observation frequencies. The actual parameter may be snow extent, depth, or water equivalent. A wide variety of applications result in requests for numerous formats. The purpose of a request may range from applications in horticulture to manufacturing and marketing of snow removal equipment. The following are a few examples:

Duration: Dates of the first and last snowfall.
Dates of the beginning and end of continuous snow cover.

Frequency: Number of days with snowfall, or snow depth, greater than a specified amount.
Percent of time that the ground is snow covered.

Climatology: Mean/Maximum/Minimum for depth or water equivalent by day, week, month, year, or period of record.
Probability of snow on the ground, or probability of snowfall, for a given date.
Percent of time wet versus dry snow is present.
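Most of these duration, frequency, and climatology statistics can be derived directly from a station's daily depth record. The following is a minimal sketch in Python; the fifteen-day record and the 2.5 cm cover threshold are invented purely for illustration:

```python
from datetime import date, timedelta

# Hypothetical daily snow-depth record (cm) for part of one winter at a single
# station; the values and the 2.5 cm cover threshold are illustrative only.
start = date(1988, 11, 1)
depths = [0, 0, 5, 12, 18, 20, 20, 15, 10, 8, 0, 0, 3, 7, 0]
days = [start + timedelta(days=i) for i in range(len(depths))]

threshold = 2.5  # minimum depth (cm) counted as "snow covered"

# Frequency: number of days with snow depth greater than a specified amount
days_covered = sum(1 for d in depths if d > threshold)

# Duration: first and last dates on which snow cover was present
covered_dates = [day for day, d in zip(days, depths) if d > threshold]
first_cover, last_cover = covered_dates[0], covered_dates[-1]

# Percent of time that the ground is snow covered
pct_covered = 100.0 * days_covered / len(depths)

# Climatology: mean and maximum depth over the period of record
mean_depth = sum(depths) / len(depths)
max_depth = max(depths)
```

The same loop structure extends naturally to snowfall events, water equivalent, or multi-year probabilities for a given date once longer records are available.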

Fundamental Data Collection Techniques and Snow Research

In applied research the investigator typically utilizes what is known as the classical "scientific method" which includes four basic steps: 1) Identification of a problem, 2) Collection of relevant data, 3) Formulation of the hypothesis, 4) Testing of the hypothesis empirically by experiment. When it is known "a priori" just what constitutes "relevant data" it is possible to institute some amount of testing early in the investigation and to continue testing as data collection continues. However, this may not always be the case in the "real world". In the effort to establish relationships within complex natural systems, it is not always obvious which data are most important. While some general picture of cause and effect may be apparent, the quantitative physical process is not so clear. In other words, little, if anything, in the system is proportional to anything else, yet we know that everything is coupled to everything else.

On the global scale this type of dilemma is common. An example would be the attempt to characterize the possible relationship between sea surface temperatures in a tropical location and the precipitation patterns in a mid-latitude mountain area. An equally appropriate example of a complex system at the local scale is the avalanche phenomenon. Even where physical explanations were lacking, the key to success has been the systematic, consistent collection of comprehensive and standard data sets. Intuitive approaches to avalanche forecasting have relied heavily on the collection of data regarding specific "contributory factors". Complex statistical and process-oriented models required even more precise data measured at frequent time intervals. Although a significant amount of numerical model development was undertaken during the 1970s (Foehn et al., 1977; Buser et al., 1984), the resulting techniques rarely made it out of the academic or research communities to be used by the operational forecaster. This may have been because the results of the forecasts were presented as a probability value and the forecaster was often uncertain how this value should be applied. In addition, the forecaster may not always be comfortable with a methodology in which he or she does not have a complete understanding of how the model makes use of the data to produce a forecast. In the 1980s, another forecast tool known as the "nearest-neighbor" method was developed (Buser, 1983). The basic approach of the nearest neighbor method parallels that used by a forecaster employing the conventional or intuitive method (LaChapelle, 1980). The attempt is made, now assisted by the computer's memory, to identify a selected set of days in the past which had weather and snow cover conditions similar to today's and which therefore could be expected to have produced avalanche conditions similar to today's. Since this process resembles the human thought process, and because no major assumptions are made in the method, the nearest neighbor approach has been generally better received by the operational community.
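As a rough illustration of the nearest-neighbor idea, the sketch below ranks past days by a scaled Euclidean distance over a few weather variables and reports the fraction of the closest matches on which avalanches were observed. The variables, scaling factors, and data are invented for illustration and are not Buser's (1983) actual formulation:

```python
import math

# Hypothetical historical record: each entry is (new_snow_cm, wind_mps,
# air_temp_C, avalanche_observed). Values are invented for illustration.
history = [
    (30.0, 12.0, -8.0, True),
    (25.0, 10.0, -6.0, True),
    ( 2.0,  3.0, -2.0, False),
    ( 0.0,  5.0, -1.0, False),
    (18.0,  9.0, -5.0, True),
    ( 5.0,  4.0, -3.0, False),
]

def distance(a, b, scales=(30.0, 15.0, 10.0)):
    """Euclidean distance between two days after scaling each variable
    to a comparable magnitude (an assumed, illustrative scaling)."""
    return math.sqrt(sum(((x - y) / s) ** 2 for x, y, s in zip(a, b, scales)))

def nearest_neighbors(today, k=3):
    """Return the k past days most similar to today's conditions."""
    return sorted(history, key=lambda day: distance(day[:3], today))[:k]

today = (22.0, 11.0, -7.0)          # today's observed conditions
neighbors = nearest_neighbors(today)
avalanche_fraction = sum(1 for day in neighbors if day[3]) / len(neighbors)
```

The fraction of matching neighbors serves only as a crude index; in practice the forecaster would also inspect the retrieved days themselves, which is precisely what makes the method transparent to the operational community.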

Of course, the success of statistical methods, including the nearest neighbor approach, depends greatly on the length of the data record. A nearest neighbor model would really only begin to be of significant value once at least 5 to 10 years of data were available for a given location. Therefore, locations which had consistently collected standard and comprehensive data stood to benefit from the nearest neighbor approach as soon as it was developed while other locations were required to wait until an adequate data set had been developed.

New Trends in Data Collection

Because of the increased focus on studies which strive to view the earth as a complete and linked system, the scale for studies in environmental sciences has been increasing in extent over the past few decades. Focus on point measurements meant to represent local study areas has been changing to a more regional to continental scale. The analogy in snow and avalanche studies is a familiar one with the development of regional scale avalanche information centers such as those in Colorado, Washington, Utah, and Montana. The more recent move to view processes on a hemispheric to global scale has not really had an analogy in avalanche studies specifically, but snow cover in general provides an obvious example.

In most cases, when a data collection area increases significantly in size, some degree of compromise is necessary. We simply cannot always obtain the necessary data at the ideal spatial resolution. When the area of study is reasonably small, on the order of 1.0 to 10.0 km2 for example, an initial over-sampling may provide the background information for the later application of a less dense operational data network. For example, on Blue Glacier in Washington state (LaChapelle, 1965; Armstrong, 1989) a network of only 12 to 21 ablation stakes was used for nearly three decades to monitor ablation over the entire glacier surface (5.3 km2). The extent to which this relatively sparse measurement network was representative of the true spatial variation in ablation was determined during several periods when a network with about ten times this density was installed.

As scales get even larger, this opportunity to oversample to estimate measurement errors is no longer practical. For example, a current project at NSIDC involves developing a technique for mapping global snow cover on a daily basis. The output of the mapping technique has a resolution of 40 km. While this is a rather coarse resolution, at global coverage it represents about 115,000 grid points just for those Northern Hemisphere land surface areas where snowfall might be expected. The surface observations which provide input to the model come from the World Meteorological Organization (WMO). Over the globe there are actually many fewer surface observations than grid points, about 5,500 for the Northern Hemisphere. In studies at this scale the opportunity for testing by oversampling simply is not available. As would be expected, surface station density varies greatly over the globe, with more frequent observations in the areas of highest population. Therefore, the technique used to interpolate from observation station to grid point depends on station density. For areas of highest station density, with at least one station within 50 km of the grid point, a technique of inverse distance weighting is used. In this case the influence of a reporting station on a given grid point diminishes with both horizontal distance and elevation change. In regions with no station within 50 km but at least one within 100 km, the snow model continues to use surface data but adds a regional scale snow depth climatology to guide the interpolation scheme. This method is known as kriging, or the application of regionalized variables (Oliver et al., 1989), and the climatology used is the digital version of the product produced by Foster and Davy (1988). In regions with no station within 100 km, snow extent and depths are derived from satellite data using a method described later in this paper.
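The distance-dependent part of such a scheme can be sketched as a simple inverse distance weighting routine. The station data, search radius, and the particular way elevation difference is folded into the effective distance below are illustrative assumptions, not the operational NSIDC formulation:

```python
import math

# Hypothetical station reports: (x_km, y_km, elevation_m, snow_depth_cm).
stations = [
    (10.0, 20.0, 1500.0, 40.0),
    (35.0, 15.0, 2100.0, 75.0),
    (60.0, 45.0, 1800.0, 55.0),
]

def idw_depth(gx, gy, gz, radius_km=50.0, power=2.0, elev_scale=0.05):
    """Inverse-distance-weighted snow depth at a grid point (gx, gy in km,
    gz in m). Station influence diminishes with both horizontal distance
    and elevation difference; elev_scale (km of penalty per m of elevation
    difference) is an assumed value, not the operational formulation."""
    num = den = 0.0
    for sx, sy, sz, depth in stations:
        horiz = math.hypot(gx - sx, gy - sy)
        if horiz > radius_km:
            continue  # only stations within the search radius contribute
        d = max(horiz + elev_scale * abs(gz - sz), 1e-6)
        w = 1.0 / d ** power
        num += w * depth
        den += w
    return num / den if den else None  # None: no station close enough

estimate = idw_depth(30.0, 20.0, 1900.0)
```

When no station falls inside the radius the routine returns None, which corresponds to the regimes where the model falls back on climatology-guided interpolation or satellite-derived values.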

In global scale studies we rarely get all of the data we need at the necessary resolution and are thus forced to make the best of what is available. Various interpolation and extrapolation schemes are developed with all of the associated compromises and estimates of error. On a completely different spatial scale, we have experienced similar situations in the field of avalanche research. For example, we make detailed, comprehensive, and often overlapping measurements in carefully located level study sites while at the same time we measure little or nothing in a direct manner in the area of greatest interest, the avalanche starting zone. For example, during the winter of 1973-1974 the University of Colorado San Juan Avalanche Project (Armstrong et al., 1974) measured snowfall water equivalent, to the nearest millimeter, in the study site at Red Mountain Pass using five different methods ranging from conventional snow boards to isotopic radiation techniques. In contrast, we installed a few snow depth stakes in the avalanche starting zones which, when conditions permitted, were read from the highway below using spotting scopes to an accuracy of perhaps 10 cm at best. During storm conditions, however, we simply used the accumulated wisdom of conventional avalanche forecasting methods to "guesstimate" the amount and rate of precipitation in the starting zones.

We can find similar examples of the need to extrapolate to the true location of interest, or to the actual parameter we would like to measure. In one case, our specific desire was to learn something about differential creep rate on steep slopes as a function of load, temperature, and crystal type. Unable to design the necessary instrumentation to actually make such measurements, we returned again to the convenience of the level study site. Here we employed an electronic settlement gage (previously used, incidentally, at the study site at Alta) and proceeded to, at least, advance the state of knowledge regarding densification and strain rates in a level snow cover (Armstrong, 1980).

Data Collection by Satellite Remote Sensing: The Interdependency of Scales

Where environmental studies focus on hemispheric to global scale data to help analyze the interdependency and linked nature of earth systems, satellite remote sensing represents the primary data source. However, unlike the measurements described above in reference to avalanche studies, satellite data rarely correspond to the direct measurement of a specific environmental parameter. Satellite data represent values within some part of the electromagnetic spectrum, depending on the type of sensor, and the desired environmental information must then be derived from these values. For example, in the visible wavelengths (0.4-0.7 micrometers), the degree of surface reflectivity can be related to the presence or absence of snow cover (Dozier and Marks, 1987). Microwave data (wavelengths on the order of a few centimeters) are also being used to provide snow cover information (Rott, 1987). As noted above, we are not dealing initially with direct measurements of the parameter of interest, but we are measuring electromagnetic properties which are in some way related to physical properties.

For example, consider how this procedure works with microwave data. Microwave energy emitted from the surface of the earth is measured by a sensor on a polar-orbiting satellite. When snow covers the ground, some of the microwave energy is scattered by the snow grains and thus less energy is actually emitted from the snow surface. Because the amount of scattering is related to both the amount of snow and the specific wavelength, an algorithm can be developed which computes snow water equivalent or, assuming a density, snow depth. The current satellite carrying a passive microwave sensor is the Defense Meteorological Satellite Program (DMSP) Special Sensor Microwave Imager (SSM/I), which provides data in four separate channel frequencies (Weaver et al., 1987). Passive microwave remote sensing of snow has several advantages over the use of visible wavelengths. Data are available day or night and during virtually all weather conditions. One major drawback with visible wavelength data is the need for clear sky conditions; persistent cloud cover in winter, especially in mountainous areas, may limit data to only a few days per month. Coverage is global and frequent: at least daily for locations north of about 45 degrees north latitude, and once every two to three days for areas where snow might be expected south of 45 degrees.
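As an illustration of how such an algorithm reduces brightness temperatures to snow quantities, the sketch below uses the linear form widely attributed to Chang et al. (1987), relating the difference between the 18 and 37 GHz horizontally polarized channels to snow depth under fixed density and grain size assumptions. The coefficient is quoted from that literature, not from this paper:

```python
def chang_snow_depth_cm(tb18h, tb37h):
    """Snow depth (cm) from the 18 and 37 GHz horizontally polarized
    brightness temperatures (K), using the linear form widely attributed
    to Chang et al. (1987): depth = 1.59 * (Tb18H - Tb37H). The form
    assumes a fixed density (~0.30 g/cm^3) and a single mean grain radius
    (~0.3 mm); the coefficient is quoted from the literature."""
    dt = tb18h - tb37h
    if dt <= 0:
        return 0.0  # no scattering signal: bare ground or wet snow
    return 1.59 * dt

def swe_mm(depth_cm, density=0.30):
    """Snow water equivalent (mm) for a given depth, assuming a bulk
    snow density in g/cm^3 (here the same 0.30 assumed by the algorithm)."""
    return depth_cm * 10.0 * density
```

Note that a zero or negative channel difference is indistinguishable from bare ground, which anticipates the wet-snow limitation discussed below.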

Needless to say, there are disadvantages as well. The resolution is poor, more than 100,000 times coarser than, for example, LANDSAT. For the microwave measurements the resolution depends on the frequency channel and is approximately 30 km for those channels used in the snow cover algorithms. The use of passive microwave is limited to dry snow. Once liquid water is present on the grains, the snow surface becomes an emitter and "looks", to the sensor, like bare ground. This dramatic change in signal can be used to monitor the onset of melt on a regional scale, but as long as the snow is wet, its location and water equivalent cannot be determined. If the snow refreezes, it again becomes "visible" to the microwave sensor. Another problem results from the fact that there may be objects which are primarily emitters sticking out of the snow cover, such as dense coniferous forests or bare rock outcrops.

Finally, there are complications which result from the snow structure itself. The amount of microwave scattering depends on the size of the grains: the larger the grains, the greater the scattering. In theory, in order to make any practical use of a given algorithm one would have to know the snow grain size to within about 0.1 mm. This of course would not be possible in any practical application. However, when algorithms have been developed and tested, the results do not show this degree of sensitivity. For example, an algorithm which uses one assumed grain size provides reasonable results on the regional scale (Chang et al., 1987). Apparently, scattering is less sensitive to the shapes of natural snow grains than radiative transfer theory based on ideal spheres would suggest. In addition, it appears that only the mean grain size may be required, and not the layer-by-layer structure of the snow cover, which again would be impossible to provide on a regional basis. The mean grain size does appear to be meaningful because the standard deviation is typically small, usually less than the mean size. Although not as sensitive as theory might indicate, the snow algorithms do eventually begin to produce unreasonable values when the mean grain size of the snow cover exceeds about 2.0 mm in diameter (Hall et al., 1986). Here, then, is a situation where a global scale snow monitoring methodology is strongly affected by processes at the micro-scale. Our knowledge of snow metamorphism, developed over the past several decades, proves very useful for this problem. We know that in sub-freezing snow layers there is really only one way for the grains to grow larger than about 1.0 mm in diameter, and that is through the process of temperature-gradient or kinetic metamorphism (Colbeck, 1987).

Preliminary analysis appears to indicate that as long as the average grain size remains below about 1.0 to 2.0 mm, regional scale algorithms will produce reasonable results. This does not mean that there cannot be grains larger than this limit in the snow cover, just that the mean size must be less. For the average grain size to exceed 2.0 mm, temperature gradient metamorphism must dominate throughout the snow cover. In general this can occur under two types of climatic conditions. In the first case, the typical winter snow cover is shallow (less than approximately 0.5-1.0 m) and mean monthly temperatures are consistently less than -10.0 C, causing the development of depth hoar grains throughout the snow cover. An example of such a geographic location would be the interior of Alaska. The second general case occurs in a snow climate where mean monthly air temperatures are approximately -10.0 C, plus or minus a few degrees, but the snow cover typically reaches a depth exceeding 0.5-1.0 m by about mid-winter, thus preventing the temperature gradients necessary to cause depth hoar to form throughout the snow cover. However, when drought years occur in these locations, snow depths often do not exceed the 0.5-1.0 m range, and given average air temperatures the gradients are sufficient to convert the entire snow thickness to depth hoar. For example, this condition occurred in many regions of the Rocky Mountains during the winters of 1976-77 and 1980-81.
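The two climatic cases can be reduced to a simple screening rule. The function below flags conditions under which temperature-gradient metamorphism could dominate the full snow cover; the numerical thresholds are the approximate values quoted above, and the single 0.75 m depth cutoff is an illustrative simplification of the 0.5-1.0 m range:

```python
def depth_hoar_likely(mean_depth_m, mean_monthly_temp_c,
                      depth_limit_m=0.75, temp_limit_c=-10.0):
    """Flag conditions under which temperature-gradient metamorphism could
    dominate the whole snow cover: shallow snow combined with persistently
    cold mean monthly temperatures. The 0.75 m cutoff is an illustrative
    midpoint of the 0.5-1.0 m range; thresholds are approximate."""
    return mean_depth_m < depth_limit_m and mean_monthly_temp_c <= temp_limit_c
```

In a drought year a normally deep site drops below the depth cutoff and the flag trips, consistent with the Rocky Mountain winters of 1976-77 and 1980-81 noted above.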

Using a relationship between average snow depth for a given region and the potential to develop a snow cover which would primarily be composed of depth hoar grains larger than 2.0 mm, we are beginning to develop a method to adjust the microwave algorithms for grain size (Armstrong, 1990). The next step in refining this relationship will be to include regional scale temperature data. The example of spatial scales now comes full circle because the success of the above research in global-scale remote sensing will depend heavily on the extent to which the necessary snow structure data have been systematically collected and archived for various snow climate zones.
References

Armstrong, R.L., LaChapelle, E.R., Bovis, M.J., and Ives J.D. 1974. Development of methodology for evaluation and prediction of avalanche hazard in the San Juan Mountain area of southwestern Colorado. Occasional Paper No. 13, Institute of Arctic and Alpine Research, University of Colorado, Boulder: 141 p.