Geology's Anthropocene Debate

This article was originally published in the Britannica Book of the Year, an annual print publication that
provides an overview of the year’s most-notable people and events.
Unlike most articles on Britannica.com, Book of the Year articles are not reviewed and revised after their initial publication.
Rather, they are presented on the site as archival content, intended for historical reference only.

Though scientists had known for decades that humans placed enormous demands on Earth’s natural resources, geologists and other scholars pondered in 2015 whether that influence was so great that a new geologic interval, the Anthropocene, should be created. Much of the discussion focused on the advisability of adding a category to the international chronostratigraphic chart (the official chart of geologic time maintained by the International Commission on Stratigraphy [ICS]). If that change was made, geologists would also need to decide whether to add the Anthropocene as an epoch (on par with the Holocene Epoch [11,700 years ago to the present]) or as an age within the Holocene. In addition, the interval’s starting point would need to be set. The decision making was assigned to the Anthropocene Working Group (part of the ICS Subcommission on Quaternary Stratigraphy), which intended to present its findings and a recommendation to the ICS sometime in 2016.

(Image: Anthropocene Epoch. By the end of 2015, many scientists were arguing that the collective weight of human activities on Earth had altered various natural cycles on the planet and that a new epoch of geologic time, the Anthropocene Epoch, was under way; however, the starting point of this new geologic interval was still a matter of debate. Encyclopædia Britannica, Inc.)

The boundaries between different intervals in the chart of geologic time (e.g., ages, epochs, periods, and eras) represented sudden changes between or within different layers of rock. Nearly all of those changes were physical in nature (such as the deposition of a new layer from volcanism or sedimentation); however, many were accompanied by changes in the types and concentrations of fossils. Geologists and paleontologists interpreted those sudden changes in the fossil record as evidence of ecological upheavals, after which new species emerged. Both phenomena were excellent indicators for the establishment of boundaries between one interval and another.

Some of those boundaries, most notably those between the Permian Period and the Triassic Period and between the Cretaceous Period and the Paleogene Period, recorded mass extinctions—that is, extinction episodes in which vast numbers of species perished over the course of only a few million years, dramatically reducing Earth’s overall biodiversity. Throughout Earth’s history mass extinctions were caused by natural forces, such as climate disruptions, changes in ocean chemistry, widespread volcanism, or the sudden impact of asteroids or comets. One of the main reasons that scientists were considering the idea of adding the Anthropocene was that a mass extinction episode was under way. That episode was different, however, because the activities of just one species, Homo sapiens, were causing it.

Humans were changing the planet in other ways as well, notably by continuing to influence what happened on Earth’s surface, in Earth’s atmosphere and oceans, and in biogeochemical nutrient cycling. Incontrovertible evidence of humanity’s footprint across the globe first appeared during the “Great Acceleration,” a boom period that followed World War II and was characterized by exponential growth in the human population, fossil-fuel use, water use, food production, and international communication, as well as by the rapid pace of land-use conversion.

By 2015 humans had modified more than 50% of Earth’s ice-free land area, having turned much of it into farmland, pastureland, or urban land. The burning of wood and fossil fuels (e.g., coal, petroleum, and natural gas) to cook food, provide heat, and generate electrical power, as well as the production of cement for the concrete used in roads and buildings, caused the concentration of carbon dioxide (CO2) to rise in the atmosphere. (Atmospheric CO2 had been tracked directly since 1959, when the level stood at 316 parts per million by volume [ppmv]; by 2015 it had risen to 400 ppmv.) Rising CO2 contributed to the increase in Earth’s average near-surface air temperature, and climatologists believed that rising temperatures contributed to numerous other changes, including the loss of large amounts of sea ice in the Arctic Ocean and of a number of ice shelves surrounding the Antarctic Peninsula, the shrinking of mountain glaciers, and the increased frequency of extreme weather events. Other chemicals, such as lead, sulfur compounds, and chlorofluorocarbons, as well as radioactive isotopes from nuclear testing, also affected the atmosphere and the living things that depended on it.

Earth’s oceans, which also served as a carbon sink, were being altered by human activity. Much of the CO2 in the atmosphere was absorbed chemically into the oceans, where the compound was slowly making ocean waters more acidic. Humans also changed Earth’s water resources by building dams and diverting water from its natural courses, by drawing groundwater from freshwater aquifers faster than it could be replenished, and by contributing to the creation of large oxygen-depleted areas near the mouths of rivers through the runoff of fertilizers and organic waste products upstream.

Despite humanity’s large, multifaceted influence on the planet, formalizing the Anthropocene in the international chronostratigraphic chart depended on the notion that the effects of humans on Earth were substantial enough to eventually appear in rock strata. Because sedimentation and other geologic processes often took millions of years to play out, fossils of organisms driven extinct by human activity were scarce in the rock record, so some other change in the rock, perhaps one resulting from a chemical change to the atmosphere or the oceans, might be a more-appropriate indicator. Ideally, geologists and paleontologists hoped to identify a point in the rock profile (often the laying down of a new stratigraphic layer) that indicated the beginning of this worldwide human effect. Geologists would then mark the beginning of that layer with a “golden spike”—a Global Boundary Stratotype Section and Point (GSSP)—which denoted the official, internationally recognized boundary between the end of one interval and the beginning of another.

American biologist Eugene Stoermer, known for his work with diatoms (single-celled algal organisms), coined the term Anthropocene in the late 1980s, but Dutch Nobel Prize-winning chemist Paul Crutzen was credited with having introduced it into common parlance by mentioning it at a conference in 2000. It was not until 2008, however, that scientists actually suggested that the Anthropocene Epoch be included as a formal geologic interval.

Since that time interest in the possibility of making the Anthropocene official had extended beyond geology and other Earth sciences and into circles of scholars interested in anthropology, history, and the environmental sciences, as well as to the press and the public at large. Scientists and environmental writers noted that since its inception the word Anthropocene had come to mean something different to different groups of people. For some the word served as a term that embodied all of the world’s environmental problems, perhaps not unlike the way that the phrases “the ecological crisis” and “the environment” were used in the late 20th century to focus public and private energies and galvanize support throughout the world to solve looming environmental issues. Others worried, however, that some might use the word, because of its etymological similarity to the word anthropocentrism (the philosophy arguing that humans are the central or most-significant entities in the world), to justify the notion that humanity should continue to exploit the planet in any way it wished.

Nevertheless, the term and the construction of the geologic interval needed to make sense in geologic circles. While some geologists were developing plans for how to create an Anthropocene interval, critics wondered what benefit it would bring to the field. After all, the Holocene Epoch had been made official only in 2008, partly to recognize the emergence of the first human societies, a significant conceptual overlap with the Anthropocene. Other critics pointed out that the Anthropocene would create problems for scholarship, because many studies in the scientific literature that examined Holocene environments and strata would need to be reclassified as having examined Anthropocene ones. Last, some geologists wondered whether creating a new interval was too hasty, arguing that the real effects that humanity was having on the planet would not be fully known for hundreds of years.

If the Anthropocene was formalized in the future, the question remained: When would it begin? It could be the first geologic interval to start during the period of written human history. Most pro-Anthropocene scholars believed that the Anthropocene’s start date should coincide with one of several moments in which human activities began to guide the natural processes of the planet.

When Crutzen first discussed the idea of the Anthropocene, he had in mind as the start date the Industrial Revolution (specifically its onset in Europe in 1784, the year in which Scottish inventor James Watt patented an improved steam-engine design). Later other Anthropocene starting points were proposed, including the extinction of large Pleistocene mammals, such as the mammoth, approximately 14,000 years ago (before the onset of the Holocene Epoch), the expansion of agriculture (some 5,000 years ago), and the expansion of mining (some 3,000 years ago).

By 2015 other starting points had risen in prominence. The first, marked by a worldwide drop in atmospheric carbon dioxide concentrations that coincided with the deaths of some 50 million indigenous people between 1570 and 1620, may have been brought on by the Columbian Exchange, a period in which goods, ideas, and people moved between the Old World and the New World. The second, the period (1945–63) associated with aboveground nuclear weapons testing, appealed to many geologists, because residue from nuclear fallout produced during that time was spread across the globe. Because such residue was absent from rock layers laid down before 1945, that starting point offered an especially clear marker.

One benefit from the ongoing discussion was that it allowed people from several walks of life to examine the various impacts humans and their activities had had on the planet, perhaps providing additional opportunities to reevaluate nonsustainable lifestyles. The benefit to geology, the field that made the most use of the chronostratigraphic chart, was less clear, however.