The geologic record stretches back billions of years – nearly 4.4 billion, to the oldest known minerals on a planet that is itself 4.54 billion years old. It shows the evolution of life, the assembly and break-up of supercontinents, and changes in climate. It tells stories of mountains building and slowly eroding away, meteor impacts, mass extinctions, and miraculous survivals.
In the context of this record, humans came onto the scene what feels like yesterday. We’re a single species, evolved over the past few million years. How much influence could one species really exert, anyway?
Other than, perhaps, cyanobacteria (responsible for the rise of oxygen in the atmosphere), humans are responsible for the most dramatic species-driven alterations to Earth in its history. We dominate our environment so strongly that some scientists have begun using the term Anthropocene to describe our current geologic epoch: the age of humans.
Popularized in 2000 by Nobel laureate Paul Crutzen and Eugene Stoermer, this term has become widely used in the media and, informally, in research literature. While it is popular in discussions of climate change and the role humans have in exacerbating (or alleviating) its effects, scientists still don’t agree on whether the Anthropocene should exist as an official unit of geologic time. Geologic time is vast, the argument goes, and humans will likely be just a blip; we’re just egomaniacs. (An argument that I, a few years ago, agreed with.) By this reasoning, the changes we’re making, while extensive, have been too rapid to show up significantly in miles-thick piles of sediment. The odds that anyone would find, and accurately interpret, our signatures in the dust are slim to none.
But others disagree.
While it is not yet an official geologic epoch, progress towards that goal is underway. Groups of scientists and science communicators around the world have been lobbying for its existence for several years; in May of this year, the Anthropocene’s strongest proponents, the Anthropocene Working Group (AWG), voted to craft a formal proposal to add the epoch to the geologic timeline. Having a term specifically for this period of rapid, human-driven environmental change could help scientists convey the massive impact of this change (and the urgency with which it should be addressed) to a broad audience, and in today’s media and political climate, we need all the tools in our belt we can carry.
But even proponents of the Anthropocene don’t agree on when exactly it should start.
Despite our relatively recent arrival, humans as a species have not only littered the geologic record with evidence of our existence, but we’ve irreversibly changed the face of the planet and, potentially, the course of its natural cycles. The planet is warming at an accelerated rate due to anthropogenic injections of carbon dioxide, methane, and other greenhouse gases into the atmosphere; as a result, the ice caps are rapidly melting, extreme weather events are becoming more frequent and severe, and forest fires are more intense. Climate change is altering the habitable ranges of vast numbers of plant and animal species, the chemistry of the ocean is changing (becoming more acidic and less oxygenated), and entire biomes and ecosystems are under great stress.
Humans have also physically altered the Earth’s surface dramatically. Mines scar the surface (estimated at over 8 million acres disturbed in the United States alone), oil and natural gas pumps dot landscapes around the world, and landfills fill up, covered by a thin skin of soil, venting greenhouse gases into the atmosphere. Plastics permeate both terrestrial and aquatic ecosystems. Our infrastructure reaches essentially every corner of the globe, with urban megacenters to accommodate an ever-growing population.
But it’s not likely that our cities and highways will mark our presence in the rocks. Beyond what we can easily see, there are other tracers of human activity. A suite of isotopic records will reflect massive perturbations not only in the atmosphere (such as changing levels of carbon dioxide) but also in radioactivity (from atomic bombs and nuclear plants), in droughts, and in ocean dead zones. Extinction and ecological turnover rates have already sped up – 32 percent of sampled vertebrate species are decreasing in population, almost 30 percent of birds in North America have been lost since 1970, and up to a million species are at risk of extinction. In fact, these species losses and dangers to biodiversity have been so dramatic that they have been termed the “sixth mass extinction.” A recent study in Science maps just how these changes in biodiversity are happening. Plastics – a distinctly human signature – are being deposited everywhere, from equatorial oceans and polluted riverways to remote alpine meadows and Arctic ice.
Although some of these items may feel abstract, they are the sort of thing that geologists see in the rock record. Carbon, oxygen, and other isotopes have long been used to infer changes in climate and primary productivity. Everything we do is being written down in a book that will be legible for millennia, if not longer. We are telling our own story now, not by words, but by our actions. The geologic record will always be recording, whether or not we want it to.
Each of these potential records must be taken into account in the debate over the start of the Anthropocene. The “golden spike” — the marker of our entrance into this new geological age — has been proposed to be as recent as 1945, when nuclear testing began. Other proposed start dates include 1784, when the steam engine was refined, or 1800, around the beginning of the Industrial Revolution. Still others propose that the start date should be 5,000 to 8,000 years ago, when early agriculture began reshaping landscapes and changing carbon cycling – or even as far back as 10,000 years ago, when early plant and animal domestication occurred. While using the 1945 radiocarbon spike from nuclear testing is tempting because it would be relatively easy to locate in the rock record, is that really when human influence on Earth’s surface and processes began?
The transition from a nomadic, hunter-gatherer lifestyle to cultivating food set the stage for human civilization today. Crops and livestock were domesticated, allowing more food to be produced in less space, which increased the stress humans placed on their soil and local ecosystems. As this lifestyle spread, other changes to civilization occurred. Populations became more centralized, trade was established, languages – and seeds – shared. The world, slowly, began to be more connected. And as populations grew and land use increased, with more fields being plowed and more forests cut down, more mineral resources mined and more water diverted and polluted, so our signal grew.
Greenhouse gases and climate change are the focal point of many arguments around the Anthropocene; it’s why the refinement of the steam engine at the start of the Industrial Revolution was originally proposed as the starting point of the epoch. While the mid-1800s are when emissions began to grow rapidly, we were disturbing the carbon cycle long before that. Churning up and reusing soil for monoculture lowers its ability to store carbon, and deforesting large swaths of land does the same. And while ancient civilizations were likely no match for today’s emissions-belching industrial agricultural complex, they did their fair share of damage too. In 2018, for instance, LiDAR imagery revealed previously unknown extents of Mayan civilization that would have required clearing forests (likely through slash-and-burn) as well as changing topography, as we do for landscaping today. Although these were smaller disturbances, we’re still responsible, and they were still a stepping-stone to our practices today.
Today we live, for the most part, in stable continental interiors. Unlike the seafloor, which gets recycled every 180 million years or so, crustal rocks in the center of a continent can hang around for hundreds of millions, even billions, of years. In other words, New York City is not going to get subducted any time soon, and neither is Nebraska with its cornfields or Wyoming with its grazing cattle. So in places such as those, we can count on our activities being preserved at least in the short term. In geologic time, though, the chemical records of our existence are more likely to be found than physical remnants of human civilization. Carbon isotopes will tell tales of greenhouse gases and a warming world, ocean sediments will reflect water with less oxygen and higher acidity, and the paleontological record will show just how quickly biodiversity dropped after we came on the scene.
In this ongoing discussion, we have to ask ourselves: Why do we care whether or not there’s an Anthropocene? Why bother with this at all? Polar bears don’t care about what we call the age we’re living in now. They’d just like their ice floes to stop melting, thanks. But the term is a concise and convenient way to express how great an impact – and how harmful an impact – humans can have on the planet. If the point is, at least in part, to call attention to the responsibility we have as a species, to raise awareness of the consequences of our actions, we have to be comprehensive. We can’t ignore our origins. To make the term “Anthropocene” have the greatest impact, then, we should extend the age of humans back to when the lifestyle changes occurred that led to our civilization today – the dawn of agriculture and the irreversible changes it brought to our society, and our world.