America's endless apocalypse

Over the last decade, we've become obsessed with the end of the world -- and it's hurting us all

Published February 26, 2012 7:00PM (EST)

(file404 via Shutterstock)

This article is an adapted excerpt from the upcoming book "The Last Myth," available in March from Prometheus Books.

Few of us can clearly make out the form of history until it is well behind us, which may explain why we still have not settled on a common name for the decade now receding from view. How others will look back on this time is beyond our knowing or influence, of course, but future historians would do well to ascribe to our time a name that encapsulates not just the events of the past decade but the way in which we as Americans have come to view the world and our place within it. Such a name might be the Apocalyptic Decade or, perhaps, the Apocalyptic Era — for it is not over yet.

It was during the last decade, after all, that the belief in the end of the world leapt from the cultish into the mainstream of American society. Ours is an era bookended by the widespread belief in the impending collapse of society: at one end we had Y2K, the largest and most expensive mass preparation for a secular apocalypse in the history of the world; at the other end we have the growing expectation and belief that December 21, 2012 — the supposed end date of the ancient Mayan Long Count calendar — will herald either a radically transformative or utterly cataclysmic global event. Between these two bookends are pages upon pages of apocalyptic anxiety, a decade-plus-long collection that tells the tale of an America that has grown very afraid of the future.

American history is filled with extended periods when the nation looked at the challenges it faced not with despair but with optimism; times when we felt with certainty that the country was not only on the right track but was traveling with great speed toward what would surely be a glorious destination. Such periods are not relegated solely to the distant past. Indeed, though we often forget it, we experienced an extended period of such optimism a little more than twenty years ago, when the Cold War came to an end.

The sense of historical emancipation felt during the early 1990s is best exemplified by American philosopher and political economist Francis Fukuyama’s influential essay “The End of History?” Fukuyama borrowed his title from Karl Marx (who had, in turn, borrowed it from Hegel) to argue that the end of the Cold War signaled the triumphant end to human progress. “What we may be witnessing,” Fukuyama wrote of the current events swirling around him, as the Berlin Wall fell and Eastern Europe broke free of Soviet influence, “is not just the end of the Cold War or the passing of a particular period of post-war history, but the end of history.” Like Hegel and Marx before him, Fukuyama viewed history as a struggle between competing ideologies—in Fukuyama’s case, between Western liberal democracy and the authoritarianism of fascism and communism. The collapse of the Soviet Union, in Fukuyama’s mind, marked “the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.” With the fall of the Berlin Wall, the long struggle of history had finally come to an end: democracy and capitalism, empowered by rapid advances in technology, had won the global battle for the hearts and minds (not to mention the pocketbooks) of humanity.

It’s important to remember the earnestness of the delusion that we had escaped history—a delusion that spread from Washington to the NASDAQ to the Top 40 charts—for it represents the starting point in our bipolar shift in consciousness toward apocalyptic despair. Y2K was the first crack in that delusion, and almost immediately we began to misread its lessons. For some, Y2K came to be seen as little more than the latest example in a long series of failed apocalyptic prophecies, indistinguishable from the predictions of Nostradamus or the end-of-the-world expectations of numerous cults and sects throughout the ages. And for many others, the whole thing came to be seen as a hoax — a con game conceived by greedy computer geeks eager to cash in on a nation’s technological illiteracy.

Rather than a crisis that failed to materialize or a crisis that never existed, however, Y2K was in fact a crisis successfully averted. Of course, the media hype of what might have happened was overblown — there was little chance of your toaster going crazy, after all. Yet most IT experts today view the government and corporate investment in repairing the Y2K problem as both necessary and beneficial. Thus, rather than a failed prophecy, a more accurate comparison for Y2K is the global response, beginning with the 1987 Montreal Protocol, to the discovery of the hole in the ozone layer above Antarctica. In both cases, the world came together and made the investments and changes necessary to avert disaster. Yet for many Americans, our success in averting disaster has become evidence that the disaster was overhyped or nonexistent from the start.

To dismiss Y2K as hype, hoax, or failed prophecy is to draw the wrong lessons from the seminal moment of the Apocalyptic Decade — a moment that has a great deal to tell us about the era we are now living in. Rather than an atypical moment in our recent history — an uncharacteristic indiscretion on the part of an otherwise rational nation — Y2K was an archetypical one, setting the template for how we would perceive and react to the gathering crises of the twenty-first century.

Our reaction to Y2K demonstrated the profound ability of apocalyptic thinking to adapt to the challenges of the new millennium. Confined to the religious arena for nearly two thousand years, the apocalypse had (understandably) loomed large in the secular mind with the advent of the nuclear bomb and rising Cold War fears of atomic annihilation. If the end of the Cold War had convinced some that history had come to a peaceable end and that apocalyptic anticipation would recede into the various evangelical denominations from which it had sprung, however, Y2K proved otherwise. The optimism exemplified by Fukuyama at the beginning of the 1990s had been quickly replaced by a deep distrust of the stability of the new Golden Age. It was one thing to intertwine apocalyptic fear and rhetoric with the fear of nuclear holocaust; it was another thing altogether to witness the apocalyptic archetype insert itself into concerns over a few lines of programming in our computers. If the apocalypse could find its way into binary code, then it could — and would — find its way into our interpretation of almost any challenge before us. The apocalypse, as Y2K demonstrated, had become deeply rooted in the secular American mind.

For some, the lack of drama or disaster that accompanied Y2K justified placing most discussions of Armageddon on the yonder side of the grassy knoll, in tinfoil-hat territory. This line of thinking has proven disastrous to efforts to address numerous pressing issues — global warming being chief among them. Yet for many others, Y2K turned obsessing about the apocalypse into a national pastime; by 2001, the expectation that a major event could lead to the rapid unraveling of modern society had moved firmly from the realm of conspiracy theory into the suburban American living room, where it has stayed ever since.

Fewer than twenty-one months after Y2K, the language of apocalypse would be deployed by President Bush to provide the context and explanation for the attacks of 9/11, pushing the apocalyptic even further into the center of public debate—and sending the Rapture Index (an online tally of end-times indicators maintained by the evangelical website Rapture Ready) soaring.

Perhaps fittingly, President Bush first began to use this new mythic language in his remarks at the National Cathedral in Washington, DC, during the National Day of Prayer and Remembrance on September 14. No longer was it sufficient to merely hunt down and punish those responsible for the attacks, the president declared, standing before the altar: “Our responsibility to history is ... clear: to answer these attacks and rid the world of evil.” A tall order, to be sure, especially for a nation grown somnolent on the luxuries of consumerism and the distractions of the Internet. But there we were: after one hundred thousand years of human history, it fell upon us (what poor luck!) to destroy the very duality that has defined the human condition for all of time. Fukuyama needn’t fret about wasting his life away on video games and martinis at the end of history anymore; instead, the end of history had brought the final grand struggle that had been anticipated by Judeo-Christian thinkers for more than two thousand years.

Bush continued to push the simple and powerful metaphor of an apocalyptic showdown ever further to explain the public outbreak of hostilities between the United States and radical Islamists. Regardless of whether one supported or opposed the president’s response to 9/11, it’s important to note that his framing intrinsically relied on the shared Christian cultural understanding of the apocalypse. Imagine if President Bush had delivered the same speech he gave at the National Cathedral to a nation that regards the struggle between good and evil as a constant and even desirable characteristic of the world. Without the deep cultural archetype of the apocalypse, and its promise of a final showdown in which good triumphs over evil, the president’s framing of the war on terror would, quite literally, have made no sense.

Thus, interest in the Rapture Index became a yardstick for measuring apocalypticism itself. Time magazine featured the Rapture Index prominently in its cover story on the end times in June of 2002; its popularity was duly noted in the New York Times, the Washington Post, the Wall Street Journal, and in numerous magazine articles and television reports. By 2006, according to the Los Angeles Times, “40% of Americans believe[d] that a sequence of events presaging the end times [was] already under way.” Increasingly, Americans turned to the apocalypse as a means of understanding current events—a prism that the media reflected back on the populace. By the summer of 2006, when war broke out between Israel and Hezbollah guerrillas in Lebanon, the media’s obsession with the end times had morphed from mere (and justifiable) reporting on what fundamentalist Christians believed into a central and recurrent lens through which the media itself interpreted the conflict. In a column for USA Today, Gannett News writer Chuck Raasch declared: “You can’t look at the headlines these days . . . and not conjure up apocalyptic visions.”

In the spring of 2002, Americans watched reports of the rapid collapse of the Larsen B Ice Shelf in Antarctica. In just thirty-one days, a section of the shelf as large as Rhode Island collapsed into the sea. Barely eighteen months later, reports arrived from the other end of the globe that the Ward Hunt Ice Shelf on the northern coast of Canada, the largest ice shelf in the Arctic, had literally split in two for the first time in at least three thousand years. Each successive spring and summer during the Apocalyptic Decade brought more urgent warnings and vivid evidence of climate change from the extreme north and south — and from points between. Thirty-five thousand Europeans died from a massive heat wave that struck the continent in the summer of 2003; the highest temperatures on record hit dozens of American cities in the summer of 2004, while wildfires raged throughout the Southwest; 2005 was punctuated by the “super-hurricane” season, with Katrina, Rita, and Wilma wreaking unprecedented devastation on the Gulf Coast.

In the days following Katrina’s landfall and the breaching of the levees separating New Orleans from Lake Pontchartrain, Americans watched a city descend into chaos. The “thin veneer of civilization” (to use, as many did at the time, Edgar Rice Burroughs’s phrase) disintegrated before our very eyes. Katrina evoked a sense of helplessness in the face of forces beyond our control. We saw public officials weeping on television and threatening to physically attack the president for his inaction, news anchors and reporters visibly disgusted and traumatized on live TV, and an endless stream of images of bodies strewn throughout the flooded city. Many compared the collective experience that we shared through television and the Internet to 9/11, but as traumatized as we were in the days following 9/11, there was a sense that the American government was on top of things, that all available resources were being marshaled to come to the aid of the survivors. In the weeks following 9/11, Americans’ faith in government (and the president) soared. With Katrina, both plummeted.

From abroad as well, the images of an unstable world came beaming home: commuter trains blown asunder in Madrid, a double-decker bus peeled open like a sardine can by bombs in London, beheadings and chaos in Iraq, and devastating floods and crippling droughts in Australia and around the world. Solace was rarely found in news closer to home, where the images of orange-jumpsuited prisoners kneeling in the strange sunlit purgatory at Guantanamo Bay seemed emblematic of the trapped helplessness of current events. We watched as the price of energy soared, with gasoline tripling in price from the beginning of the decade, and oil surging to a record high of $147 a barrel in July of 2008. Just months later, Americans awoke to the news that the foundation of our financial system was built on the shifting sands of overleveraged and insolvent banks. Our political system seemed broken, no matter who was elected, and the entrenched partisanship of the decade turned increasing numbers of Americans away from politics. Hope evaporated, like any other campaign promise; the American Dream had turned into a mirage.

Recently, a friend of ours who is in his sixties, observing the great upheaval of the last few years, said to us: “My only regret is that I won’t be around thirty years from now to see how it all turns out.” This is an understandable desire, felt by many of us when we contemplate the outcome of history as the end of our lives approaches. Yet by taking a step back, we recognize the hopelessness of that desire. History ebbs and flows on a scale that dwarfs us. Resolution and permanence are a trick of perception, as when, at the turn of the tide, it seems as though the sea has stopped moving. Such moments can trick us into believing that the world is in stasis, or that everything depends on our actions on the beach. Yet the sea never stops moving. The rhetoric of the apocalypse gets it backward: this is not the most important time to be alive—being alive is the most important time. The world before us will still be marked by laughter and love and art and joy; a life is no less valuable or beloved if one lives in an age of decline, when the tides are running out, than in an age of progress.

When we free ourselves from the hypnotic spell of apocalypse, when we let go of our desire to see how things will turn out, we are free to answer a more important question. Not: are my beliefs correct? But: how do I live in accord with my values right now? Our insistence that a new world is coming later is a delusion; it is already here. We have met many who say that they will go start an organic farm when things come undone. We have met others who are already farming and say that they are doing it to prepare for the Great Unraveling. Why not choose to farm, as one example, because you value independence, self-sufficiency, and the environment and want to live in accordance with your values, rather than framing your life through the prism of the apocalypse, hoping to be proven right and others proven wrong? The answer as to how to live out our values is different for each of us — it may be about traveling the world as much as manning the ramparts. But the right public policy prescriptions and personal decisions will come only when we abandon our expectations that some future cataclysmic moment will eventually prove us right.

From "The Last Myth: What the Rise of Apocalyptic Thinking Tells Us About America" (Prometheus Books, 2012). Reprinted by permission of the publisher.


By Matthew Barrett Gross

Matthew Barrett Gross is the editor of the Glen Canyon Reader and a media strategist who worked on Howard Dean's groundbreaking 2004 presidential campaign and Jon Tester's successful campaign for the U.S. Senate in Montana.


By Mel Gilles

Mel Gilles is a writer and a former advocate for victims of domestic abuse. Her essay, "The Politics of Victimization," went viral in 2004, reaching more than 2 million readers.


