There is a growing sense, among young people as well as older demographics, that humanity is in deep trouble, and that extracting ourselves from our current predicament is going to take a concerted effort unlike anything our species has attempted in recorded history — if it's possible at all. The IPCC recently issued a "code red for humanity" about climate change, as the UN Secretary-General António Guterres put it, adding that "the alarm bells are deafening." Last year was the second warmest on record, and of the warmest 21 years, 20 have occurred since 2000. Right now, there are 416.96 parts per million (ppm) of carbon dioxide in the atmosphere, an increase of about 100 ppm since just 1960. As a point of reference, our Homo ancestors lived and evolved for some 2.5 million years with ambient concentrations averaging about 250 ppm.
Advocates of the so-called "New Optimism," such as Steven Pinker, like to quote Bill Clinton's exhortation to "follow the trend lines, not the headlines." Right now, the headlines are overflowing with almost unbearably bad news, such as the bungled U.S. withdrawal from Afghanistan, which has created heartbreaking scenes of human beings desperately clinging to the outside of passenger boarding bridges, and terrifying videos of people falling from airplanes — including a 17-year-old soccer star named Zaki Anwari. Yet this is, as most historically literate people know, the most recent culmination of many decades of failed Western-imperialist interventions in the region, including our misguided involvement in the Soviet-Afghan war, during which the U.S. backed the anti-communist Mujahideen rebels, one of whom was a Salafi jihadist known as Osama bin Laden. In 1988, bin Laden founded al-Qaida (meaning "The Base"), an apocalyptic group that was of course responsible for the catastrophic terrorist attacks of 9/11, which provided the pretext for subsequent U.S. invasions of Afghanistan and, later, Iraq (which had nothing to do with 9/11, nor was it manufacturing weapons of mass destruction).
So, what do the trend lines show? The rapid takeover of Afghanistan by the Taliban is fueling worries that al-Qaida could make a comeback, which once again raises the prospect of terrorist attacks against the U.S. Although al-Qaida is weaker than before, it is worth recalling that the group's core membership in 2002 was only around 170 people. But this time around the inventory of political grievances that drive Islamist terrorism has grown, thanks to the U.S.-led pre-emptive invasion of Iraq, the indefinite detention of "detainees" at Guantánamo Bay, the use of torture as "enhanced interrogation" and so on.
One heard the slogan "never forget" from Americans ad nauseam after 9/11, but the cultural memory of peoples in the Middle East is far more robust than ours. Consider "The Management of Savagery," an influential jihadist manual published in 2004, which foregrounds a number of past foreign policy missteps by the West, including the Sykes-Picot Agreement that carved up the Middle East after the Ottoman Empire collapsed in the aftermath of World War I. As the now-deceased "caliph" of ISIS, Abu Bakr al-Baghdadi, declared in 2014, "this blessed advance will not stop until we hit the last nail in the coffin of the Sykes-Picot conspiracy." If events from the World War I era have prominently driven extremism over the years, imagine how long the atrocities committed by Western forces since 2001 will continue to motivate actions and recruitment in the future. To borrow an insight from Robert Pape at the University of Chicago, terrorism is a demand-driven rather than supply-limited phenomenon.
Add to this the fact that emerging technologies, most notably synthetic biology, cyber-technologies and artificial intelligence, will place unprecedented destructive power in the hands of non-state actors — e.g., terrorists — meaning that the next 9/11 could claim far more victims. In fact, ISIS — which grew out of al-Qaida in Iraq and espoused an even more violently apocalyptic ideology — explicitly fantasized about weaponizing the bubonic plague. As an ISIS member who was educated in physics and chemistry wrote in a document obtained by the U.S., "the advantages of biological weapons is the low cost and high rate of casualties."
The growing power and accessibility of so-called "dual-use" technologies (those that can be used to benefit humanity or inflict terrible harms) is one of the main reasons global catastrophic risk scholars believe the threat of civilizational collapse and even human extinction is higher today than ever before. As Lord Martin Rees famously speculated in his 2003 book "Our Final Hour," civilization has no better than a 50/50 chance of surviving this century. Three years earlier, the co-founder of Sun Microsystems, Bill Joy, compellingly argued in a much-discussed Wired article that, because of technology,
it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals.
The intersection of these historical and technological trend lines — of religio-political terrorism and the democratization of science and technology — does not bode well for the future. Add to this the fact that climate change may have played an integral role in the creation of ISIS (by causing record-breaking droughts in Syria that fueled the Syrian civil war, which spawned the organization), and one wonders whether the mayhem since late 2001 might be a mere preview of what's to come.
Meanwhile, in the U.S., the political backdrop of these developments is an unraveling of the fabric of democracy itself. A sizable portion of the Republican Party has backed the "Big Lie" that Donald Trump won the 2020 presidential election, which led to the murderous Jan. 6 insurrection at the U.S. Capitol in an attempt to prevent the peaceful transfer of power to Joe Biden. As Ray Roseberry, a Trump supporter who recently parked his truck next to the Library of Congress claiming to have a bomb, said during a Facebook livestream: "You have two options here, Joe [Biden]: you shoot me, [and] there's two and a half blocks going with me. You're talking about a revolution, the revolution is on, it's here, it's today."
Republicans are painfully aware that their chances of winning the presidency are slim — the Democrats, for example, have won the popular vote in seven of the last eight presidential elections — and consequently party leaders are taking flagrantly authoritarian steps to enable state legislatures to overturn the will of the people in 2024. Election law expert and University of California professor Richard Hasen recently described this as extremely alarming, stating on CNN's "New Day": "I never expected to say I'd be scared shitless on CNN, but that's how I feel."
What do these trend lines indicate? Internationally, the world has slid into a "democratic recession," with more than half of the countries in the Economist Intelligence Unit's 2018 Democracy Index having seen their scores decline. According to Hoover Institution political scientist Larry Diamond, who coined the term, the COVID-19 pandemic has further exacerbated the trend. Within the U.S., anti-democratic sentiment appears to be nearly unstoppable, as most of the Republican Party has hitched its political future to the former occupant of the Oval Office. Populism remains a defining feature of the political landscape, and the political right has wholeheartedly embraced what might best be described as a zeitgeist of radical anti-intellectualism. A case in point is the increasingly raucous outbursts, some caught on video, over basic public health measures like wearing a mask to prevent the spread of COVID-19 — a fire doused with fuel by right-wing pundits such as Tucker Carlson, Sean Hannity and Phil Valentine (the last of whom recently died of the disease) spouting conspiracy theories and anti-science rhetoric.
One also finds this zeitgeist behind the moral panic surrounding "woke" culture and, more specifically, critical race theory (CRT), which few people know the first thing about. The conservative commentator Mark Levin, for example, has repeatedly argued on camera and in his best-selling book "American Marxism" that this originated with the "Franklin School" (he means the Frankfurt School), based in Berlin (it was based in, well, Frankfurt). Does anyone care about such inaccuracies? No, because the political right has successfully established an epistemological regime, so to speak, in which experts, scientists, scholars and anyone else with genuine knowledge cannot be trusted. Our universities are infected by "postmodern neo-Marxism" (an oxymoron), Dr. Anthony Fauci is an "enemy to our nation," and members of the so-called "mainstream media" are, in Trump's words, "dopes," "scum" and "animals."
Reinforcing such dangerous delusions are social media platforms like Facebook and Twitter that enable "fake news" and "alternative facts" to propagate like the common cold, infecting the minds of informationally illiterate consumers — mostly older people sympathetic with right-wing views. As the computer scientist Jaron Lanier opines in "The Social Dilemma," social media has become one of the most pressing existential threats to democracy today. "If we go down the status quo for, let's say, another 20 years," he says, "we probably destroy our civilization through willful ignorance."
Unfortunately, this situation is likely to get worse in the coming years due to AI-enabled phenomena like "deepfakes." Imagine the difficulty of discerning truth from lies when a society, already in disarray from a broken educational system, is bombarded with high-resolution, hyper-realistic deepfake audio-visual recordings designed to sow further chaos, distrust and enmity between rival factions. Who will any of us trust when our senses themselves become unreliable sources of information? Already, Trump has convinced his followers not to trust their eyes and ears: "What you're seeing and what you're reading," he said in 2018, "is not what's happening." But what happens when this becomes reasonable advice for all of us?
I am reminded here of Carl Sagan's prescient warning during a 1996 interview about the importance of understanding the nature of science and our rapidly evolving technological milieu:
We've arranged a society based on science and technology, in which nobody understands anything about science and technology. And this combustible mixture of ignorance and power, sooner or later, is going to blow up in our faces.
This is precisely what's happening today. It is, I would argue, one of the most important trend lines to watch, as it binds together everything mentioned above: the future of terrorism (at home and abroad), the rise of anti-intellectualism, the decline of democracy, the rejection of public health advice, the spread of conspiracy theories, the proliferation of fake news and the denial of anthropogenic environmental crises that now threaten the very livability of Spaceship Earth. How can we possibly reverse these trends? How can we hope to navigate the obstacle course of unprecedented hazards before us while the vessel of civilization is heading in the wrong direction? What if we are clever enough as a species to make a sprawling planetary mess of things but not clever enough to tidy things up? That possibility is, in broad strokes, what some astrobiologists call the "Doomsday Hypothesis": the conjecture that advanced civilizations like ours self-destruct before colonizing space or becoming capable of interstellar communication.
Some have in fact argued, plausibly, that "social media is making us dumber." Worse, studies suggest that higher concentrations of carbon dioxide in the ambient air will literally impair cognitive functioning, meaning that, as Daniel Grossman writes in a Yale Climate Connections article, "the fuel we burn might not only warm the planet but could also make us a bit dumber." And this is on top of an estimated 41 million IQ points that "Americans have collectively forfeited … as a result of exposure to lead, mercury, and organophosphate pesticides," according to calculations from the Harvard neurologist David Bellinger. (I am highly skeptical of IQ as a meaningful measure of intelligence, but the point stands: Neurotoxins like lead cause permanent brain damage.)
All of this is to say that at precisely the moment when humanity must confront problems of unprecedented complexity — at precisely the moment when the stakes, our survival, have never been higher — we find ourselves less capable than ever. To paraphrase Christopher Williams in his book "Terminus Brain," the human brain is the only organ in our bodies that is actively trying to destroy itself. This is unsettling on its own terms, but the fact is that, as already alluded to, "we are at the most dangerous moment in the development of humanity," as the late Stephen Hawking wrote in 2016. According to a 2012 World Wildlife Fund (WWF) report, "humanity must now produce more food in the next four decades than we have in the last 8,000 years of agriculture combined," while already upwards of 811 million people are undernourished, resulting in "the largest humanitarian crisis since the creation of the UN," to quote UN humanitarian chief Stephen O'Brien. "We stand at a critical point in history."
Yet not only is the global population expected to grow to 11.2 billion by the century's end (as a point of reference, the population was about 2 billion in 1930), but climate change, soil degradation, ocean acidification, overfishing, habitat destruction, ecosystem fragmentation, pollution and other anthropogenic perturbations threaten to destroy large portions of the biosphere upon which we depend for our survival. Consider, for example, a 2006 study that estimated there would be no more wild-caught seafood by 2048, which could have devastating effects given that "fish now accounts for almost 17 percent of the global population's intake of protein." Meanwhile, a recent count from the marine biologist Robert Diaz and colleagues found more than 500 "dead zones" around the world, meaning hypoxic regions in which nearly all aquatic life is impossible.
Making matters worse, about 26 percent of the CO2 released into the atmosphere ends up being absorbed by the oceans, where it forms carbonic acid. Today, this process is occurring at an incredibly rapid rate — about four times as fast as the oceans acidified during the worst mass extinction event in life's 3.8-billion-year history, known as the "Great Dying" or End-Permian extinction, during which roughly 81 percent of marine species perished. The situation is so dire that the shells of "tiny marine snails that live along North America's western coast" are actually dissolving, resulting in "pitted textures" that give the shells a "cauliflower" or "sandpaper" appearance.
On land, the 2020 Living Planet Report, published by the WWF, finds that the global population of wild vertebrates has fallen by a mind-boggling 68 percent since 1970. Other studies have affirmed that, as a result of global biodiversity loss, we are entering the sixth major mass extinction in biological history, and that continued degradation risks initiating a catastrophic, irreversible collapse of the global ecosystem.
"Comparison of the present extent of planetary change," the authors of one such study write, "with that characterizing past global-scale state shifts, and the enormous global forcings we continue to exert, suggests that another global-scale state shift is highly plausible within decades to centuries, if it has not already been initiated." A more recent paper co-authored by some of the most prominent scientists in the world warns that "self-reinforcing feedbacks could push the Earth System toward a planetary threshold that, if crossed, could prevent stabilization of the climate at intermediate temperature rises and cause continued warming on a 'Hothouse Earth' pathway even as human emissions are reduced." A "Hothouse Earth" scenario would, they add, likely "be uncontrollable and dangerous to many … and it poses severe risks for health, economies, political stability (especially for the most climate vulnerable), and ultimately, the habitability of the planet for humans."
Averting the worst-case outcomes will require a concerted global effort, a degree of geopolitical coordination and adoption of science-based policies never before seen in human history. Yet, as mentioned, many of the most important trend lines are pointing in exactly the wrong direction. Many people on the political right, for example, are no less worried than Greta Thunberg and her activist peers about societal collapse, although the spotlight of their attention is focused entirely on the illusory threats of "postmodern neo-Marxism," "wokeness" and CRT rather than the genuine risks associated with climate change, biodiversity loss, emerging technologies, future infectious disease outbreaks and so on. So far, the response to the COVID-19 pandemic has not been encouraging: Even 1.5 years into this global disaster, "a whopping 29 percent of Republicans" refuse to get vaccinated. I have little doubt that even once the deleterious effects of climate change are ubiquitous in the U.S. — and we're quickly getting there — many will persist in denialism, as a strategy for alleviating their cognitive dissonance.
Alternatively, given the prevalence of evangelical dispensationalism in the U.S. — a view that, as the philosopher Jerry Wells observes, "inclines its adherents not only to despair of changing the world for good, but even to take a certain grim satisfaction in the face of wars and natural disasters" — ecological catastrophes could actually reinforce the expectation of an imminent apocalypse, thereby encouraging further inaction on curbing the environmental crisis. In other words, the environmental crisis could strengthen religious belief, which could in turn exacerbate the problem. Although Christianity is waning in the U.S., it is growing globally, with Pew projecting another 750 million Christians in total by 2050. (Meanwhile, the percentage of religiously "unaffiliated" people is projected to shrink from 16.4 to 13.2 percent over the same time period.)
So, the trend lines are, in the most crucial respects, no less dismal than the headlines, contra New Optimists like Pinker — whose scholarship on "existential threats" is, as I have shown, seriously flawed. There is every reason to be pessimistic, not for mindless ideological reasons but because the balance of evidence suggests that humanity has dug itself into a hole from which escape looks increasingly impossible. In 1961, at the height of the Cold War, slightly more than a year before the Cuban missile crisis — later described by Arthur Schlesinger as "the most dangerous moment in human history" — President John F. Kennedy gave an address before the UN General Assembly in which he declared that
today, every inhabitant of this planet must contemplate the day when this planet may no longer be habitable. Every man, woman and child lives under a nuclear sword of Damocles, hanging by the slenderest of threads, capable of being cut at any moment by accident or miscalculation or by madness. The weapons of war must be abolished before they abolish us.
We still live under this nuclear sword, which is one reason the iconic Doomsday Clock lingers ominously at a mere 100 seconds before midnight, or doom. But the nuclear sword has been joined by the swords of climate change, biodiversity loss and dangerous new technologies, while deepfakes, social media, environmentally mediated cognitive decline, anti-intellectualism, the democratic recession and foreign policy blunders threaten humanity with "death by a million cuts," as it were. I have no idea how this ends — perhaps the Doomsday Hypothesis really does explain the eerie silence of our galactic neighborhood — but now is the time to do everything we can, together and as individuals, to steer the ship of humanity in the direction of safety. Time is of the essence, and the stakes are no less than our survival as a species.