"Armageddon Science": Our coming apocalypse, explained

Nuclear war, bioterrorism, nanorobots: What poses the greatest threat to our planet? An expert explains the facts

Published November 8, 2010 8:55PM (EST)

Human in gas mask outdoors and industrial factory on a background (Arman Zhenikeyev)

We are fascinated by the end of the world. Every summer, Hollywood rolls out a slate of foreboding disaster films, like "Armageddon," "The Terminator" or "The Day the Earth Stood Still," that capitalize on our desire to see our planet get obliterated. This coming weekend, moviegoers can watch aliens trample Los Angeles in "Skyline." Books like Cormac McCarthy's "The Road," with its dystopian vision of a plantless future, become cultural phenomena by feeding our anxieties about mortality and our own global expiration date. But how grounded are these fears in reality?

In "Armageddon Science: The Science of Mass Destruction," English science writer Brian Clegg, who has an advanced degree in physics from Cambridge University, considers the threats, both real and theoretical, that science and technology pose to the world. He searches beyond the obvious examples of nuclear warfare and global warming and introduces such strange concepts as antimatter bombs, nanorobots and cyberterrorist war. Despite the unnerving title and the alarming cover art of a post-apocalyptic city, Clegg's book presents a sober and rational analysis of the threats -- or lack thereof -- that we face. He dismisses several Armageddon scenarios, such as dark matter explosions or world-dominating robots, as unlikely. But he's less cavalier about other doomsday possibilities.

To find out if we should start building shuttles to Mars, Salon spoke to Clegg by phone from the U.K.

Let's cut to the chase. What product of science and technology is the most likely to wipe out the human race?

As long as we still have nuclear arsenals, in the end I have to go down that line. The other biggest danger to humanity is natural disasters, though they're not the main theme of my book. People talk about how humans will destroy the planet, but there's actually nothing we can do to destroy the planet. We can destroy ourselves, and the planet can destroy us.

Deep underground in Switzerland, an international group of scientists built what you call "the biggest machine ever envisaged by human beings," the Large Hadron Collider (LHC), which is a high-tech particle accelerator. There have been concerns it could lead to destruction. Is the machine dangerous?

The real worries are that the machine will produce tiny black holes or these [hypothetical particles] called strangelets. Neither of these possibilities should cause much worry. There's a popular image of a black hole that eats everything around it, but in reality the holes that could be produced by the LHC would disappear before you could even see them.

You refer to the practice of experimental particle physics as "childish." What do you mean by that?

It's more "childlike" than "childish." What we're doing with particle physics is the scientific equivalent of hitting something with a large hammer to see what happens. It's the only way to find out what's going on. In some ways, this is better than observation-based science. In a field like psychology, you can't experiment on human beings to look at everything that's going on, whereas in physics you can. Experimental physics also has this advantage over cosmology. You can't experiment with the galaxy and the stars; you can only observe.

What do you think of Dan Brown's ill-informed account of antimatter technology that can be used to blow up the Vatican in "Angels and Demons"?

I have no problem with Dan Brown getting the science wrong, but he structured the book in a way that presented this stuff as factual, which is pushing it a little bit.

You discuss the fact that scientists are more likely to be on the autism spectrum. Do you think this hinders their ability to contemplate how their findings might affect people?

I don't think that's an issue. People on the autism spectrum have difficulty with aspects of social relations and dealing with people, but this doesn't mean they don't understand the dangers of science and technology, and it doesn't mean they aren't concerned about other people.

To some extent, I think this explains the stereotypical view of the scientist as someone who is cold and impersonal. It's definitely not true that every scientist is on the spectrum -- there's just a higher prevalence of autistic disorders than in the general population. This might also partially explain why men are still so much more likely to be scientists than women, since men are more likely to be on the autism spectrum.

Bioterrorism versus cyberterrorism: Which is scarier?

Cyberterrorism. In the U.K., the government just completed a new assessment of the requirements of the military, and cyberterrorism was one of the top three threats. Cyberterrorism is relatively easy to accomplish, and relatively easy to do remotely, in terms of hacking into the Internet.

Bioterrorism is much easier than a terrorist nuclear attack, but it's not that easy. For the terrorists, the difficulty is in making a large impact. The anthrax attack after 9/11 had an effect, but it killed a very small number of people. It's actually quite difficult to deliver biological weapons that will infect people properly.

How concerned should we be about the dangers of nuclear terrorism?

I don't think it's the most dangerous possibility in the sense of all-out mass destruction. A nuclear terrorist event would be small compared to what a superpower can do, but I think it has a larger probability of actually happening. A dirty bomb, which is a regular bomb that spreads radioactive material, is the most likely scenario. An actual nuclear bomb would be very difficult for a terrorist organization to construct. Their best bet would be getting one from a nuclear nation.

The U.S. government convened the Committee on Medical Preparedness for a Terrorist Nuclear Event in July 2009. It almost seems like a throwback to the Cold War. What was it?

It's a committee put together by the Institute of Medicine at the government's request. The purpose is to increase awareness of the possibility of some form of terrorist nuclear event. The approach reminds me of the publications that were produced as information leaflets for the general public during the Cold War. It all seemed trivial back then -- stuff like "stay away from the windows, put your back against the walls." The Committee on Medical Preparedness suggests stuff like getting into a basement or the central core of a building, because a lot of radiation is actually stopped by simple things like bricks.

During the Manhattan Project, physicist Niels Bohr argued that all nations should share their knowledge of nuclear technology. After the bombs were dropped in Hiroshima and Nagasaki, a proposed U.N. committee would share and guard nuclear technology. This never happened. Is international oversight of this kind still possible?

What Bohr had in mind wasn't that every nation should have a nuclear bomb, but that everyone should have the knowledge of what's involved in making one. This form of deterrence might have prevented nations from ever building the bomb, because each nation would know that if they started to build, every other nation might build as well. It's a different approach to deterrence. Unfortunately, nations did build bombs, so I'm not sure how realistic such a scenario is now, but it would have been nice.

The United States has the largest nuclear arsenal. It also has a national memory that doesn't include mass death and genocide: The majority of victims from the World Wars, for example, were not American. Do you think this makes Americans psychologically naive to the true costs of warfare?

If you look at the American reaction to 9/11, which in a sense was one of the very few attacks on the United States from the outside, the response was huge. This could have been influenced by the lack of experience of massive death tolls from outside attack. Back in the early days of nuclear weapons, people in the United States military were seriously suggesting an all-out nuclear attack on Russia. That certainly demonstrates the kind of naiveté you're mentioning. The Cuban missile crisis might have been a coming-of-age for the United States. There was a realization that America is not invulnerable, and this brought the government past their naiveté.

You devote a good deal of the book to global warming. Higher temperatures and rising sea levels cause natural disasters and drought, but why is it so difficult to pinpoint a timeframe for these events?

Arguably, these events are happening already. One of the problems with the public's understanding of climate change is that the numbers coming from the scientists represent worldwide averages. In a particular place, you might experience an extreme. Scientists are wary of saying, for example, that Hurricane Katrina was caused by climate change, but there is a reasonable feeling that problems during the past 10 years have actually been caused by global warming. As for scientific predictions for the near future regarding increases in average temperatures, rising sea levels and the melting of the ice -- so far the results have actually proven to be worse and more extreme than the scientists predicted.

What do we do about it?

Large-scale solutions include ideas like seeding the sea with iron -- which encourages algae growth, which then eats up carbon -- or putting up big sunshades in space. There's no doubt that these solutions could have an impact, but introducing such a large change into earth's extremely complicated system has risks. We need to really understand these ideas before we use them. Reducing our carbon emissions doesn't have a risk, so this seems safer to me.

Thirty-five thousand Europeans died in the summer heat wave of 2003, particularly in cities. Why is heat more dangerous in urban areas?

The problem is what's called the urban heat island effect. A city stores up heat during the day, and then gives that heat out at night, so individuals in cities are exposed to constant heat. In 1995, hundreds of people died in Chicago. Another problem is that heat rises, and many people live in tall buildings in cities. We have to start learning lessons from countries that experience this kind of heat on a regular basis. In some areas of India, people go up to the hills during the summer. Perhaps we will get to the point where cities are evacuated during the summer.

What are nanobots? How do they relate to your claim that "perhaps the most subtle peril the human race faces is that we could cease to be human beings at all"?

A nanobot is an incredibly small robot. On the medical side, you could theoretically use these things to attack viruses and bacteria, to repair problems with the heart, or to replace organs. Some think this is the way humanity will go: Nanobots will be part of our bodies, and they can keep us alive forever. The "Star Trek" idea is that a nanotechnology device will produce whatever food you want immediately. Nanotechnology is built up atom by atom, so we could program nanobots to build things in this way.

The downside is what happens if these things go wrong. There are two scenarios. One is that since nanobots need energy, there's no reason why they wouldn't eat us, especially if we're injecting them into the body. The other possibility is the reverse of the nanobot-as-assembler. Nanobots could take things apart atom by atom. But I think the whole thing is a very long-term concern. Just having a robot that works is difficult enough.

You discuss famed theorist Ray Kurzweil and his idea of the Singularity -- a point at which technology becomes superintelligent, which he believes will occur in 2040. You conclude that "Those who predict the Singularity have a false picture of the nature of humanity." Why do you think the idea of the Singularity is so popular?

The Singularity came from the ideas of science-fiction writer Vernor Vinge. It's like the Borg in "Star Trek": something that is a combination of living creature and machine. Followers of the Singularity believe the machine will become more and more intelligent, and that this new creature will be the next step. Homo sapiens will either be subjugated or wiped out.

Frankly, I think the time scales are unrealistic. There is no doubt that technology is going to be used to enhance human beings. We've been doing this for a long time. You could say that the dog enhanced human ability; we created dogs from the wolf and used them for our benefit. In the modern sense, we use technology inside our bodies. But technology always jumps in a different direction than you might expect. The people who talk about the Singularity look at computer technology and medical technology that puts stuff in our bodies and assume the technology will continue to progress in the exact same direction. I think we're much more likely to end up with technology that interacts with us rather than becoming a part of us.

The Singularity is a rather crude idea, and the 2040 prediction is ridiculous, especially if you look at how little cybertechnology has actually developed. Ray Kurzweil has this idea of wanting to live forever, and hopes to keep himself going until the technology develops, so the Singularity is a kind of wish fulfillment. Most of the scientific community, frankly, views him as an eccentric. He gives himself blood transfusions and eats vast numbers of supplements every day, and these are things that actually have very little scientific basis for prolonging life. The Singularity is popular because everyone wants to live forever. It's a very appealing idea.

What should we do about Armageddon scenarios, other than just be freaked out?

We should encourage people to get a better education in science. In the end, decisions that relate to scientific discoveries are in the hands of politicians, and a lot of decisions are made with very little scientific knowledge. If everyone had a better understanding of the possible dangers in the application of science, we would realize, "OK, we can influence the politicians, so we ought to understand what's going on here at a more fundamental level."

I try to end on a cautiously optimistic note. We have gained so much from science. Like anything else, it can be misused. And we still face major risks from natural disasters. But we do now have the capacity for mass destruction, and we have to be more careful, more thoughtful and hopefully more knowledgeable about how we deal with science and technology.

By Katherine Don

Katherine Don is a freelance writer in New York.
