Little crashes lead to big crashes

By Andrew Leonard. Today's computer networks allow less and less "slack" for error. Yet we depend on them more and more to run our banks and airlines, our governments and wars. According to the author of "Trapped in the Net," we're asking for trouble.

Published September 21, 1997 7:00PM (EDT)

Gene Rochlin likes to tell a simple story. One day, around noon, he turned on his computer, checked his e-mail -- and was surprised, even shocked, to read an angry missive demanding to know why he hadn't already responded to a message he'd been sent at 9 that morning.

"I thought, wait a minute! What makes you think that I'm sitting here with my mail program going beep-beep-beep at me whenever a letter comes in?" says Rochlin. "I turn on my computer when I need to."

To anyone who has come to rely on e-mail as a way of life, Rochlin's story will ring a bell. But to Rochlin, a professor of energy and resources at the University of California at Berkeley, the incident is symptomatic of a deeper illness: the lack of "slack" in a world increasingly dominated by computer technology.

Computers were designed to extend human powers, argues Rochlin, but their widespread deployment is actually reducing human options and constraining human maneuverability -- not least when computers are linked together in powerful networks. In the military, in corporations, in aircraft cockpits and nuclear power plants, computers are forcing humans into dependency and powerlessness. And there's no easy way out. We are, proclaims Rochlin in the title of his most recent book, "Trapped in the Net."

Apostles of the digital revolution tend to dismiss any and all critics of the implications of computer technology as ignorant neo-Luddites attempting to roll back the inevitable tide of progress. But Rochlin, a past winner of Guggenheim and MacArthur grants and one of the world's premier experts on plutonium disposal policy, is no Luddite, neo or otherwise. He's spent a lifetime studying "large technical systems" -- huge organizations that incorporate complex technology. His own home computer is equipped with the latest versions of Netscape and Eudora, Microsoft Word and Shockwave. He doesn't even know how he would communicate with his colleagues and students without e-mail. But the more deeply he personally has come to depend on his computer, the more fearful he is -- and not simply because he's irritated with impatient correspondents.

"The faster you can communicate, the more rapidly you can get a response," says Rochlin, "but the more rapid the expectation for turnaround time becomes. So in the end, in some funny way, you haven't bought yourself more time but less time."

"The result," says Rochlin, "is technical systems in which the response of the system to anything you do that might be an error is so fast that you don't really have any time to think, you just go. The concern is that you can start a sequence of events in which you do something, then the machine does something, you do something, the machine does something and your cognitive abilities are lagging behind the reality of the interaction. But you can't stop and say no."

Rochlin calls this accelerating feedback process "slack consumption." And he concedes that in the domain of normal e-mail exchanges, it might not be such a big deal. A blown-off e-mail message rarely means the end of the world. But what happens when such computer automation and network interaction occur in a "safety critical" situation, like an air traffic control center during a thunderstorm or a battleship in a firefight? What happens when the computer and the network place demands upon us that we are constitutionally incapable of handling?
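The arithmetic of slack consumption is simple enough to caricature in a few lines of code. The sketch below is my own illustration, not anything from Rochlin's book, and its numbers are invented -- an assumed five seconds of human "think time," a machine that halves its response time each round. The point is only that once expectations track the technology, the time left for deliberation eventually goes negative.

```python
# A toy model (illustrative only) of Rochlin's "slack consumption":
# each speedup in machine response tightens the expected turnaround,
# until the loop runs faster than a person can deliberate.

HUMAN_THINK_TIME = 5.0  # seconds a person needs to reconsider -- an assumption

def simulate_slack(machine_response=60.0, speedup=0.5, rounds=6):
    """Each round the machine answers faster, and expectations follow suit."""
    for n in range(1, rounds + 1):
        machine_response *= speedup              # the technology gets quicker
        expected_turnaround = machine_response   # norms track the technology
        slack = expected_turnaround - HUMAN_THINK_TIME
        status = "time to think" if slack > 0 else "faster than thought"
        print(f"round {n}: turnaround {expected_turnaround:6.2f}s, "
              f"slack {slack:7.2f}s ({status})")

simulate_slack()
```

By the fourth round of this toy run, the expected turnaround has dropped below the time a person needs to reconsider -- the "you just go" regime Rochlin describes.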

That's Rochlin's concern -- and it is far from theoretical. In "Trapped in the Net: The Unanticipated Consequences of Computerization" (Princeton University Press), Rochlin details how financial markets, the air transport industry and -- most controversially -- the military are all on the verge of putting human beings in a position of subservience to the technology that is supposedly designed to liberate them. Even the network -- that supposed instrument of ultimate flexibility and resilience -- is, in Rochlin's view, helping to relegate humans to second-class status.

Additionally, in many cases, argues Rochlin, the ascendancy of the networked computer has resulted in the reassertion of central control over the individual.

"There was a time when everybody had their own computer," says Rochlin. "Everyone had their own software -- and there was a feeling that these individual workstations were adaptable. A lot of people were excited about that, and felt it was a liberating experience."

Then came the network. In the corporate world, managers suddenly had a means to watch over their workers -- to the point where keystroke speed can be monitored, live, and every e-mail intercepted. And no longer were users free to pick their favorite software: A multitude of operating systems and applications is less practical when everyone in a company is supposed to be linked together.

Rochlin concedes that the rise of the cross-platform Internet may be counteracting some of these trends -- for example, allowing some people to escape corporate clutches into new independent lives as freelance contractors. But in the businesses he focuses on, human lives are at stake. And there, he sees the drive for greater efficiency and faster responsiveness as the mainsprings of the trap that is snapping closed upon us.

Aircraft cockpit automation and air traffic control centers provide some of Rochlin's most compelling examples. According to Rochlin, air traffic control systems already operate at the limit of human comprehension. Controllers are barely able to maintain a "cognitive map" of what is happening in the airspace under their purview. If something forces them to lose hold of that map -- a phenomenon that Rochlin calls "losing the bubble" -- controllers can still hit the "pause button": They can slow things down, delaying takeoffs or ordering planes into holding patterns.

The air transport industry, however, wants to upgrade air traffic control technology to allow shorter distances between planes in the air -- thus increasing the traffic capacity of an airport. But then, says Rochlin, human beings will no longer be able to maintain any kind of cognitive map of the situation -- they will become completely reliant on their automated systems. The consequences, if something goes wrong, could be devastating, he argues. Manual controllers will not be able to handle levels of traffic made possible only through computer automation. Aircraft will collide.

Even if it's possible for controllers to take manual control in a crisis, Rochlin wonders if in the future they will have the experience to know what to do. Turning to the related arena of aircraft navigation, Rochlin notes that a whole generation of airline pilots has come to depend on automation. These pilots have never had the chance to develop the instincts necessary to keep flying when the computers go awry.

Rochlin maintains that our increasing reliance on computers and networks is making our infrastructure more, not less, vulnerable. As an example, he cites the Kobe earthquake of January 1995. Major manufacturers in Kobe, as in most Japanese industrial centers, relied on "just-in-time" production systems that depended on extensive telecommunication facilities and computerized inventories and ordering. Many of these companies went out of business forever after the earthquake, utterly unable to recover from a single devastating blow.

Perhaps most outrageously, Rochlin suggests that the U.S. armed forces may be headed for a similar meltdown. "Trapped in the Net" devotes four chapters to analyzing trends in military computerization, including a detailed look at the Gulf War.

The Gulf War is widely regarded as a public-relations triumph for the high-tech U.S. military. But Rochlin notes that the command-and-control coordination implemented in the Persian Gulf required a whopping majority of the high-tech electronics equipment and supply systems that the military had available for use in the entire world. In other words, if there had been another major threat to the United States at the same time, the military would have had a hard time stretching its resources to meet it.

Even more intriguing, modern electronics have created a military that demands and expects total coordination between all branches and a central command -- and from the central command all the way down to individual tank commanders and fighter pilots. The military now operates under the illusion that it can, and should, completely control every movement of its forces in real time.

"With the addition of the new electronic capabilities," writes Rochlin, "the entire, extended battle, from front to logistics, from close air support to strategic interdiction, could now be visualized and managed as a single, integrated whole ... But in striking parallel to the restraints placed on industrial and office workers by the demands and requirements of their automated and integrated equipment, battlefield commanders now have less discretion than ever at the critical level of operations."

Less discretion, argues Rochlin, implies less flexibility to respond to local circumstances -- and that could be a disaster in a war against an opponent that puts up more of a fight than the Iraqi military.

What can be done? Not a whole lot, says Rochlin. There are no 12-step groups to pull large technical systems out of computer-dependency relationships. But where there is still a chance, says Rochlin, designers must accept the imperative to be, well, a little bit messy.

"We should be sloppier," says Rochlin. "Sloppy is a condition of life ... Don't worry if the system is not efficient, because at some point in your life you will need sloppiness and inefficiency. Designers should put a buffer into the system, something that cushions you so you have a chance to think. "

If we don't add buffers, or permit slack, we're in trouble. As Rochlin concludes in "Trapped in the Net":

"They [computers] require constant, intelligent, and informed monitoring. Over time, they will be increasingly out of sight, but they must never be out of mind. Otherwise it is we, and not the computers, who will become invisible idiots."


By Andrew Leonard

Andrew Leonard is a staff writer at Salon. On Twitter, @koxinga21.
