On April 17, 1961, more than a thousand Cuban-exile paramilitary fighters landed at the Bay of Pigs, on the southern coast of Cuba, in an audacious effort to convince the local population to join a march toward Havana and overthrow Fidel Castro. Instead of finding pliant locals, though, the exiles, who'd been trained and armed by the American Central Intelligence Agency, were quickly and ignominiously defeated by Castro's forces.
CIA investigators would later determine that the agency's invasion plans had been seriously misguided -- the CIA had underestimated Castro's popularity and the strength of his military, it had failed to lay down any real plans for the post-invasion period, and it had mistakenly assumed that the U.S. could plausibly deny any involvement in the mission, even though the press had been reporting on the exiles' training for months prior to the invasion.
Why did the Kennedy administration launch this flawed mission? Because it "planned and carried out its strategy without ever really talking to anyone who was skeptical of the prospects of success," writes James Surowiecki in "The Wisdom of Crowds," his intriguing new book that examines the ways in which groups of people can become either smarter or dumber than their brightest members. Surowiecki, who writes the financial column for the New Yorker, holds up Kennedy's foreign policy team as an example of one of the perils of group deliberation. Specifically, the Kennedy White House was the victim of what psychologist Irving Janis, who studied the decision-making process that led to the Bay of Pigs invasion, called "groupthink" -- a phenomenon that causes small, homogeneous groups to unconsciously squelch dissent and ignore positions that run contrary to the group's point of view, leading to often bizarre decisions.
Groupthink seems to run rampant in the halls of government these days, and Surowiecki does a service to the nation in untangling the causes of this menace. But he points out that groups can be felled by much more than groupthink. In Surowiecki's view, people form precarious collections; if we're not organized just right, our groups can make terrible decisions. The intelligence of a team, or of a much larger crowd, can be diminished if its members are not intellectually diverse, if its members do not make decisions independently of each other, and if the team is either too centralized or too decentralized. The network of open-source programmers who maintain Linux is an example of a group that gets things right; the team of NASA engineers that managed the space shuttle Columbia's last mission is one that got things disastrously wrong.
Surowiecki's is a big-idea book, reminiscent of his New Yorker colleague Malcolm Gladwell's "The Tipping Point." Although he spends much of the book pointing out all that can go wrong in a group or a crowd, Surowiecki's big idea is that a well-designed group can be exceptionally intelligent, smarter even than its brightest member. Ask a crowd to guess the number of jelly beans in a jar and the average of their guesses will be almost exactly right. In the same way, people organized in markets, in betting pools, in corporations or even in governments can sometimes magically arrive at the correct solution to a tough problem.
In book form, Surowiecki's writing sometimes meanders, and it lacks much of the elegance, and some of the delight, of his short pieces for the New Yorker. He also has a tendency to glut his arguments with apparently tangential examples, some of which raise more questions than they answer. Still, Surowiecki puts forward a compelling idea. In our world, we tend to think of intelligence as the province of individual "experts," but Surowiecki convincingly argues that under the right circumstances, groups might be far better equipped to manage human affairs. In other words, this book should send corporate CEOs -- the highest-paid experts in the land -- running scared.
Surowiecki's faith in the wisdom of crowds rests on what he calls "a mathematical truism." If you ask a large group of diverse, independent people to predict or estimate something, when you average out their answers "the errors each of them makes ... will cancel themselves out," he writes. "Each person's guess, you might say, has two components: information and error. Subtract the error and you're left with the information."
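The truism is easy to check with a toy simulation (the jar size, crowd size and error spread below are invented for illustration, not taken from the book): give a simulated crowd the true jelly-bean count plus independent random error, and the average lands far closer to the truth than a typical individual guess does.

```python
import random

random.seed(42)

TRUE_COUNT = 850      # hypothetical number of jelly beans in the jar
N_GUESSERS = 1000

# Each guess = information (the true count) + an independent error.
# Because the errors are symmetric and independent, they tend to
# cancel out when the guesses are averaged.
guesses = [TRUE_COUNT + random.gauss(0, 200) for _ in range(N_GUESSERS)]

crowd_average = sum(guesses) / len(guesses)
crowd_error = abs(crowd_average - TRUE_COUNT)
mean_individual_error = sum(abs(g - TRUE_COUNT) for g in guesses) / len(guesses)

print(f"crowd average error:   {crowd_error:.1f}")
print(f"mean individual error: {mean_individual_error:.1f}")
```

The crowd's error comes out a small fraction of the average individual's -- the "subtract the error" arithmetic Surowiecki describes, in miniature.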
This simple fact can lead to remarkable results. Surowiecki points out, for instance, that within minutes of the explosion of the space shuttle Challenger, on Jan. 28, 1986, the stock market collectively determined which company to blame for the accident -- and it got the right answer. While share prices of the four major contractors who'd worked on the shuttle declined after the accident, the stock of Morton Thiokol, the contractor that built the Challenger's solid-fuel booster rockets, slipped the most. By the end of the trading day, the value of its shares had fallen by almost 12 percent, while the other three contractors had dropped by around 3 percent.
The shares of Morton Thiokol did not fall so precipitously because it had been publicly blamed for the accident, or because insiders at the firm were dumping stock; as Surowiecki notes, on the day of the explosion, there were no public comments indicating that Thiokol might bear some responsibility for what had happened, and even rumors concerning possible causes of the disaster did not implicate the company. Still, for some reason, the market chose Thiokol. And six months later, the commission investigating the disaster also blamed Thiokol. It was faulty O-ring seals on Thiokol's booster rockets, the commission determined, that led to the Challenger's demise.
How did the market correctly pick which firm to blame? Surowiecki concedes that it's possible that someone in the market had inside knowledge of the defective O-rings. "But even if no one did, it's plausible that once you aggregated all the bits of information about the explosion that all the traders in the market had in their heads that day, it added up to something close to the truth," he writes.
Surowiecki is (understandably) captivated by this power of the market to aggregate different people's ideas into a single "truth." In business, as in much of the rest of life, predicting the future would be extremely valuable. Markets, Surowiecki says, can sometimes do an astonishingly good job of predicting which one of a number of different outcomes might come to pass. For some reason, though, many people, even businesspeople, shun markets; when we want to know what will happen in the future, we almost instinctively seek out experts rather than the crowd.
While corporations rely on the stock market to raise capital, Surowiecki points out that what's done with that capital in most American businesses is often decided not by a market but instead by the whim of one "expert" -- the CEO. And in the 1990s, these men were anointed as the superheroes of American business, pulling in obscenely large salaries and becoming cultural celebrities as a result of their supposed business prowess.
The problem was not just the hype, Surowiecki writes. "The problem was that people actually believed the hype, taking it for granted that putting the right individual at the top was the key to corporate success." But this idea is deeply misguided. There is precious little evidence to indicate "that single individuals can consistently make superior forecasts or strategic decisions in the face of genuine uncertainty," Surowiecki writes. Among academics, there is a debate over whether CEOs make any dent in corporate performance at all, but even those who say that CEOs matter point out that their influence can be either positive or negative. For instance, two-thirds of all corporate mergers -- almost always the brainchild of a lone CEO -- fail. "This suggests that, at the very least, CEOs are not in general extraordinary decision makers," Surowiecki writes.
Instead of having a CEO choose the company's strategy, why not have employees participate in an internal market to set the course for a firm? Companies have been extremely reluctant to do this, Surowiecki says, but a few pilot projects in some firms indicate that the technique could be pretty handy. A test run by the economists Charles Plott and Kay-Yut Chen at Hewlett-Packard gave a small group of employees the chance to buy and sell shares in a market meant to forecast long-term printer sales at the company. The market's predictions outperformed the company's official forecasts 75 percent of the time. At Innocentive, a spinoff of the drug company Eli Lilly, employees used an experimental market to predict which three of six new drugs were likely to be approved by the FDA -- and they picked the correct three, an extremely valuable trick in the high-stakes pharmaceutical world. Can a CEO who makes a hundred million dollars each year do that?
Surowiecki also has kind words for the Policy Analysis Market, the Defense Department project designed to use a market to predict threats in the Middle East. The project, which was roundly skewered as "offensive," "harebrained" and "morally wrong" when it became public in the summer of 2003, was quickly abandoned by the government.
Surowiecki believes it was a bad idea to dump the DoD project. These days, the evidence abounds that the U.S. intelligence services have little insight into the Middle East. The problem is not just that we don't know what's going on in the region, but that what we think we know is sometimes way off. What the American public thought it knew about Saddam Hussein's weapons capabilities, for instance, seems now to have been the product not of actual intelligence-gathering efforts but of spin, hype, bureaucratic infighting, a less-than-aggressive press corps, and bald-faced lies. A decision market might have mitigated such problems.
"Since traders in a market have no incentive other than making the right predictions -- that is, there are no bureaucratic or political factors influencing their decisions," Surowiecki writes, "they are more likely to offer honest evaluations instead of tailoring their opinions to fit the political climate or satisfy institutional demands." In the case of Iraq, those who knew about Saddam's actual weapons capabilities, or who were at the very least skeptical of the Bush administration's threats, might have anonymously bet against the government's claims in the PAM market. The public would then have had reason to be skeptical of the White House's claims.
It's true, of course, that markets can also get things wrong, and Surowiecki doesn't ignore this. Indeed, one of the most engaging sections of his book is an examination of what caused the breakdown of the stock market during the technology bubble. Surowiecki points to a number of reasons the market performed so badly during the boom, but his main point is that at some point in the late 1990s, players in the market lost their independence from one another. Fed by a constant diet of stock-boosting news on CNBC, investors became single-minded, as each person thoughtlessly did what he believed everyone else was doing -- buying tech stocks. The result was a seemingly endless upward spiral in stock prices, followed, suddenly, by a seemingly endless downward spiral.
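The failure mode Surowiecki describes can be sketched the same way as the jelly-bean truism, with one change: give every trader a shared bias on top of their private errors (all numbers below are invented for illustration). Independent errors still cancel; the common bias -- everyone absorbing the same bullish news -- does not, so the crowd's average stays wrong.

```python
import random

random.seed(0)

TRUE_VALUE = 100.0   # hypothetical "fair" price of a stock
N_TRADERS = 1000
SHARED_HYPE = 40.0   # common bias everyone absorbs from the same news diet

# Independent crowd: private errors only. They cancel in the average.
independent = [TRUE_VALUE + random.gauss(0, 20) for _ in range(N_TRADERS)]

# Herded crowd: the same private errors, plus one bias shared by all.
# Averaging cancels the private errors but leaves the shared bias intact.
herded = [TRUE_VALUE + SHARED_HYPE + random.gauss(0, 20) for _ in range(N_TRADERS)]

avg_independent = sum(independent) / N_TRADERS
avg_herded = sum(herded) / N_TRADERS

print(f"independent crowd's average: {avg_independent:.1f}")
print(f"herded crowd's average:      {avg_herded:.1f}")
```

The independent crowd's average sits near the true value; the herded crowd's sits near true value plus hype -- which is why a market full of CNBC-fed, like-minded buyers stops being wise.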
Alas, there are few real ways to avoid bubbles in financial markets, Surowiecki explains. If you're stuck in a market in which prices keep rising for no reason, what are you going to do -- get out? If you did that, you'd lose the chance to make a killing. But if you stayed in the market, you'd risk being caught up in the crash. All you can do is hold your breath and hope for the best.
Yet despite the spasms of irrationality into which markets sometimes sink, we'd probably be better off casting our lot with groups than with individuals. Yes, stock markets sometimes do dumb things, but most of the time they're pretty smart. Individuals, on the other hand, sometimes do smart things. But it's what they do most of the time that should worry us.