"Where Good Ideas Come From": Epiphanies are overrated

Steven Johnson explains the real science of innovation -- and how some companies, like Google, are mastering it

Published October 12, 2010 11:01AM (EDT)

Where do brilliant ideas come from? When reporters ask Tim Berners-Lee about the moment he conceived of the World Wide Web, he can't answer. He hasn't forgotten; it just never happened. The idea percolated in his mind for nearly a decade, driven by a desire to organize massive amounts of data shared between connected computers. He needed the ideas of others buzzing around him, and he needed an image that would make his own idea understandable. His "stack" of information became a "mesh" before eventually becoming a "web." The cliché did not hold true: His moment of insight, as it turns out, wasn't the result of a single flashbulb going off in his brain.

In his sixth book, "Where Good Ideas Come From: The Natural History of Innovation," popular science writer Steven Johnson tries to dispel the notion of the "eureka moment." As in nature, new concepts, like the Internet, slowly grow out of old concepts. They don't spring forth from nowhere. Darwin's theory, for instance, was built on centuries of observation, including his own. During his fateful voyage on the HMS Beagle, Darwin also discovered that atolls, islands made of coral, were created through the lives and deaths of tropical marine organisms, their hardened bodies built up on one another. This key image, according to Johnson, gave Darwin a picture for his epic explanation of how life emerged. Using natural science's tendency to build upon itself, as well as examples of major innovations in science, technology and even art, Johnson makes the case that ideas beget ideas, which means would-be innovators don't need an ivory tower; they need a crowd.

We spoke with Johnson from his home in Brooklyn, N.Y., about how openly collaborative communities are out-inventing corporations, whether meetings are good or bad for new ideas, and what exactly counts as innovation.

Why don't you agree with the notion that most good ideas come from epiphanies?

What you end up seeing when you look at history is that people who have been good at pushing the boundaries of possibility, and exploring those frontiers of good ideas and innovations, have rarely done it in moments of great inspiration. They don't just have a brilliant breakthrough idea out of nowhere and leap ahead of everyone else. Their concepts take time to develop and incubate, and sit around in the back of their minds sometimes for decades. A breakthrough is cobbled together from other people's ideas and other people's technologies and other people's innovations. It's a remixed version of something. A great example from the book is Tim Berners-Lee and the Web.

But, as you point out in the book, Charles Babbage seems to have invented the computer over 100 years before the computer as we know it was possible.

Babbage was trying to invent a digital computer with Industrial Age parts. This big clanking, industrial steam-powered structure. On some level, it was right. If he had been able to build it, it might have actually worked as a programmable computer, but it just was too complicated to do that without vacuum tubes or, even better, integrated circuits and silicon chips. He also invented what is now called the calculator and it actually kind of worked. People learned and improved upon it. There's a path of mechanical calculation that runs through the 19th century where people are advancing it step-by-step. But the early computer was so far ahead of its time that it just kind of died off and many of Babbage's most crucial ideas had to be independently rediscovered 60 or 70 years later. He was so far ahead of his time he couldn't have a direct line of influence, because people couldn't figure out what to do with his idea.

Does that mean that there is no such thing as individual inspiration? Are you saying great ideas come from a "hive mind"?

No, I wrote a book celebrating the hive mind and that's my book "Emergence." And I do think that there are things that true collective decision-making is capable of doing. In that book I talk about building city neighborhoods, I talk about ant colonies. But this book is not about that. It's not that we all get together and collectively contribute tiny pieces and out of the sum of our actions a good idea is formed. What I'm saying is individuals have better ideas if they're connected to rich, diverse networks of other individuals. If you put yourself in an environment with lots of different perspectives, you yourself are going to have better, sharper, more original ideas. It's not that the network is smart. It's that you are smarter because you're connected to the network.

Well, that brings me to something in the book that bummed me out. You cite a study that observed science labs and found the breakthroughs happened more often during staff meetings than at the microscope. I hate meetings.

It's funny that you say that, because I hate meetings too. I love those stretches where I've just been a writer -- when I haven't been doing Internet start-ups -- where I pretty much eliminate meetings from my life. But there are different kinds of meetings. What the research found was that it was the weekly status update meeting that was so generative. It was when everybody would get together and tell stories about what they were working on and the problems they were having in their particular work. That's very different from the meeting where you're getting together to discuss the annual budget. When it's a sharing and improvisational meeting, where you're riffing off other people's ideas, that actually can be productive.

But a number of studies have found that meetings are a staggering waste of people's time when they're not done well. So you can keep your dislike of meetings.

Thank you. You talk about companies such as Google, which instituted time off for thinking about nothing but innovation. The response from traditional companies is going to be pretty obvious -- they can't afford it, it's a waste of resources and the change is too radical.

The problem is that most traditional companies that don't operate like Google talk a big game about innovation and making their workforce more creative, but what that ends up coming down to is a corporate retreat once a year where everybody goes and plays games meant to make everyone more creative. Then they come back and they're back in their normal structure. In that case, you might stumble across some interesting things, but six months later, when somebody has that hunch and there's no place for it to be nurtured, it disappears.

And so you need this permanent track of hunches and half-baked ideas that runs alongside the regular work-week with its immediate deadlines and fixed concepts. Innovation time off means you're always spending a little bit of your time working on something weird that's not part of the official plan, but might turn into something important if we give it enough time. I think having that background process is really important.

You argue that "non-market networks," such as universities, were responsible for more innovation in the past 200 years than even corporations. I don't think that is the assumption most people would make.

When you make arguments about how culture works, they can be hard to prove. It's not like a science experiment. Most of the book is made up of anecdotes that fit the various patterns I'm talking about. You always run the risk in this kind of book of cherry-picking your anecdotes. So in the last chapter I tried to zoom out far enough and look at 200 to 300 stories from the last 400 to 500 years to really see, from the long view, what the patterns of innovation were. And it turns out that groups of people collaborating on ideas to advance science or technology without the goal of proprietary ownership are actually a bigger driver of innovation than the private sector. In many cases these other [nonprofit-minded] groups create ideas which allow commercial development on top of them. The Internet is the classic example.

By Michael Humphrey

Michael Humphrey is a former editorial fellow at Salon. He is a contributor to Forbes.com and currently teaches at Colorado State University.

