Is the Web helping us evolve?

The truth lies somewhere between "Google is making us stupid" and "the Internet will liberate humanity."

Published December 23, 2008 11:50AM (EST)

Some of today's most vaunted tech philosophers are embroiled in a ferocious argument. On one side are those who think the Internet will liberate humanity, in a virtuous cycle of e-volving creativity that may culminate in new and higher forms of citizenship. Meanwhile, their diametrically gloomy critics see a kind of devolution taking hold, as millions are sucked into spirals of distraction, shallowness and homogeneity, gradually surrendering what little claim we had to the term "civilization."

Call it cyber-transcendentalists versus techno-grouches.

Both sides point to copious evidence, as Nicholas Carr recently did in an Atlantic cover story titled "Is Google Making Us Stupid?" In making the pessimists' case, Carr offered up studies showing that the new generation of multitaskers aren't nearly as good at dividing their attention as they think they are. According to Carr, focus, concentration and factual knowledge are much too beneficial to toss aside in an avid pursuit of omni-awareness.

A related and even more worrisome trend is the decline of rigorously vetted expert knowledge. You wouldn't expect this to be a problem in an era when humanity knows more -- and shares information more openly -- with every passing year, month and day. Wikipedia is a compendium vastly larger than all previous encyclopedias combined, drawing millions to contribute from their own areas of micro-expertise. But the very freedom that makes the Internet so attractive also undermines the influence of gatekeepers who used to sift and extol some things over others, helping people to pick gold from dross.

In the past, their lists and guides ranged from the "Seven Liberal Arts" of Martianus Capella to "The Great Books of the Western World," from Emily Post's "Etiquette" to the Boy Scout Manual, from compulsory curricula to expert scientific testimony. Together, this shared canon gave civilized people common reference points. Only now, anyone can post a list -- or a hundred -- on Facebook. Prioritization is personal, and facts are deemed a matter of opinion.

Carr and others worry how 6 billion ships will navigate when they can no longer even agree upon a north star.

Of course, an impulse toward nostalgia has been rife in every era. When have grandparents not proclaimed that people were better, and the grass much greener, back in their day? Even the grouches' ultimate dire consequence has remained the same: the end of the world. Jeremiahs of past eras envisioned it arriving as divine retribution for fallen grace, while today's predict a doom wrought by human hands -- propelled by intemperate, reckless or ill-disciplined minds. The difference, from a certain angle, is small.

Take the dour mutterings of another grumbler, Internet entrepreneur Mark Pesce, whose dark rumination at last year's Personal Democracy Forum anticipates a dismal near-future commonwealth. One wherein expertise is lost and democracy becomes a tyranny of lobotomized imitation and short-tempered reflex, as viral YouTube moments spread everywhere instantaneously, getting everybody laughing or nodding or seething to the same memes -- an extreme resonance of reciprocal mimicry or hyper-mimesis. And everybody hyper-empowered to react impulsively at almost the speed of thought.

"All of our mass social institutions, developed at the start of the liberal era, are backed up against the same buzz saw," Pesce said. "Politics, as the most encompassing of our mass institutions, now balances on a knife edge between a past which no longer works and a future of chaos."

From there, it seems only a small step is needed to incite the sort of riled-up rabble that used to burst forth in every small town; only, future flash mobs will encompass the globe. Pesce's scenario is starkly similar to dystopias that science fiction authors Frederik Pohl and Cyril Kornbluth portrayed, back in the 1950s, as in "The Marching Morons," or Ray Bradbury in "Fahrenheit 451," with civilization homogenizing into a bland paste of imitation and dullard sameness, punctuated by intervals of mass hysteria.

Indeed, it is this very sameness -- the "flat world" celebrated by pundit Thomas Friedman -- that could demolish global peace, rather than save it. Arguing that an insufficiency of variety will eliminate our ability to inventively solve problems, Pesce dramatically extrapolates: "Fasten your seatbelts and prepare for a rapid descent into the bellum omnium contra omnes, Thomas Hobbes' war of all against all ... Hyperconnectivity begets hypermimesis begets hyper-empowerment. After the arms race comes the war."

Wow. Isn't that cheery? Well, with Michael Crichton no longer around to propound that there "are things mankind was never meant to know," perhaps Carr and Pesce are auditioning to fill in, offering the next vivid anthem for a rising renunciation movement -- the nostalgic murmur that technology and "progress" may have already gone too far.

Responding to all of this -- on the Encyclopaedia Britannica Blog -- Clay Shirky, the technology forecaster and author of "Here Comes Everybody," presents an equally impressive array of evidence showing that the ability of individuals to autonomously scan, correlate and creatively utilize vast amounts of information is rising faster, almost daily. In the human experience, never before have so many been able to perceive, explore, compare, analyze and argue over evidence that questions rigid assumptions. How can this not lead to insights and exciting new breakthroughs at an accelerating pace?

Perhaps even fast enough to get us ahead of all our modern perplexities and problems.

Nor is this refrain new. From Jefferson and Franklin to Teilhard de Chardin and J.D. Bernal, the tech-happy zealots of progress have proclaimed a rebellious faith in human self-improvement via accumulating wisdom and ever-improving methodology.

Even some artists and writers began siding with the future, as when Bruno Bettelheim finally admitted that it was OK to read fairy tales, or when H.G. Wells stood up to Henry James over whether stories can involve social and scientific change. Confronting stodgy culture mavens, modernists and science fiction writers spurned the classical notion of "eternal verities" and the assumption that all generations will repeat the same stupidities, proclaiming instead that children can make new and different mistakes! Or even (sometimes) learn from the blunders of their parents.

Before the Internet was more than an experimental glimmer, Marshall McLuhan fizzed: "But all the conservatism in the world does not offer even token resistance to the ecological sweep of the new electric media."

This surge of tech-zealotry is taken further -- to a degree that seems almost fetishistic -- by fans of a looming "positive singularity," in which a humanity that is aided by loyal and super-smart artificial intelligence (AI) will soon transcend all known limitations, toppling all barriers before an unstoppable can-do spirit. All we need is to keep exponentiating knowledge and computing power at the rate we have, and soon those AI assistants will cure all ailments, deliver clean energy, and solve the riddle of our minds. Singularitarians such as Ray Kurzweil and John Smart foresee a rosy Aquarian age just ahead, one of accelerating openness and proliferating connectedness, unleashing human potential in something radiantly self-propelled and exponentially cornucopian.

Oy! Teilhard's bodhisattva has returned! And part of me wants to believe the transcendentalists, who think we'll all have godlike powers just in time to end poverty, save the planet, and give the baby boomers eternal life. Hal-AI-lujah!

But, alas, even the coiner of the term "technological singularity" -- author Vernor Vinge -- will tell you that it ain't necessarily so. If these wonders are going to come to pass, it won't happen in a way that's smooth, organic, automatic or pain-free. As Edward Tenner pointed out in "Why Things Bite Back," there are always surprising, unintended consequences. The ultimate, pragmatic purpose of free debate is to find most of these error modes before well-made plans go awry. A practical aim that is forgotten by those who worship self-expression entirely for its own sake.

Anyway, emergent properties help those who help themselves. Very few good things ever happened as gifts of circumstance, without iteration and hard work. Above all, we'll need to improve the tools of the Enlightenment, at an ever-increasing pace, so that Howard Rheingold's vaunted citizen-centered smart mobs really are more smart than mobs!

Too bad for us, then. Because, looking at today's lobotomizing social nets, avatar worlds and dismal "collaborationware," any rational oddsmaker would have to favor the grouches by a 10-point spread.

So, is the Google era empowering us to be better, smarter, more agile thinkers? Or devolving us into distracted, manic scatterbrains? Is technology-improved discourse going to turn us all into avid, participatory problem solvers? Or will the Web's centrifugal effects spin us all into little islands of shared conviction -- midget Nuremberg rallies -- where facts become irrelevant and any opinion can be a memic god?

Alas, both sides are right. And both are missing key points. If I must simplistically choose between Teilhardists and renunciators, my sentiments go with the optimists who helped bring us to this party we're all enjoying -- the worldwide culture and Internet that lets me share these thoughts with you and that empowers the grouches to be heard. Luddism has always been a lame and deep-down hypocritical option.

But no, this latest tiff seems to boil down to another of the infamously oversimplifying dichotomies that author Robert Wright dismantles in his important book, "Nonzero." No. 1 on our agenda of "ways I might grow up this year" ought to be giving up the foul habit of believing in such trade-offs. Like the hoary and obsolete "left-right political axis," or the either-or choice we are too frequently offered between safety and freedom.

But let's illustrate just how far off base both the mystics and the curmudgeons are, by offering a step-back perspective.

Only a generation ago, intellectuals wrung their hands over what then seemed inevitable: that the rapidly increasing pace of discovery and knowledge accumulation would force individuals to specialize more and more.

It may be hard to convey just how seriously this trend was taken 30 or so years ago. Sober academics foresaw an era when students might study half a lifetime, just to begin researching some subfield of a subfield, excruciatingly narrow and dauntingly deep, never knowing if they were duplicating work done by others, never cross-fertilizing or sharing with other domains. It reflected the one monotonic trend of the 20th century -- a professionalization of all things.

It's funny, though. You just don't hear much about fear of overspecialization anymore. Yet has the tsunami of new knowledge ceased? If anything, it has speeded up. Then why did that worry go away?

As it turned out, several counter-trends (some of them having nothing to do with the Internet) seem to have transformed the intellectual landscape. Today, most scientists seem far more eclectic, agile and cross-disciplinary than ever. They seek insights and collaboration far afield from their specialties. Institutions like the Sixth College at the University of California at San Diego deliberately blend the arts and sciences, belying C.P. Snow's "two cultures forever schism'd." Moreover, the spread of avocations and ancillary expertise suggests that the professionalization trend has finally met its match in a looming age of amateurs.

If anything, our worry has mutated. Where we once fretted about specialists "knowing more and more about less and less," today's info glut has had the inverse effect -- spreading people's attention so widely that they know just a little about a vast range of topics. No longer do our pessimists fear "narrow-mindedness" as much as "shallow-mindedness."

Indeed, doesn't Carr have a point, viewing today's blogosphere as superficial, facile and often frivolous? When pundit mistress Arianna Huffington crows about there being 50,000 new blogs established every day, calling them a "first draft of history," is that flattering to history?

Don't get me wrong -- I want to believe this story. My own metaphor compares today's frenzy of participatory self-expression to a body's immune system that might sniff out every dark abuse or crime or unexamined mistake. I still believe the age of amateurs has that potential. But, in darker moments, I wonder. If our bodies were this inefficient -- with such an astronomical ratio of silliness to quality -- we'd explode from all the excess white blood cells before ever benefiting from the few that usefully attack an error or disease.

Above all, can you name a problem that all this "discourse" has profoundly or permanently solved -- in a world where problems proliferate and accumulate at a record pace? No, let's make the challenge simpler: Can Shirky or Huffington point to even one stupidity that has been decisively disproved online? Ever?

Sure, there are good things. Professional journalism has added many innovative cadres and layers that show more agility and zest than the old newspapers and broadcasting networks, often aided or driven by a stratum of actinically focused semipro bloggers. I am well aware of -- and grateful to -- the many excellent political and historical fact-checking sites that debunk all kinds of mystical or paranoid nuttiness. But still, the nutty things never go away, do they? Debunking only serves to damp each fever down a little, for a while, but the infections remain. Every last one of them.

What the blogosphere and Facebook cosmos do best is to engender the raw material of productive discourse -- opinion. Massive, pyroclastic flows of opinion. (Including this one.) They can be creative and entrancing. But they are only one-half of a truly creative process.

The processes that do work -- markets, democracy, science -- foster not only the introduction of fresh variety and new things but also a winnowing of those that don't. Commerce selects for better products and companies. In elections, the truth eventually (if tardily) expels bad leaders. Science corrects or abandons failed theories. It's messy and flawed, but we owe almost everything we have to the way these "accountability arenas" imitate the ultimate creative process, evolution, by not only engendering creativity but also culling unsuccessful mutations. To make room for more.
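Stripped of metaphor, that two-stage rhythm is the oldest algorithm there is: generate variety, then cull whatever fails a test. As a purely illustrative toy -- every function name, number and "fitness" rule below is invented for demonstration, a stand-in for whatever a real accountability arena actually measures -- a minimal mutate-and-select loop looks like this; delete the culling step and the pool merely accumulates noise.

```python
import random

# Toy sketch of the two-stage rhythm: engender variety, then winnow.
# The "fitness" rule is an arbitrary, invented stand-in for whatever an
# accountability arena (market, election, peer review) actually tests.

def fitness(idea: float) -> float:
    # Hypothetical scoring: ideas closer to an arbitrary target fare better.
    return -abs(idea - 42.0)

def generate_variants(parents, n_variants=20):
    # Stage one: creativity -- mutate existing ideas into new candidates.
    return [random.choice(parents) + random.gauss(0, 5) for _ in range(n_variants)]

def cull(candidates, keep=5):
    # Stage two: selection -- keep only the candidates that score best.
    return sorted(candidates, key=fitness, reverse=True)[:keep]

def evolve(generations=50):
    pool = [random.uniform(0, 100) for _ in range(5)]
    for _ in range(generations):
        pool = cull(pool + generate_variants(pool))  # both stages, every cycle
    return pool

if __name__ == "__main__":
    print("Best surviving 'idea':", round(evolve()[0], 2))
```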

Today's Web and blogosphere have just one part of this two-stage rhythm. Sure, bullshit makes great fertilizer. But (mixing metaphors a bit) shouldn't there be ways to let pearls rise and noxious stuff go away, like phlogiston and Baal worship? Beyond imagination and creativity and opinion, we also need a dance of Shiva, destroying the insipid, vicious and untrue.

To the nostalgists, there is only one way of accomplishing this -- the 4,000-year-old prescription of hierarchy. But all those censors, priests and credentialed arbiters of taste had their chance, and all our instincts, as children of the Enlightenment, rebel against ever letting them get a grip on culture once again! Authoritarian gatekeepers would only wind up stifling what makes the Net special.

But there is another option. A market could replicate the creative (and creatively destructive) power of evolution in the realm of good and bad ideas, just as our older markets sift and cull myriad goods and services. That is, if the Web offered tools of critical appraisal and discourse. Tools up to the task.

Tentative efforts have been made to provide this second half of the cycle, where amateur participants might manifest, en masse, the kind of selective judgment that elite gatekeepers and list makers used to provide. Wikipedia makes a real effort. A few debate and disputation sites have tried to foster formal set-tos between groups opposing each other over issues like gun control, experimenting with procedures that might turn debate from a shouting and preening match into a relentless and meticulous exploration of what's true and what's not. But these crude efforts have been given just a scintilla of the attention and investment that go to the fashionable, honey-pot concepts: social networks and avatar spaces, "emotional" self-expression, photo/art/film sharing, and ever more ways to gush opinions.

Am I sounding like one of the curmudgeons? Look, I like all that stuff. Heck, I can self-express with the best of 'em. It's how I make my living! But all by itself, it is never, ever going to bring us to a singularity -- or even a culture of relatively effective problem solvers.

Note that my complaint isn't the same as Carr's about our fellow citizens becoming "nekulturny" and losing the ability to read (as in the Walter Tevis novel "Mockingbird"). Sure, I wish (for example) that some of the attention and money devoted to shallow movie sci-fi remakes would turn to the higher form of science fiction, with its nuanced Gedanken experiments about speculative change. But we're in no danger of losing the best mental skills and tools and memes of the past. It's a laughable fret.

What we need to remember is that there is nothing unique about today's quandary. Ever since the arrival of glass lenses and movable type, the amount that each person can see and know has multiplied, with new tools ranging from newspapers and lithographs to steamships and telegraphs, to radio and so on. And every time, conservative nostalgists claimed that normal people could not adapt, that such godlike powers should be reserved to an elite, or perhaps renounced.

Meanwhile, enthusiasts zealously greeted every memory and vision prosthetic -- from the printing press to lending libraries to television -- with hosannas, forecasting an apotheosis of reason and light. In 1894, philanthropist John Jacob Astor wrote a bestselling novel about the year 2000, a future transformed by science, technology, enterprise and human goodwill.

Of course, life can be ironic. Astor died with a famed flourish of noblesse oblige aboard the sinking Titanic -- the first of many garish calamities that began quenching this naive zeal for progress. For a while. And so it has gone, a bipolar see-saw between optimists and pessimists.

In reality, the vision and memory prosthetics brought on consequences that were always far more complicated than either set of idealists expected. Out of all this ruction, just one thing made it possible for us to advance, ensuring that the net effects would be positive. That one thing was the pragmatic mindset of the Enlightenment.

So let's conclude by returning to a core point of the cyber-grumblers. Yes, for sure, some millions, perhaps billions, will become couch or Net potatoes. Unimaginative, fad-following and imitative. But there is a simple answer.

So what? Those people will matter as little tomorrow as couch potatoes who stay glued to television matter today.

Meanwhile, however, a large minority -- the "creative minority" that Toynbee called essential for any civilization's success -- will continue to feel repelled by homogeneity and sameness. They'll seek out the unusual and surprising. Centrifugally driven by a need to be different, they'll nurture hobbies that turn into avocations that transform into niches of profound expertise.

Already we are in an era when no worthwhile skill is ever lost, if it can draw the eye of some small corps of amateurs. Today there are more expert flint-knappers than in the Paleolithic. More sword makers than during the Middle Ages. Vastly more telescope surface area in hobbyists' hands than in all the instruments owned by governments and universities, put together. Following the DIY banner of Make magazine, networks of neighbors have started setting up chemical sensors that will weave into hyper-environmental webs. Can you look at all this and see the same species of thoughtless, imitative monkeys that Mark Pesce does?

Well, we are varied. We contain multitudes. And that is the point! Those who relish images of either gloom or apotheosis forget how we got here, through a glorious, wonderful, messy mix of evolution and hard work and brilliance, standing on the shoulders of those who sweated earlier progress. Now we sprint toward success or collapse. If we fail, it will be because we just barely missed a once-in-a-species (perhaps even once-on-a-planet) chance to get it right.

I don't plan to let that happen. Do you?

No, what's needed is not the blithe enthusiasm preached by Ray Kurzweil and Clay Shirky. Nor Nicholas Carr's dyspeptic homesickness. What is called for is a clear-eyed, practical look at what's missing from today's Web. Tools that might help turn quasar levels of gushing opinion into something like discourse, so that several billion people can do more than just express a myriad of rumors and shallow impulses -- so they can test, compare and actually reach some conclusions now and then.

But what matters even more is to step back from yet another tiresome dichotomy, between fizzy enthusiasm and testy nostalgia. Earlier phases of the great Enlightenment experiment managed to do this by taking a wider perspective. By taking nothing for granted.

If we prove incapable of doing this, then maybe those who worry about the latest generation's devolution are right after all.


By David Brin

David Brin is an astrophysicist whose international best-selling novels include "Earth" and, recently, "Existence." "The Postman" was filmed in 1997. His nonfiction book about the information age, "The Transparent Society," won the Freedom of Speech Award of the American Library Association. (http://www.davidbrin.com)
