Trapped in the grid

Like electricity, the Web is everywhere and changes everything, says Nicholas Carr. But the one thing it can't deliver is freedom.

Published January 24, 2008 12:00PM (EST)

I greeted the millennium as a new parent with a decidedly 21st-century problem. My household had two working parents and a serious need for childcare help, which, thankfully, we could afford. I had barely adjusted to my new identity as a sleep-deprived father of twins, but now the state of California was casting me in an even more surreal role: I'd become a domestic employer, which meant I had to master a thick book of tax regulations, withhold taxes, and file them with the state -- no picnic, even for the well-rested.

As I hunted for help online, I stumbled on a Web site that offered a one-stop shop for people in my quandary. You paid a small fee and entered a little data and they did the rest: Payroll. Tax calculations. Government forms.

That site saved my sanity. And when my kids graduated to preschool and I gratefully retired my domestic-employer status, I wasn't left with a useless investment in an accounting software package; I just canceled.

My experience in 2000 gave me a glimpse of the sea change that was about to engulf the software industry. This change -- from personally or institutionally owned and operated software packages to software services delivered over the Web -- has begun to roll through every manifestation of software in our lives, from keeping a calendar and address book to storing research to enjoying music and movies and TV shows to running small and large businesses. One name for this trend, which encompasses everything from my nanny-tax accountant to Salesforce.com to Google, is "software as a service." Another is "utility computing."

As Monopoly players know, "utility" is one of the unsexiest words in the English language. When we call something a utility, we mean that it's been demoted from an exciting innovation to a boring piece of forgettable infrastructure.

That is exactly what technology pundit Nicholas Carr told the grandees of Silicon Valley had happened to them in the article that made his name, a 2003 bombshell in the Harvard Business Review titled "IT Doesn't Matter." The computer industry's products, Carr said, had lost the ability to provide companies with a strategic advantage; they had become a mere cost of doing business. Far from being "revolutionary" or "world-changing," they were as dull and interchangeable as a utility. They just didn't matter.

Only now it seems they matter a whole lot again. In his magisterial but flawed new book, "The Big Switch: Rewiring the World, from Edison to Google," Carr predicts that "software as a service" is about to revolutionize the technology industry. Forget Microsoft (and, for that matter, the Mac); the personal-computing era is over. And the transformation of computing into a utility is not merely the next wrinkle in the continuing metamorphic story of digital technology, Carr says, but rather its teleological culmination, its final destiny. (The book's original subtitle was "Our New Digital Destiny," but wiser heads prevailed.) The Internet, having wired the world's machines together into one great "World Wide Computer," will turn computing into a utility grid as ubiquitous as the universal electrical service that arrived a century ago. And we shall be changed.

"The Big Switch" falls neatly into two halves. The first, which I can enthusiastically recommend, draws an elegant and illuminating parallel between the late-19th-century electrification of America and today's computing world. In the less persuasive latter section, Carr surveys the Internet's transformations of our world, and questions whether we should welcome them. His questions are good ones; indeed, any treatment of this subject that failed to explore them couldn't be taken seriously. But in his eagerness to discredit "techno-utopian dreamers" and expound a theory of the Internet as a technology of control, Carr fast-forwards to dour conclusions that his slender argument can't possibly support.

Just as today's corporations run their own data centers, companies in the early years of electricity typically operated their own power plants. Thomas Edison -- like a Bill Gates of the electrical revolution -- raked in profits by supplying parts and licenses. It took a business genius named Samuel Insull to figure out how to deliver electricity in a standardized way over long distances, thereby shaping the electrical system that we now take for granted. Zap! The grid fell into place fast, and hasn't changed that much since. (The wiring in my 1920s home still works fine -- an example of standards longevity that today's digital technologists can only envy.)

The early years of electrification, it turns out, featured a visionary euphoria whose overheated language uncannily resembles the over-the-top fantasies of some of the Web's early theorists. Carr's portrait of electricity's triumph is enthralling; my only regret is that he does not explore in more depth how it was that the single most formative technology of 20th-century America, the automobile, ended up running on what today we'd call an entirely different platform. How did we end up pumping gas instead of recharging batteries -- and might that suggest a more heterogeneous path forward for computing than Carr allows?

In the second half of his book, Carr pulls a big switch himself. Having persuasively sketched the shift to a software-as-a-service world, he trades in his business-historian top hat for a culture critic's beret. As we move to a Web-based system, Carr argues, the transformations will go beyond the financial press's win-loss columns; they will extend to our civic lives, our popular diversions and artistic endeavors, our private engagements and emotions, and even our inner experience of ourselves.

Of course, those changes were under way long before software applications began to migrate to the Web -- and they would arguably still be happening, albeit a bit more slowly, even if we never did plug a gazillion new devices into a "World Wide Computer," and just kept plugging away on our clunky old Web-enabled PCs instead. And so, while the first part of "The Big Switch" could only have been written today, the second part could, with the swapping out of a few company names, easily be mistaken for one of the many books of techno-cult crit that greeted the Web's first coming in the mid-'90s (like those by Sherry Turkle, Clifford Stoll, or David Shenk). In the ranks of such critics, Carr is deft but not, fundamentally, original.

"The Big Switch" provides an intelligent overview of the rise of Web 2.0-style services and reviews a canonical list of examples, from YouTube and Second Life to Craigslist and Amazon's Mechanical Turk. Then Carr runs down the list of usual-suspect dangers of the New Web Order: erosion of privacy, economic dislocation, exploitation of unpaid labor, alienation from real-world communities and disintegration of civic life. (Newspapers are in bad trouble, too.) The charges are comprehensive -- sometimes so comprehensive that they cancel each other out: On one page, for instance, Carr tells us that the Net has been "a godsend for terrorists," who can use it to hide their plots; a few pages later, he declares that the "World Wide Computer" will allow governments to perfect surveillance.

Carr is way too smart a critic not to acknowledge the multitude of on-the-other-hands to his critiques. He knows that many of us love the opportunities the Web provides for self-expression, for connection, and for access to a wider set of cultural choices than the broadcast world could ever deliver. Repeatedly, he acknowledges tensions between the Net's uses as "an instrument of bureaucratic control and of personal liberation, a conduit of communal ideals and of corporate profits." "The resolution of the tensions, for good or ill," he writes, "will determine how the grid's consequences play out over the coming years and decades."

Carr is right: These tensions lie at the Internet's heart. It is a dynamic system in which every move toward centralization, and each step toward individual empowerment, evokes an equal and opposite reaction. But after all his unresolved tensions, his balanced clauses and hedged rhetoric, Carr finally comes clean that he doesn't think the conflict between control and liberation is an even fight at all. "Computer systems," he writes, "are not at their core technologies of emancipation. They are technologies of control. They were designed as tools for monitoring and influencing human behavior, for controlling what people do and how they do it." As a result, he says, every apparently liberating moment that digital technology has provided -- from the excitement of the early PC era to the heady thrill of the nascent Web -- has proved "short-lived." What the chip giveth, the Man soon taketh away.

"The Big Switch," then, is offering a different, and in some ways more sophisticated, analysis than the alarm sounded by writers like Andrew Keen and Lee Siegel, who offer variations on "The barbarians are at the gates!" Carr's power-structural analysis warns us instead that every time the provinces get restless, the emperor will send out his legions -- and resistance is futile.

For Carr, the genesis of information technology as a tool for bureaucratic management, all the way back to the days of Hollerith punch cards, limits its potential for good. Initial conditions determine the story's outcome. But by this reasoning, counting and writing ought to have been similarly hobbled; after all, they were invented so the ancients could manage their grain stores. Yet somehow humanity managed to turn such tools to more diverse ends -- to work out particle physics and write "Hamlet" and "Paradise Lost."

I am not so quick as Carr to assume that the end-state of digital technology can be so easily predicted from its origins. In the face of the accelerating changes the Net promises, ambivalence is a natural and appropriate response -- one neatly encapsulated in the punning title of Michael Wesch's popular video portrait of the Web 2.0 world, "The Machine is Us/ing Us." Whether one chooses to view the digital glass as half-full or -empty is, finally, a matter of temperament.

Carr's brain entertains both sides of the issues, but at heart he's plainly a glass-half-empty type. That predilection, I think, limits "The Big Switch's" power: Carr writes with a grim assurance of a hell-in-a-handbasket outcome that his own logic does not support. Yet the book's final pages -- a rumination on how quickly we forget "how things were" once a new technological regime arrives -- sing such a lovely elegy that I almost forgot all the arguments I'd had with the preceding chapters.

Perhaps the initial conditions of Carr's own career, as an author of business-school studies, are what hem in his judgments. His gimlet eye lingers in the Web's corporate precincts, and has little time to spare for the teeming, unpredictable, often frivolous, occasionally beautiful Web that's being built, in tiny increments, by millions of individuals. Carr is an adept blogger -- his Rough Type posts display a sense of fun that only infrequently leavens "The Big Switch" -- and he seems to understand why people like to contribute to Web-based projects: "...because they enjoy it. It gives them satisfaction." But then he goes on to describe the bottom-up world of creativity on the Net as merely "new forms of the pastimes and charitable work that people have always engaged in outside their paid jobs."

To Carr, each choice we make, every day, to write on our blogs or post photos and videos or share reviews of products or contribute to Wikipedia is ultimately the act of a dupe. However good our self-expression makes us feel, we're really just creating value to be skimmed by Silicon Valley's new plutocrats.

That's indeed part of the story -- but if it were all of it, then surely the whole "user-generated" Web would disintegrate the moment we saw through the scam. Carr doesn't predict that, and neither does anyone else. In fact, economic analysis alone can't explain what's happening on the Web; neither can technological determinism.

I can't fully explain it myself. But I think the piece of the picture that Carr leaves out may be the one that longtime Internet observer Clay Shirky was talking about last year, in a speech about the fruits of open collaboration online. "In the past," Shirky said, "we could do little things for love. But big things required money. Now, we can do big things for love."


By Scott Rosenberg

Salon co-founder Scott Rosenberg is director of MediaBugs.org. He is the author of "Say Everything" and "Dreaming in Code" and blogs at Wordyard.com.