Two years ago, Nicholas Carr, a technology writer, published an essay titled “Is Google Making Us Stupid?” in the Atlantic Monthly magazine. Despite being saddled with a grabby but not very accurate headline (the defendant was the Internet itself, not just its most popular search engine), the piece proved to be one of those rare texts that condense and articulate a fog of seemingly idiosyncratic worries into an urgently discussed issue in contemporary life.
It turned out that a whole lot of people were just then realizing that, like Carr, they had lost their ability to fully concentrate on long, thoughtful written works. “I get fidgety, lose the thread, begin looking for something else to do,” Carr wrote. “I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.” At first assuming that his fractured mental state was the result of “middle-age mind rot,” Carr eventually concluded that his heavy Internet usage was to blame. His article about this realization instantly rose to the top of the “most-read” list on the Atlantic’s website and stayed there for months.
“The Shallows: What the Internet Is Doing to Our Brains” is Carr’s new, book-length version of the Atlantic piece. It expands on the points he made in 2008, but it addresses some of the responses he got, as well. In addition to the usual moronic japes (“This article is too long!” — can anyone really be witless enough to believe that joke is clever?), commenters, bloggers and pundits asked if Carr wasn’t confusing the medium with how people choose to use it. Still others dared to argue that the value of what Carr calls “literary reading” has been inflated.
While “The Shallows” does contain significant chunks of the Atlantic essay, this isn’t one of those all-too-familiar annoyances, the book that should have remained an article. In the brief period between the writing of the original piece and the publication of “The Shallows,” neuroscientists have performed and reviewed important studies on the effects of multitasking, hyperlinks, multimedia and other information-age innovations on human brain function, all of which add empirical heft to Carr’s arguments.
The results are not cheering, and the two chapters in which Carr details them are, to my mind, the book’s payload. This evidence — that even the microseconds of decision-making attention demanded by hyperlinks saps cognitive power from the reading process, that multiple sensory inputs severely degrade memory retention, that overloading the limited capacity of our short-term memory hampers our ability to lay down long-term memories — is enough to make you want to run right out and buy Internet-blocking software.
Above all, Carr points to the past 20-some years of neurological research indicating that the human brain is, in the words of one scientific pioneer, “massively plastic” — that is, much like our muscles, it can be substantially changed and developed by what we do with it. In a study that is quickly becoming as popular a touchstone as the Milgram experiment, the brains of London cab drivers were discovered to be much larger in the posterior hippocampus (the part of the brain devoted to spatial representations of one’s surroundings) than was the case with a control group. These masses of neurons are the physiological manifestation of “the Knowledge,” the cabbies’ legendary mastery of the city’s geography. The drivers’ anterior hippocampus, which manages certain memory tasks, is correspondingly smaller. There’s only so much space inside a skull, after all.
References to the cabbie study don’t often mention this evidence that cognitive development may be a zero-sum game. The more of your brain you allocate to browsing, skimming, surfing and the incessant, low-grade decision-making characteristic of using the Web, the more puny and flaccid become the sectors devoted to “deep” thought. Furthermore, as Carr recently explained in a talk at the Los Angeles Times Festival of Books, distractibility is part of our genetic inheritance, a survival trait in the wild: “It’s hard for us to pay attention,” he said. “It goes against the native orientation of our minds.”
Concentrated, linear thought doesn’t come naturally to us, and the Web, with its countless spinning, dancing, blinking, multicolored and goodie-filled margins, tempts us away from it. (E-mail, that constant influx of the social acknowledgment craved by our monkey brains, may pose an even more potent diversion.) “It’s possible to think deeply while surfing the Net,” Carr writes, “but that’s not the type of thinking the technology encourages or rewards.” Instead, it tends to transform us into “lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment.”
A good portion of “The Shallows” is devoted to persuading readers of the truth in Marshall McLuhan’s famous pronouncement, “The medium is the message.” This includes potted histories of such mind-altering “intellectual technologies” as the map, the clock and the printed book. To anyone moderately versed in this history, it may feel unnecessary, but the response to Carr’s original article suggests that many people remain perilously sanguine about our ability to control the technology in our lives.
“The Shallows” certainly isn’t the first examination of this subject, but it’s more lucid, concise and pertinent than similar works by Winifred Gallagher and Sven Birkerts. Carr presents far more scientific material than those writers do, and avoids both the misty Spenglerian melancholia of Birkerts and Gallagher’s muddled efforts to inject Buddhist spirituality into the debate.
What the book doesn’t do, unfortunately, is offer a sufficient rejoinder to Carr’s most puckish critics, people like Clay Shirky, who responded to one Web addict’s complaint that he “can’t read ‘War and Peace’ anymore” by proclaiming Tolstoy’s epic novel to be “too long and not so interesting.” While Shirky was no doubt playing the provocateur, he speaks for a very real anti-authoritarian cultural impulse to dismiss the judgments of experts, of history, even of a majority of other readers when they clash with the (often half-baked) evaluations of the individual. Shirky effectively asserted that, as far as Tolstoy is concerned, the emperor has no clothes — at least not by the standards of today’s multitasking digital natives. And why shouldn’t their opinions be just as valid as anyone else’s?
Carr sensibly replies that anyone who lacks the time or the cognitive “facility” to read a long novel like “War and Peace” will naturally find it too long and not so interesting. But in that case, how would we persuade such a person that it’s worth learning how? For someone like Carr, the value of the intimate, intellectually nourishing practice of “literary reading” (and by extension, literary thinking) may be self-evident. Yet he’s able to quote apparently intelligent and well-educated sources (including a Rhodes scholar who claims to never read books) who simply don’t agree.
While “The Shallows” does present a good case for the richness of organic, biological memory over the crude information storage of digital media, I would have appreciated a more concerted effort to show the advantages of linear thinking over the scattered, skittering, browsing mind-set fostered by the Internet. What will we lose if (when?) this mode of thought passes into obscurity?
Carr and I (and perhaps you) may know that reading “War and Peace” can be a far more profound experience than navigating through a galaxy of up-to-date blog postings, but to someone who can’t or won’t believe this, what else can we point to as a consequence of the withering of such a skill? What will we lose socially, politically, civilly, scientifically, psychologically, if a majority decides that the intellectual “shallows” are the proper habitat for the 21st-century mind? This needs to be spelled out because, as Carr’s critics have demonstrated, fewer and fewer people take it for granted. But with that caveat, “The Shallows” remains an essential, accessible dispatch about how we think now.
Note: The conventions of Web journalism dictate that I post links to related stories — such as Carr’s original piece, “Is Google Making Us Stupid?”; a definition of the Milgram experiment; an earlier Salon review of Winifred Gallagher’s book, “Rapt”; a Salon essay by Rebecca Traister about Internet-blocking software; Sven Birkerts’ essay “Reading in the Digital Age”; and Clay Shirky’s Tolstoy-negative response to Carr’s essay — in the text of my review. But Carr argues that such links are at the very least subconsciously distracting, causing you to think, however briefly, “Should I leave this and come back, or just go on?” So I’m including them here and encourage readers to leave a comment on which format they prefer.