Why can't we concentrate?

Twitter and e-mail aren't making us stupider, but they are making us more distracted. A new book explains why learning to focus is the key to living better.

Published April 29, 2009 10:55AM (EDT)

Here's a fail-safe topic when making conversation with everyone from cab drivers to grad students to cousins in the construction trade: Mention the fact that you're finding it harder and harder to concentrate lately. The complaint appears to be universal, yet everyone blames it on some personal factor: having a baby, starting a new job, turning 50, having to use a BlackBerry for work, getting on Facebook, and so on. Even more pervasive than Betty Friedan's famous "problem that has no name," this creeping distractibility and the technology that presumably causes it have inspired such cris de coeur as Nicholas Carr's much-discussed "Is Google Making Us Stupid?" essay for the Atlantic Monthly and diatribes like "The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future," a book published last year by Mark Bauerlein.

You don't have to agree that "we" are getting stupider, or that today's youth are going to hell in a handbasket (by gum!), to mourn the withering away of the ability to think about one thing for a prolonged period of time. Carr (whose argument was grievously mislabeled by the Atlantic's headline writers as a salvo against the ubiquitous search engine) reported feeling the change "most strongly" while he was reading. "Immersing myself in a book or a lengthy article used to be easy," he wrote. "Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text." For my own part, I now find it challenging to sit still on my sofa through the length of a feature film. The urge to, for example, jump up and check the IMDb filmography of a supporting actor is well-nigh irresistible, and once I'm at the computer, why not check e-mail? Most of the time, I'll wind up pausing the DVD player before the end of the movie and telling myself I'll watch the rest tomorrow.

This is no mere Luddite's lament. A couple of years ago a craze for "full screen mode" writing software like WriteSpace and WriteRoom swept through the Web's various digital communities devoted to productivity tips and tricks favored by technology workers. These applications reduce a computer's display to a simple black screen with a column of text running down the middle. My colleague Rebecca Traister wrote recently of her love affair with Freedom, a program that locks her computer off the Internet for a preset block of time so she can "get some goddamn work done," a desperate measure she characterized as a bid to "protect me from myself."

What this commonplace crisis comes down to is our inability to control our own minds. You may, like Traister, need to buckle down and write, or you may, like Carr, pine for the deeply engaged style of reading we bring to books and New Yorker profiles. You may, like me, realize that your evening will be more enjoyable and more enriching if you commit to the full 110 minutes of "Children of Men" instead of obsessively checking out your friends' Facebook updates or surveying borderline illiterate reader reviews -- or, for that matter, browsing through the "Seinfeld" reruns in your TiVo Suggestions queue. In many cases, the thing we wish we would do is not only more interesting but ultimately more fun than the things we do instead, and yet it seems to require a Herculean effort to make ourselves do it.

What to do? For most people, bailing on the Web or e-mail or cellphones isn't even feasible, let alone practical or ultimately desirable. (I shudder at the thought of living without my beloved TiVo.) Besides, modern life really isn't making us stupider: IQ tests have to be regularly updated to make them harder; otherwise the average score would have climbed 3 percent per decade since the early 1930s. (The average score is supposed to remain at a constant 100 points.) And IQ measures problem-solving ability rather than sheer data retained, a stockpile that has grown even faster over the same interval. Each of us knows many more people and facts than our counterparts of 100 years ago; it's just that the importance of those people and facts remains somewhat uncertain. Knowing a little bit about Lindsay Lohan and Simon Cowell (two people I recognize despite having no active interest in either one) can't really be equated with knowing a bit about Marie Curie or Lord Mountbatten. We have more information, but it isn't necessarily more valuable information.

Winifred Gallagher's new book, "Rapt: Attention and the Focused Life," argues that it's high time we took more deliberate control of this stuff. "The skillful management of attention," she writes, "is the sine qua non of the good life and the key to improving virtually every aspect of your experience, from mood to productivity to relationships." Because we can attend to only a tiny portion of the sensory cacophony around us, the elements we choose to focus on -- the very stuff of our reality -- amount to a creation, adeptly edited, a workable but highly selective version of the world and our own existence. Your very self, "stored in your memory," is the product of what you pay attention to, since you can't remember what you never noticed to begin with.

Gallagher came to appreciate this while fighting "a particularly nasty, fairly advanced" form of cancer. Determined not to let her illness "monopolize" her attention, she made a conscious choice to look "toward whatever seemed meaningful, productive, or energizing and away from the destructive, or dispiriting." Her experience of the world was transformed. This revelation naturally led her to wonder why she'd had to exert herself to do what made her feel better. Why didn't she turn to it as naturally as a thirsty woman turns to a glass of ice water? Why do we reflexively award more attention to negative or toxic phenomena like disasters and insults, while neglecting to credit small pleasures and compliments with the significance they deserve?

A good part of "Rapt" explores this puzzle, identifying both biological and cultural causes for our sometimes self-defeating habits. The book belongs to a school of nonfiction -- Malcolm Gladwell's "The Tipping Point" is the model -- that aims to walk the line between social science and self-help. Despite the title, disappointingly little of "Rapt" is concerned with the state Gallagher describes as "completely absorbed, engrossed, fascinated, perhaps even 'carried away,'" that is, precisely the experience Carr thinks is becoming ever more inaccessible. Ironically, for a book about focusing, "Rapt" can be frustratingly scattered, self-contradictory and platitudinous; do we really need more hand-wringing about families who don't have dinner together or reheated summaries of scientific studies demonstrating the power of positive thinking?

Still, Gallagher deserves credit for calling our attention to attention itself, specifically to the way it works neurologically. In essence, attention is the faculty by which the mind selects and then zeroes in on the most "salient" aspect of any situation. The problem is that the brain is not a unified whole, but a collection of "systems" that often come into conflict with each other. When that happens, the more primitive, stimulus-driven, unconscious systems (the "reactive" and "behavioral" components of our brains) will usually override the consciously controlled "reflective" mind.

There are excellent reasons for this. In the conditions under which humanity evolved, threats had the greatest salience; individuals who spotted and eluded dangers before they went chasing after rewards tended to live long enough to pass on their traits to future generations. As a result, we inherited from our distant ancestors the tendency to pay greater attention to the unpleasant and troublesome elements of our surroundings, even when those elements have evolved from real menaces, like a crocodile in the reeds, to largely insignificant ones like nasty anonymous postings in a Web discussion.

Likewise, our interest is grabbed by movement, bright colors, loud noises and novelty -- all qualities associated with potential meals or threats in a natural setting; we are hard-wired to like the shiny. The attention we bring to bear on less exciting objects and activities, where the payoff may be long-term rather than immediate, requires a conscious choice. This is the kind of attention that opens into complex, nuanced and creative thought, but it tends to get swamped by the more urgent demands of the reactive system unless we exert ourselves to overcome our instincts. The reflective system flourishes best when the environment is relatively free of bells and whistles screaming "Delicious fruit up here!" or "Large animal approaching over there!"

The conditions conducive to deep thought have become increasingly rare in our highly mediated lives. When the physical limitations on print and broadcast media kept the number of competitors for our attention relatively few, some candidates could afford to appeal to our reflective side. Now we live in an attention economy, where the most in-demand commodity is "eyeballs." As more options crowd the menu, direct appeals to the reactive mind in the form of bright colors or allusions to sex, aggression, tasty foods and so on, take over.

The machinations of late capitalism aren't the only things driving the incessant pinging on our reactive attention systems, either. If you're like most people, you will keep checking for new e-mail despite the unresolved messages that await in your inbox. The already-read messages may even deal with urgent matters like an impatient question from your boss or appealing subjects like possible vacation rentals, yet there's something lackluster about them compared to what might be wending its way to you over the Internet right this minute. Despite the fact that the incoming messages are probably not any more compelling than the ones you've already received, they're more attention-grabbing simply by virtue of being new. When Carr complains of the compulsion to skim and move on that possesses readers of online media, a major culprit is this instinct-driven craving for the novelties that lurk a mere mouse-click away.

The fact that sensationalism sells is hardly news, but less well-known is the fact that a constant diet of reactive-system stimuli has the potential to alter our very brains. The plasticity of the brain, scientists concur, is much greater than was once thought. New brain-imaging technologies have demonstrated that people consistently called upon to use one aspect of their mental toolbox -- the famously well-oriented London cabbies, for example -- show enhanced blood flow to and development of those parts of the brain devoted to, say, spatial cognition. In "The Overflowing Brain: Information Overload and the Limits of Working Memory," Torkel Klingberg, a Swedish professor of cognitive neuroscience, argues that careful management and training of our working memory (which deals with immediate tasks and the information pertaining to them) can increase its capacity -- that your data-crammed noggin can essentially build itself an annex.

But while it's one thing to accommodate more information, it's another to engage with it fundamentally, in a way that allows us to perceive underlying patterns and to take concepts apart so that we can put them back together in new and constructive ways. The early human who was constantly fending off leopards or plucking low-hanging mangoes never got around to figuring out how to build a house. Because leopards and mangoes were relatively few and far between, most of our ancestors found it easier to summon the kind of attention conducive to completing projects that, in the long term, make life measurably better. Ironically, while immediate threats and fleeting treats are comparatively much rarer in our complex social world, the attention system designed to deal with them has been kept on perpetual alert by both design and happenstance.

As long as we remain only dimly aware of the dueling attention systems within us, the reactive will continue to win out over the reflective. We'll focus on discussion-board trolls, dancing refinancing ads, Hollywood gossip and tweets rather than on that enlightening but lengthy article about the economy or the novel or film that has the potential to ravish our souls. Tracking the shiny is so much easier than digging for gold! Over time, our brains will adapt themselves to these activities and find it more and more difficult to switch gears. Gallagher's exhortations to scrutinize and redirect our attention could not be more timely, but actually accomplishing such a feat increasingly feels beyond our control. I can't speak personally to the effectiveness of meditation, Gallagher's recommended remedy for chronic distraction, but the power of meditative practices (religious or secular) to reshape the brain has been abundantly demonstrated.

Knee-jerk Internet boosters like to argue that the old ways of thinking are both obsolete and less wondrous than fuddy-duddies make them out to be. The next generation of citizens, they insist, will happily inhabit a culture composed of millions of small, spinning, sparkly bits and, what's more, they will thrive in it. Tell that to the kids who spent all weekend holed up with the last Harry Potter book. As exhausting as it can be to fight off the siren call of the reactive attention system, some part of us will always yearn to be immersed, captivated and entranced by just one thing, to the point that the world and all its dancing diversions grows dim, fades and falls away.


By Laura Miller

Laura Miller is the author of "The Magician's Book: A Skeptic's Adventures in Narnia."
