For an author of serious nonfiction, success can lead to some surprisingly disheartening encounters with the reading public. Susan Jacoby’s 2004 history of American secularism, “Freethinkers,” was among the first in the recent wave of welcome books protesting the growing influence of religion in civic life, and universities and other institutions soon began asking her to deliver lectures. Jacoby jumped at the chance, only to find that wherever she spoke, “my audiences were composed almost entirely of people who already agreed with me.” Instead of participating in the great public debate that she envisions as central to American culture, she was preaching to the choir. What’s more, she learned, “serious conservatives report exactly the same experience on the lecture circuit.”
A couple of years later, put up in a student dormitory after giving another talk, she found her environs “eerily quiet.” Gone were the “high level of noise and laughter,” the “late-night and all-night” conversations she remembered from her own undergraduate years. Instead, everybody was “on line or in an iPod cocoon.” To top it all off, when she was invited back to her alma mater, Michigan State University, to receive an honorary award, she struck up a conversation with an honors student in the College of Communication Arts and Sciences, only to find that the young woman had never even heard of Franklin D. Roosevelt’s fireside chats. Apparently, even when students felt like talking they didn’t know enough about their own disciplines to be worth talking to.
Such are the little disillusionments that vex a public intellectual’s soul. Furthermore, as Jacoby sees it, they are telling the same story as those shocking polls that show most Americans can’t list the rights guaranteed by the First Amendment or find Iraq on a map. All of it confirms her suspicion that “the scales of American history have shifted heavily against the vibrant and varied intellectual life so essential to a functioning democracy.”
Richard Hofstadter’s 1963 classic, “Anti-Intellectualism in American Life” (a clear inspiration for this book), described anti-intellectualism as “older than our national identity” and deeply rooted in our history. Jacoby thinks the old American distrust of those who devote themselves to “ideas, reason, logic, evidence, and precise language” has been worsened by the conditions of contemporary life. There is, she writes, “a new species of semi-conscious anti-rationalism, feeding on and fed by an ignorant popular culture of video images and unremitting noise that leaves no room for contemplation or logic.” People never read books, they can’t concentrate on anything significant for more than a minute or two, and as a result they don’t really think anymore. Lulled by the “pacifier” of “infotainment,” their civic and political decisions emerge from a confused welter of laziness, reckless emotion and prejudice.
The chief manifestations of this newly virulent irrationality are the rise of fundamentalist religion and the flourishing of junk science and other forms of what Jacoby calls “junk thought.” The mentally enfeebled American public can now be easily manipulated by flimsy symbolism, whether it’s George W. Bush’s bumbling, accented speaking style (labeling him as a “regular guy” despite his highly privileged background) or the successful campaign by right-wing ideologues to smear liberals as snooty “elites.” Unable to grasp even the basic principles of statistics or the scientific method, Americans gullibly buy into a cornucopia of bogus notions, from recovered memory syndrome to intelligent design to the anti-vaccination movement.
“The Age of American Unreason” veers unevenly between well-argued debunkings of assorted crackpot claims and litanies of gripes that come dangerously close to diatribes. A former reporter for the Washington Post and program director of the Center for Inquiry-New York City, a rationalist think tank, Jacoby can certainly formulate concise ripostes to the likes of former Harvard president Lawrence H. Summers, who suggested that the underrepresentation of women among the top ranks of the hard sciences must be due to innate gender traits. “What places Summers’ speculative statements within the realm of junk thought,” she writes, “is not the idea that there might be some differences in aptitude between men and women but his unsupported conclusion that such disparities, if they exist, are more important than the very different cultural messages girls and boys receive about whether they can expect to succeed in science.”
Jacoby takes care to point out that the political right and left have both indulged in anti-rationalism, and this is one of her book’s strengths. Intellectuals themselves often come in for a drubbing at her hands. She reproaches those deluded American leftists who defended Soviet communism in the 1930s and ’40s, long after it had become obvious that Stalin and his successors presided over a brutally oppressive regime. (She also points out that the influence of such figures on American culture at large has been vastly overstated by both their friends and their enemies.) She quotes goofy feminist theorists from the 1980s, academics who likened Isaac Newton’s laws of mechanics to a “rape manual” and called Beethoven’s Ninth Symphony “horrifyingly violent.”
Although Jacoby scolds culture warriors like Allan Bloom, author of “The Closing of the American Mind,” for both misunderstanding and misrepresenting the upheavals on American campuses during the 1960s and ’70s, she also deplores many of the leftist remedies for those conflicts. Women’s and African-American studies departments, she argues, only “ghettoize” the subject matter they champion, and further Balkanize and provincialize university students. Not coincidentally, the creation of those departments generated more faculty jobs without pressuring traditional professors to reassess their curricula: “Too many white professors today could not care less whether most white students are exposed to black American writers, and some of the multicultural empire builders are equally willing to sign off on a curriculum for African-American studies majors that does not expose them to Henry James and Edith Wharton.”
Jacoby, who covered education for the Post during the ’60s, sees herself as “a cultural conservationist, committed, in the strict dictionary sense, to the preservation of culture.” She believes in the classics (whether of literature, music or the visual arts), and at the same time sees no reason why they can’t be expanded to include great works by people (women and racial minorities) previously excluded from the canon. It’s not a zero-sum game. Hers is a moderate, sensible, well-founded position, shared by many Americans, yet it somehow rarely got voiced amid the raging hyperbole of the culture wars.
Fundamentalism, however, is the real red-hot center of American irrationality, and Jacoby calls religious assaults on the theory of evolution “a microcosm of all the cultural forces responsible for the prevalence of unreason in American society today.” She notes that in the summer of 2005 nearly two-thirds of Americans told pollsters that they believed creationism should be taught in schools alongside Darwinian evolution. The poll revealed what Jacoby characterizes as “an intellectual disaster as grave as the human and natural disaster unfolding in New Orleans” at the same time.
It’s hard to quarrel with her on that one, and compounding the mess is the fact that most Americans don’t even understand the religion they want to see defended: “A majority of adults, in what is supposedly the most religious nation in the developed world, cannot name the four Gospels or identify Genesis as the first book of the Bible.” For me, this startling information immediately brought to mind Stephen Colbert’s interview with Georgia Rep. Lynn Westmoreland on “The Colbert Report.” Westmoreland co-sponsored a bill that would require the display of the Ten Commandments in both the House of Representatives and the Senate, but, when asked, couldn’t actually list the commandments he’s fighting to enshrine. (Of course, the bill, a flagrant violation of the separation of church and state, was proposed only so that it could be shot down, thereby fueling the Christian right’s preposterous claims of persecution.)
Jacoby would no doubt find it depressing that I should be reminded of a TV show at this juncture. She sees the triumph of video-based culture as the primary source of America’s mental decay. Yes, she admits that “screen media” has “many intellectually useful components,” but believes that “it is on balance an unfriendly habitat not only for serious high-level intellectual endeavor but also for the more ordinary exchanges of ideas that enliven and elevate culture at every level.” The more people watch — or IM, or e-mail, or play video games — the less they read and converse.
This argument, which soon bogs down in too-sweeping indictments of TV and the Internet, is the weakest portion of “The Age of American Unreason,” and Jacoby (to her credit) knows it. She laments the fact that “aggrieved eulogies for print culture” have become so commonplace that no one pays them any mind. Still, a writer who has just come from ridiculing Diana Trilling and David Brooks for ludicrously exaggerating the influence of old left intellectuals ought to know better than to write a sentence like: “It has now become more insulting to call someone a Luddite than to call her a cheat, a drug addict, or a slut.”
I don’t entirely disagree with Jacoby on many of these points. As a literary critic, I too worry about the dwindling numbers of Americans who read for pleasure. Furthermore, like Jacoby (and Caleb Crain, in a recent New Yorker article about the prospect of a “post-literate” America), I believe that reading fosters a particular mental stamina, discipline, creativity and flexibility that can’t be acquired from other media. In a future dominated by complex social systems, technology and science, only people who can think in this fashion will have enough understanding of how the world works to actually run it. And to remain truly democratic, America should be made up of citizens who are able to think that way.
Nevertheless, Jacoby has a hard time separating her legitimate worries about America’s eroding attention span from simple disagreements of taste and generational preferences. She dismisses certain forms of popular art out of hand, automatically presuming that her readers will agree. But I, for one, see no reason why newspaper articles on “the newest trends in hip-hop” should be written off as no more than craven pandering to distractible young readers; the subject is interesting, and worthy, in its own right. I might not equate Bob Dylan with Milton, as some overzealous rock critics have apparently done, but I’m also aware that the pop fluff of one era (the operas of Puccini, for example) often becomes the classical repertoire of the next. When Jacoby hauls out that old, shopworn story about crowds gathering at the docks to grab the latest installment of a Dickens novel, she’s not accounting for the fact that Dickens had about the same artistic status in his day as the creators of “The Sopranos” have in ours — and I’m not sure that the Dickens novel in question (“The Old Curiosity Shop”) emerges as the better work in the comparison.
Recognizing the merits of James Baldwin need not detract from the admiration due to Shakespeare, a sentiment Jacoby herself would probably second. Just so, valuing the wit of “The Colbert Report” or the intricacies of “The Wire” doesn’t automatically imply a depreciation of “War and Peace.” Each form has its own artistry and makes its own demands on creator and audience, and new forms (such as the novel in the 18th century) arise to speak to new audiences in a new way. Jacoby rightly considers Al Gore to be an example of the sort of serious, studious public figure who gets unfairly written off as “arrogant and patronizing” in the dumbed-down politics of today. But she carefully avoids acknowledging that Gore managed to break through the public’s indifference to the issue of global warming by figuring out how to present his ideas visually, first in a PowerPoint presentation, and later in a movie, demonstrating that it’s possible to do justice to complex issues in those forms.
Likewise, Colbert’s deft dispatch of Westmoreland (getting the man to betray himself, no less) conveys the sleaziness of this type of “values” crusader more persuasively and conclusively than any number of written jeremiads by left-wing commentators. Is a top-drawer television series like “30 Rock” automatically a lesser creation than, say, the live performance of a play by Richard Brinsley Sheridan, simply because the first is TV and the second is theater, the first filmed and broadcast, and the second written by a man who wore a powdered wig? You can waste hours parsing the relative greatness of various artworks in different media, and that would be time much better spent watching “Project Runway.”
The real problem with TV, and to a lesser extent the Internet, is that while some of it is excellent, much of it is not — and all of it has become ubiquitous. As Jacoby astutely points out, reading does not “constitute a continuous invasion of individual thought and consciousness … printed works do not take up mental space simply by virtue of being there; attention must be paid or their content, whether simple or complex, can never be truly assimilated.” Unless you make a point of turning off the TV and putting the computer to sleep, they can easily fill up your day and mind, gradually atrophying the mental muscles uniquely exercised by reading. Abstaining, for many people, turns out to be as easy as bypassing a cupboard stocked with chips and cookies and snacking on carrot sticks instead. To hope that the American public will pick the nutritious but difficult over the easy and tasty is to bet on a losing horse.
No wonder that the concluding chapter of Jacoby’s book is so gloomy. All we have to count on, she writes, is the fortitude of “parents and citizens determined to preserve a saving remnant of those who prize memory and true learning above all else. Adult self control, not digital parental controls, is the chief requirement for the transmission of individual and historical memory” to the next generation.
But wait — is it really that bad? The current crop of leading presidential candidates not only aren’t dumb, they aren’t even trying to appear dumb, apart from occasionally droppin’ their Gs. The only candidate who professes not to believe in evolution, Mike Huckabee, has fallen behind a front-runner, John McCain, who is widely disliked by fundamentalists.
If TV and the Internet (more prevalent now than ever) are supposedly rotting our brains, turning them into mulch for the poisonous growth that is fundamentalism, what seems to have reversed that trend, even if only temporarily? Perhaps we’re better able to assess the “reality-based” consequences of putting a fundamentalist in the White House than we once appeared to be. The problem is, when push comes to shove, we don’t always feel like facing reality.
The missing factor in Jacoby’s formula is just that: In addition to being capable of rationality, we also have to want to be rational. Intellect, copious reading and education by themselves are no guarantee of reasonable or even sensible behavior, as the neo-conservative true believers responsible for the Iraq war have amply demonstrated. Yet this is one aspect of American religiosity that doesn’t seem to interest Jacoby much. In considering the Second Great Awakening, the outburst of religious revivalism that swept through the nation in the early 19th century, she kicks around some possible causes (the “unsettled” social conditions following the Revolution, the difficulties of life on the frontiers, etc.) in a desultory fashion. Then she writes, “in any event, the reasons why fundamentalism triumphed over ‘rational’ religion in the American spiritual bazaar are less important than the fact that fundamentalism did succeed in capturing the hearts of large numbers of Americans.”
It’s hard to imagine what could be more central to Jacoby’s subject than the motivations of those Americans who chose what she describes as “willed ignorance” over reason. Isn’t it likely that the recent resurgence of that ignorance arises from similar needs and desires? If there were some other way to address those needs (or fears), perhaps fundamentalism would be less appealing, and perhaps reason could be made more so. However, that would require admitting that people who are capable of reason will nevertheless sometimes pick an irrational course of action or belief. Rational people do this all the time, of course — even intellectuals. But rationality has its own ideology, and one of its tenets is the conviction that, if given a fair chance, reason must always carry the day.
If you believe that, then you can only arrive at one conclusion, Jacoby’s: It’s not that Americans won’t be rational, but that we can’t. There’s enough evidence of our poor schooling, susceptibility to pseudoscientific hucksterism and general cluelessness to justify that opinion, to be sure. But if that’s really the case, then it really is down to a few brave souls, committing to a doomed battle to preserve that “saving remnant” and fighting the dying of the light.