A bland antidote for Bill ‘n’ Al fatigue: George W.

Clinton debased the presidency and Gore became a hysterical chameleon. A lazy Bush may be just the prescription America needs.

After last fall’s rancorous post-election stalemate, the numbed Republican victors have been moving through the transition toward this week’s inauguration with all the beauty of a mudslide, while Democratic special-interest groups yammer and yap like ravenous hellhounds. President-elect George W. Bush still seems tentative and oddly depressed, while the outgoing president, Bill Clinton, bounces around the country in undignified overdrive. I’m counting the minutes until Clinton, who debased the presidency and tore the country apart, leaves office.

If only one had the exhilarating sense of new beginnings that normally comes with a changing of the guard. But out of some strange psychological stagnancy, Bush has lazily surrounded himself with advisors and appointees from long-gone Republican administrations. It’s baffling why someone who urgently needs to prove to the world that he has a political identity separate from that of his president father wouldn’t make a more vigorous effort to bring in fresh blood. Bush has simply played into the hands of critics who claim he wasn’t ready for the presidency. Was his web of close personal and professional associations really that thin? And it’s dismaying that in this age of communications the president-elect has thus far failed to meet an elementary standard of articulateness for public figures.

But a bland, bumbling Bush may be better for this country than the hysterical chameleon and monstrous panderer that Democratic nominee Al Gore turned into last year. Given the upsurge in partisan warfare and racial animosity fomented in Florida by Democratic operatives after the election, I wish history could be rewritten: if only we could return to the height of the Monica Lewinsky crisis in 1998 and this time firmly force Clinton out. The Democratic establishment was cowardly and irresponsible in backing off from insisting that Clinton resign. The nation would have been spared two horrendous years of inquests, divisiveness and legislative paralysis.



Furthermore, Vice President Al Gore could have assumed the presidency before being overwhelmed by a national campaign and unraveling before our eyes. Had he risen to the presidency by default in 1998, Gore would have gained in stature and experience in the job and, without the burden of the Clinton scandals, might have been easily reelected. It was Gore’s own bizarrely frantic behavior and gross fabrications on the stump that eventually repelled me and many other Democrats who bolted to Ralph Nader. The Democratic leadership has only itself to blame for Bush’s election.

The Usual Suspects of the p.c. era of the 1980s are cranking themselves up into high dudgeon again over Bush’s Cabinet nominees. One had hoped that, with the rise of libertarianism in the 1990s, we had blessedly evolved away from the sterile polarization of left versus right. Are the warhorse feminazis beating the shrubbery to rake up another pasteboard Anita Hill to ambush Bush’s nominee for attorney general, John Ashcroft? If so, stay tuned for a replay of the poisonous psychodrama where race is used as a cynical cover for the real liberal monomania, abortion — as if the entire universe revolves around a single issue affecting the private conduct and personal convenience of heterosexual Western women.

As a pro-choice member of Planned Parenthood, I detest the way the abortion-rights crusade has crippled the women’s movement and distorted American politics because of the fanaticism of feminist leaders who are unembarrassed agents of the Democratic party. As I have repeatedly argued, feminism worldwide limits and damages itself when it ties itself to one political party.

If the current face of the Republican party is dreary, the face of the Democratic party is just plain ugly: ranters crying fraud in Florida but ignoring rampant irregularities in Democratic districts in big cities elsewhere; media flacks calling for counting every vote but overlooking massively uncounted absentee ballots (in states where the outcome was not in question) so that Gore’s popular-vote margin remains inflated; and now the emergence of Sen. Hillary Clinton as the most grandiose exponent of limousine liberalism in decades.

After her election to the Senate last fall, Hillary had a golden opportunity to shed her bad press and recreate herself — to surprise her critics and win back disillusioned admirers (like me). Instead she went whole hog down Marie Antoinette Boulevard by angling for an inflated book contract and spinning off on a shopping spree for yet another mansion. The Clintons, forever schmoozing with the rich and famous, are fake populists with distorted values.

The opening tone of the Bush administration is about to be set by the Senate hearings for the Ashcroft nomination. Since I know nothing about Ashcroft, I am waiting to see what evidence there may be to support the rabid allegations about his supposed racism. The race card has been mightily overplayed in recent months, and race relations in this country are seriously strained. As is clear in prior columns, I reject the implication by any group — gays, Jews or blacks — that it has special status or privilege in determining national policy on any issue, foreign or domestic.

Perhaps the strident intensification of rhetoric by African-American leaders (such as Jesse Jackson and the Congressional Black Caucus), which began in Florida, is partly related to last year’s controversy in Miami over Elián González, during which demographic projections were publicized showing that Hispanics, as a minority group in Florida and nationwide, are surpassing the black population in size, wealth and influence. One benefit of Bush’s appointment of Colin Powell as Secretary of State and Condoleezza Rice as National Security Advisor is that they are sophisticated role models for international thinking (Rice tackled Russian in school). With their rote claims of victimization and increasing demand for reparations, too many black leaders are defining their people at the lowest common denominator and unproductively binding their identities to the remote past of the slavery era.

Linda Chavez, Bush’s fleeting nominee for Secretary of Labor, may be yesterday’s news, but I’ll add my two cents in case conservatives foolishly try to make her a martyr to political correctness. Chavez is a smart, skillful syndicated columnist and TV commentator who because of her acid views and intermittent administrative career was probably not an ideal candidate for a Cabinet post. She was rightly booted when she failed to be candid with her job interviewer about potential problems in her personal history.

Chavez then had a great chance to demonstrate her character and competence by making a graceful withdrawal announcement. Instead, her press conference last week was a fiasco — narcissistic, sanctimonious and manipulative (with people props shipped in to sing her praises). By bemoaning her hard life and exploiting even her parents’ auto accident as tear-jerking material, Chavez damaged not only her own reputation but that of professional women in general. Until women go cold turkey on all these quavering “feelings,” no one will ever vote them into the White House to command the armed forces. Women need more salt and vinegar and less rancid honey.

The talking heads of American TV seem even more trivial than usual this month since my partner Alison and I spent the holidays in Mexico, where the Mesoamerican ruins give one a breathtaking historical perspective. On the last day of 2000, we were at the seaside Mayan ruin of Tulum, and on the first day of 2001, most propitiously, we were at the great complex of Chichén Itzá in the heart of the Yucatán peninsula.

Tulum, dating from the 12th to the 16th centuries A.D., is a vast, walled compound dominated by a pylon-like temple on a high cliff overlooking the Caribbean. The surf, swirling among gigantic boulders on the white sand below, is an unearthly turquoise. In natural drama, Tulum (a shrine of the Descending God oriented toward the rising sun) rivals the Temple of Poseidon crowning Cape Sunium in Greece.

Photographs do not do justice to the sprawling site of Chichén Itzá (covering 6 square miles). While the imposing stepped pyramid of Kukulcan (Mayan for Quetzalcoatl, the plumed serpent) is the primary monument, the dozens of other major buildings are astonishing. Furthermore, the graceful placement of temples and the long sightlines (particularly from the Caracol or Astronomical Observatory) sometimes felt curiously modern. I couldn’t help but reflect ruefully on the clutter of the Roman Forum, jammed between the Palatine and Capitoline hills, or on the cramped relationship of the Propylaea and Erechtheum to the Parthenon on the Athenian Acropolis.

Other highlights at Chichén Itzá were the ritual ball court, nearly twice the size of a football field, and the nearby Platform of the Skulls, where victims’ decapitated heads were exhibited. The platform is wrapped with four tiers of bold bas-relief squares of grinning skulls still red with original paint. Another unforgettable sight was the overgrown stone ramp, broken in two places centuries ago by the collapse of limestone caverns, leading to the edge of the huge Xtoloc (lizard) cenote, a natural well, where the first migrants may have settled at Chichén as early as 500 B.C.

The Platform of the Skulls in particular, which would clearly fascinate young people, reinforced my conviction, based on teaching experience, that the embattled field of archaeology (often accused of imperialism) should be at the heart of the classroom curriculum at both the primary and secondary levels. Archaeology is an ideal way to synthesize history, art, religion and science. It centers on physical objects and their conservation and shows how to reason from fragmentary evidence. Archaeology, in short, offers splendid practical and intellectual training.

One of my many criticisms of the smart set currently ruling the roost in the humanities departments of the elite schools is that their command of world history and thought is pathetically weak. (This is true even among the so-called New Historicists.) Their cultural commentary is patchy and chaotic because their methodology is purely literary. That is, they play Scrabble with their haphazard, undigested material, which is cut up, whirled around and overlaid with cutesy or pretentious verbal formulas whose provenance is simply the insular conference circuit of the past two decades.

Many of today’s best graduate students are getting their degrees without ever having encountered a genuinely erudite professor in the humanities. The rampant careerism of American academe, which puts a premium on patter, hustling and networking, has yet to be seriously examined. It has had devastating consequences on education by driving free-thinking graduate students and junior faculty out of the profession. When learning is no longer a criterion for employment and promotion, both teaching and scholarship suffer.

My personal reading list, needless to say, never includes current literary theorists, who are as inconsequential as mayflies. Instead I occupy myself with books like Graham Connah’s densely detailed “African Civilizations,” published by Cambridge University Press in 1987. The subtitle of this book, which I recently finished reading, is “Precolonial cities and states in tropical Africa: an archaeological perspective.”

Connah, an Australian professor of archaeology and prehistory who has conducted excavations in Benin, argues for the strictly indigenous origins of the major achievements of early Africa. Among the geographical and climatic zones he examines are the West African savanna, the West African forest, the Middle Nile, the Ethiopian Highlands, the East African Coast and the Zimbabwe Plateau.

“African Civilizations” contains the kind of basic detail that should be the foundation for all speculation about history but that is glaringly missing or amateurishly mishandled by today’s poorly prepared humanities professors, who indiscriminately apply hackneyed poststructuralist theory to everything. Connah rightly connects the growth of social stratification and centralized authority to economic development: coordinated planning is essential for irrigation systems, the gathering of resources and physical materials for building projects, and the collection and distribution of products. Food surpluses raise the standard of living and allow the arts and sciences to flourish.

These well-documented facts about early culture rarely impinge on the thinking of literary theorists, whose premises are narrowly contemporary and who are obsessed with words and disciplines as instruments of “social control” (the Foucault party line). With their sentimental liberalism (under a chic Marxist veneer), they portray all hierarchies as oppressive — except, of course, their own as they beaver their way up the academic ladder. But without hierarchical organization and specialized labor, society remains rudimentary, with most of life occupied with sheer drudgery like carrying water and gathering firewood.

I found particularly interesting Connah’s description of the West African coast as protected from outside intrusion because of its mangrove swamps, lack of natural harbors and prevailing winds — until the middle of the fifteenth century when the lateen sails and stern-post rudder of the Portuguese caravel allowed European explorers to sail into the wind. Prosperous cities sprang up along the east coast of Africa, on the other hand, because the wind and currents of the Indian Ocean change direction twice a year, fostering commercial and cultural interchanges with India and Indonesia that began in antiquity. I was also struck by Connah’s observation that the sickle-cell gene (associated with anemia among American blacks) once functioned as partial protection from malaria in the West African forest.

Connah succinctly reviews the scholarly hypotheses about the decline of Great Zimbabwe (which may also be relevant to the decline of Mayan culture). The massive fortress wall of this city, dating from the 11th century A.D., is the biggest stone structure in sub-Saharan Africa. At its peak, Great Zimbabwe’s population may have exceeded 10,000. The city did not fall by enemy attack or European interference but probably through the exhaustion of its own environment: firewood was used up, game depleted, fields overgrazed and soils over-cultivated.

Although the elevated Zimbabwe Plateau was largely free from the tsetse fly, which afflicts both human beings and livestock, a sanitation crisis may have been created by the large, dense urban population. Sewage breeds disease, contaminating the water supply and threatening public health in expanding societies. This squalid, putrid, intractable problem has been solved outside the Third World by the miracle of modern plumbing, the gift of Western capitalism and the industrial revolution, at which our pampered, armchair leftists like to sneer. A sign should be posted over every campus toilet: “This flush comes to you by courtesy of capitalism.”

Now for our usual pop finale. I won’t deal with football, since I’m still crabby over the low level of play in last weekend’s championship games. A top TV moment of the past month for me was Lifetime cable channel’s “Intimate Portrait” profile of Deidre Hall, the blonde diva of NBC’s daytime soap, “Days of Our Lives.” Eerily ageless, Hall still exudes the composure, magnetism and aloof sexual mystery that have characterized her performance as glamorous Dr. Marlena Evans since 1976.

There was a radiant womanliness about Hall’s persona at her popular height that I often meditated on while writing “Sexual Personae.” This serene, centered quality, seen in “women’s picture” stars of old Hollywood like Lana Turner and still detectable in the 1970s and ’80s in actresses Jaclyn Smith and Anne Archer, is now tragically missing from contemporary pop culture with its jittery anorectics (“Ally McBeal”) and compulsive, sarcastic mantraps (“Sex and the City”). Enduring artistic work is unlikely to come from the bitter sexual wasteland of current pop, where androgyny is just an excuse for protracted adolescence.

As for film, the laurel goes to Turner Classic Movies cable channel for last week’s sensational double feature: Bette Davis impassively shooting her way down the front steps of a moonlit rubber plantation in William Wyler’s “The Letter” (1940), followed by Marlene Dietrich in a blonde Afro uncoiling herself from a gorilla suit to sing “Hot Voodoo” backed by a gaggle of bopping tribal chorines in Josef von Sternberg’s “Blonde Venus” (1932).

When Dietrich in gleaming white top hat and tails saucily sashays down the ramp of a Paris cabaret, she gives a stunning, point-by-point lesson in how to “work” a costume and maximize a prop (a long cigarette holder) — fundamentals our brat pack of current A-list stars don’t have a clue about. We’ll see whether Dietrich’s leading admirer, Madonna (whom Dietrich called “vulgar”), having cleverly married her own junior-league von Sternberg, will ever match Dietrich’s elegant yet empathic mastery of the silver screen.

Camille Paglia is the University Professor of Humanities and Media Studies at the University of the Arts in Philadelphia. Her most recent book is "Break, Blow, Burn: Camille Paglia Reads Forty-Three of the World's Best Poems."
