Rank Error

The top ten reasons why the media's obsession with lists is inane


Todd Gitlin
September 13, 1996 1:23PM (UTC)

Here is my list of the worst best-of lists.
1. The best pizza in New York.

2. The best movie.

3. The best play.

4. The ten best David Letterman lists.

5. The NBA point guard with the highest free-throw percentage on Tuesdays at the Garden.

6. The most influential political consultant of the decade. (Whoops.)
Enough already. Look how desperate people are to get listed somewhere for something. The Commissioner of Parks has a staffer count the number of hands that pet his dog, and Sri Chinmoy's disciples undertake all manner of weird exploits, both in order to get into the Guinness Book of Records. And universities tout their U. S. News and World Report rankings as they troll for alumni gifts and prospective customers.
There's good news and bad news about lists. The good news is that winners get to dine out on their reputations at top-ranked restaurants, surely, and why not? Oscar-winning movies, Pulitzer-Prize winning books, Tony-winning plays, and the like may get a second life, also sometimes deserved. Documentary filmmakers get a measure of fundraising help. Universities get to pick and choose among the applicants they want to pick and choose among. Parents and prospective students may hear of colleges they'd never heard of before.
The bad news is that rankings are frequently guilty of what Alfred North Whitehead (one of the ten most interesting philosophers of the twentieth century, no doubt) once called "misplaced concreteness." They frequently assume that important matters exist in quantitative units that can be laid end-to-end and counted. They emphasize precisely what can be counted, and sweep aside what cannot be. They boost a secular society's version of canonization. Numbers 'R Us, even when the rankers duly note (see asterisk) that their rankings should not be overesteemed.
In the mania for ranking, shoppers tend to assume that rankers have reason to know what they're talking about. Numbers look hard and fast. Academics may get promoted on the basis of the number of times their names come up in the citation index, which counts how often their colleagues mention their work in journal articles. Ever-faster silicon chips power ever-cheaper computers to pump out numbers faster with every passing byte. Hence the numerals that clutter up the screen of every sports broadcast. Hence the poll fetish sweeping the world, inspired, if that is the word, by number-crunchers.
Rankings, of course, are only as good as their data, and data are smeared with fingerprints. Everyone who uses ratings, rankings and prize lists should keep salt-shakers with fat holes at the ready. Surely the Nobel list might suffer if readers understood that neither Tolstoy, Henry James nor Borges was honored. Yet the mania spreads far and wide as competition knows no bounds, whether for the Booker Prize or prospects for medical school admissions and legal partnerships. Shame fails to stop the high-rankers from touting their high ranks without itemizing the footnotes.
In recent years, U. S. News and World Report has set out to distinguish itself from its competition by emphasizing "news you can use," in which category it publishes special issues, later expanded into books, ranking colleges and universities. "America's Best Colleges" has just hit the stands. According to Larry Van Dyne's informative piece in the September Washingtonian, this is U. S. News' best-selling issue of the year, a total of 2.3 million issues including subscriptions and newsstand sales. "America's Best Graduate Schools" follows in the winter. Van Dyne writes that colleges have been known to resort to such techniques as sending cookies to prospective applicants to enlarge their pool, thereby pumping up their selectivity ratio, which is a factor that U. S. News takes into account. University presidents, who know on what list their bread is buttered, lobby at U. S. News offices. It's spin, spin, spin for the home team.
In a recent issue of Insights, the journal of the Association of Schools of Journalism and Mass Communication, Professors Tom Goldstein of Berkeley and Ted Glasser of Stanford raise nettlesome questions about the rankings of graduate schools. Under the headline, "Ratings Game Reaches New Low," they note that in its 1996 ranking of graduate journalism schools, Stanford ranked in the top five in broadcast journalism -- an extraordinary achievement, considering that Stanford has no such program. The University of Minnesota ranked fifteenth in print journalism, despite having shut down its graduate journalism program one year before.
Alvin Sanoff, a U. S. News editor who works on the college rankings, admits to "some responsibility" for the errors on Stanford and Minnesota. He told me that when a woman with the title "coordinator" was asked by a U. S. News researcher if Stanford did indeed have a broadcast program, she referred the researcher to someone else, who said "You can say that we do" -- a statement that turned out to be false. (A Stanford official later said that the person who affirmed the existence of the program was not qualified to do so, according to Sanoff.) Sanoff's case for including Stanford is that it was still possible to concentrate on broadcasting by compiling a certain sequence of courses.
As for Minnesota, Sanoff says that U. S. News took a list of journalism and mass communications programs compiled by Lee Becker of Ohio State University, and circulated it to both practitioners and academics for the survey. The practitioners didn't answer in large numbers, and as for the academics, they were evidently willing to rank a nonexistent program. A rather stirring reason to doubt the entire procedure, one would think. Becker has said that he did not intend that his list be used in this fashion. Goldstein says that the list includes many schools that are more involved in teaching communications, advertising and public relations than journalism, and that their deans can't make informed assessments of journalism programs proper.
Sanoff says that U. S. News revises its methods all the time, and that its door is open for constructive suggestions. Goldstein says, "I can't say for sure what all the indicators should be, but U. S. News should go out and report. Lists are a substitute for reporting."
U. S. News is not the only misplacing concretizer in the education business, only the best-circulating one. There is also a ranking system of graduate academic departments published periodically by the National Academy of Sciences. This one is supposed to be the result of a survey of professors. I professed for sixteen years in a sociology department (Berkeley) that always ranked first, second, or third in the country. Not once was I ever asked my opinion, nor were several of my colleagues whom I once asked.
"Measure what can be measured," wrote James Fallows in his fine book "Breaking the News." There, he was properly critical of reporters falling all over polls and neglecting to note that oftentimes people know next to nothing about the terms on which the citizenry are invited to express opinions. The tendency to pile up numbers and let them substitute for meaning reaches its point of absurdity in the rankings. Papers now routinely report box-office results. Museums count bodies, publishers live and die by their own numbers. Nine out of 10 doctors are still recommending. Fallows is now top editor at U. S. News -- surely one of the most promising appointments in years. He should get out his salt-shaker, as should we all.


Todd Gitlin

Todd Gitlin teaches at Columbia University, writes regularly for BillMoyers.com and Tablet, and is the author, most recently, of Occupy Nation: The Roots, the Spirit, and the Promise of Occupy Wall Street.
