For the Democrats, there was Boston. And for the media, there was another B-word. From USA Today to Wolf Blitzer on CNN, journalists buzzed last week about whether John Kerry got enough of a "bounce" in the polls from his nominating convention. The New York Times, the Washington Post, the Chicago Tribune and the Los Angeles Times collectively spent 10,735 words over two weeks on a shift of a few percentage points -- about twice as many words as Kerry's hour-long acceptance speech.
Polling frenzy is not restricted to convention-week surveys, of course. Every major American news network and newspaper has some kind of polling outfit in-house. And as the November election nears, Americans will get hit with the TV networks' daily reporting of the three-day rolling averages known as tracking polls. Rasmussen Reports, a smaller polling firm, is already running tracking polls in battleground states. "It's so they can have a new story every day," says Robert Blendon, who teaches political polling at Harvard's Kennedy School of Government.
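The "three-day rolling average" behind a tracking poll is simple arithmetic: each day's reported number averages the last three nights of interviews, so one odd night lingers in the headline figure for three days. A minimal sketch, with invented daily numbers:

```python
# Sketch of a three-day tracking poll. Each reported figure averages
# the most recent three nights of interviews. The nightly numbers
# below are invented for illustration, not from any actual poll.
daily_support = [47, 49, 46, 50, 48, 47, 51]  # pct. favoring a candidate, one entry per night

def three_day_rolling_average(daily):
    """Average each night with the two nights before it (tracking-poll style)."""
    return [round(sum(daily[i - 2:i + 1]) / 3, 1) for i in range(2, len(daily))]

print(three_day_rolling_average(daily_support))
```

Note how the swings in the nightly numbers are smoothed but not eliminated, which is why a bad sampling night can color several days of coverage.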
While it may seem like overkill, Americans crave coverage of the peaks and valleys of political polling, according to a recent study in the Journal of Politics. In fact, Americans prefer reporting on which politician is ahead and why to harder news coverage and even scandal stories -- conventionally considered the bread and butter of modern news organizations. Demand for quick numbers has also propped up a budding cottage industry of smaller firms, such as Rasmussen Reports and SurveyUSA, that sometimes use unconventional methods.
In such a close election, the politics of polling has higher stakes, and the campaigns don't let an unfavorable poll get reported without a fight. In June, the Los Angeles Times released a poll showing Bush behind by seven points, and Bush-Cheney campaign pollster Matthew Dowd went into attack mode. Dowd sent a warning to ABC News: "A note of caution: be very careful in reporting Los Angeles Times poll. It is a mess. Bush is leading independents by three, ahead among Republicans by a larger margin than Kerry is ahead among Dems, and we are down by seven. Outrageous. And it gets worse. They have Dems leading generic congressional ballot by 19. This means this poll is too Democratic by 10 to 12 points."
ABC News reported the poll anyway, but its political unit posted Dowd's objections on its weblog, The Note. The Los Angeles Times' poll czar, Susan Pinkus, shot back in a scathing and technical response to ABC:
"I feel that I have to respond to [Dowd's] assertion that the poll is a 'mess.' His negative spin of this poll is, quite truthfully, not unexpected. The Times makes every effort to use sound methodological techniques that are used by most reputable research and polling organizations ... The Times does RDD (random digit dialing) sampling which reaches households with listed and unlisted telephone numbers. The poll weights slightly (for minor corrections) based on census data for sex, race, age and education and does not weight for party ID. Party ID is a moving variable that changes from one election to another, and weighting by party registration makes no sense nationally because many states don't have their voters register by party and some states don't have voters register to vote until the day of the election."
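The demographic weighting Pinkus describes can be sketched in a few lines: each group's responses are scaled by the ratio of its census share to its share of the sample, so the sample's mix matches the population. All shares and support figures below are invented for illustration, not from any actual Times poll:

```python
# Hypothetical sketch of census-based demographic weighting.
# Each group is weighted by (census share / sample share) so the
# weighted sample matches the population. All figures are invented.
census_share = {"men": 0.48, "women": 0.52}
sample_share = {"men": 0.42, "women": 0.58}  # e.g., women answered the phone more often

weights = {g: census_share[g] / sample_share[g] for g in census_share}

# Candidate support by group (invented), before and after weighting
support = {"men": 0.44, "women": 0.50}
unweighted = sum(sample_share[g] * support[g] for g in support)
weighted = sum(sample_share[g] * weights[g] * support[g] for g in support)
print(round(unweighted, 3), round(weighted, 3))
```

This is the "minor corrections" step: the weighted figure shifts only a few tenths of a point here, because the sample was only mildly skewed. Weighting by party ID would work the same way mechanically, which is exactly why Pinkus objects to it: unlike sex or age, there is no fixed census benchmark for party to weight toward.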
Leave it to statisticians to make methodology personal. Yet with so many Americans paying attention, it makes sense that the methodological minutiae of a single survey became this controversial. Errant polls can have profound effects, and not every poll is reliable. Just look at the Wisconsin primary race last February between John Kerry and John Edwards. Right before the primary, an American Research Group poll showed Kerry with a 53 to 16 percent lead over Edwards, and an MSNBC/Reuters/Zogby poll had Kerry with 47 percent to Edwards' 20 percent. Edwards got within six points of Kerry in the actual primary election.
"The polls [in Wisconsin] were way off," said Brad Coker, a pollster for Mason-Dixon Research. Would Edwards have taken the state if the polls hadn't built a bandwagon for the Kerry campaign? Probably not, Coker said, but dubious polls like those in Wisconsin "hurt a candidate or a campaign trying to raise money." Coker continued, "If you're down 20 points, it gets awful hard to get people to write big checks." Think John Kerry in January, who had to mortgage his own home to prop up his campaign before he won the Iowa caucuses.
And once a bad poll is out there, it is difficult to rein in the faulty data. Talking heads repeat poll numbers with impunity. "Take the one wrong poll, and all the pundits latch on," said Michael McDonald of the Brookings Institution. In the context of a presidential race that could hinge on a few thousand votes in a handful of swing states, poll-driven perception might yet trump reality.
Yet the methods and math behind political polls are often far from intelligible to the public or the press. Pollsters constitute a cadre of number crunchers obsessed with margins of error, sample screens and statistical weighting. In other words, they speak a language most Americans -- even most members of the fourth estate -- do not. This leaves plenty of room for the echo effects of a rogue poll to have real impact. With this in mind, Salon ranks the most frequently cited and covered polling outfits.
First, a disclaimer. Even the best polls will be wrong about one time in 20, both Blendon and McDonald say. And Blendon cautions that even the most trustworthy polls will often differ because pollsters place different weights on their results, use different screens on their samples, and poll on different days and at different times.
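The "one time in 20" rule is the flip side of the 95 percent confidence level behind a poll's margin of error. A sketch of the standard formula, assuming a simple random sample (real pollsters adjust for design effects):

```python
import math

# Sketch of the textbook margin-of-error formula. A 95% confidence
# level (z = 1.96) means roughly 1 poll in 20 will still fall outside
# its own margin of error purely by chance.
def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error (as a fraction) for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical 1,000-person national poll (illustrative):
print(round(100 * margin_of_error(1000), 1))  # roughly +/- 3 points
```

Since the margin shrinks only with the square root of the sample size, quadrupling the interviews merely halves it, which is why most national polls settle for about 1,000 respondents and a three-point margin.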
Polls to trust
The traditional gold standard in the polling world is the Gallup organization, which has covered presidential elections since George Gallup predicted a Roosevelt victory in 1936. Wonks can read all about Gallup's standards on the organization's Web site, which is as canonical as it is unexciting. Gallup teams up with CNN and USA Today for horserace polls.
Most pollsters and all of the experts Salon contacted say that the major papers and television networks are also reliable. Despite Dowd's objections, the Los Angeles Times polls look methodologically pristine, as do the CBS News/New York Times, Wall Street Journal/NBC News, and ABC News/Washington Post polls.
Several pollsters said that Fox News, which hired the research firm Opinion Dynamics to do its polling, tends to have a Republican bias in its survey results. But in terms of methodology, Fox News appears to be a "model survey," Kyle Smith, a statistician at the University of New Mexico, wrote in an e-mail to Salon. In particular, the questions Fox asks are actually fair and balanced. "If you want a good standard to judge other survey scripts, I'd suggest using Fox's," Smith wrote. The Fox bias that some pollsters allege might come from the way numbers are reported, not gathered, Blendon said.
The red flags
The Zogby International polling firm has been using controversial methods for years, and this election cycle is no different. The Zogby Interactive Battleground Poll -- regularly cited in the blogosphere and published on the Wall Street Journal's Web site -- is an online poll conducted via e-mail. If this sounds dubious, that's because it is. Online polling has a spotty track record and remains an unproven method for gauging public opinion. Nancy Belden, president of the standard-setting American Association for Public Opinion Research, advises wariness about "any organization that uses online polling which necessarily excludes people who are not online." CNN polling chief Keating Holland agreed, saying online polling is more self-selecting and therefore less accurate. "When you're polling by telephone there's this big thing that makes this noise when I want to ask you questions. When you're polling by the Internet, there's no big bell." For the same reason, watch out for Harris Interactive polls.
Zogby does conduct more conventional telephone surveys as well, though they too are controversial.
Rasmussen Reports uses another questionable technique to gather its polling data: interactive voice technology (IVT), in which a computer does the calling and the interviewing. Though Rasmussen himself said that it is "easier to get people to talk to a computer than it used to be," polling units that use IVT have a reputation for low response rates. "It's a far cry from having an interviewer," Belden said. Holland does not let CNN report results from IVT polls. "I find [IVT] polls unreliable," he said. "I've actually been polled, and it was far too easy to screw around with it, which I did." He added, "People feel a bigger obligation to tell the truth to a real person." SurveyUSA, another prolific polling organization, also uses IVT.
Also watch out for Rasmussen's tracking polls, which combine the IVT trouble with the problems of a tracking poll. Tracking polls -- which are updated daily -- tend to be volatile and unreliable because "different kinds of people are at home on different nights of the week," Blendon explained. Be wary of any tracking poll you see -- even from the big names.
American Research Group (ARG) polls are out all the time with fresh numbers from battleground states. The organization, however, would not release the most vital piece of information about its work -- who pays for it -- when Salon asked. ARG spokesman Dick Bennett told Salon that "news organizations" generally subscribe to fund ARG polls. But without full disclosure, the survey data cannot be completely trusted.
Local state polls, often conducted by cash-strapped local newspapers or television stations, are also often unreliable. "Doing it and doing it well is expensive," Blendon said. Pinkus said that there are some good local polling organizations, such as Quinnipiac University in Connecticut. But sticking with the more reputable national organizations, even for local numbers, is safer. Also suspicious: Polls sponsored by a political campaign or interest group. Local news organizations sometimes pick these polls up, but they are notoriously biased. "Sometimes they almost beat the [respondents] into answering a certain way," Blendon said.
All the experts Salon contacted agreed that the worst polls out there are the fantastically useless Internet surveys like those CNN's Lou Dobbs asks his viewers to fill out on his Web site. When he says they are unscientific, he means it. They're really just for fun.
Note that a lot of the red-flagged organizations come out with polls early and often. That's because IVT or online survey technology makes polling cheap and quick -- and too good to be true. Because of that seductive frequency, these polls show up all over, particularly in the blogosphere. Take their results with a grain of salt.
Polls not included...
These ratings are not comprehensive; there are other good and bad polls out there. In general, trust a poll only as much as you trust the news source, and be wary of most polls published online -- even on big-name sites like the Wall Street Journal's, which regularly posts Zogby Interactive polls with a lot of flashy graphics. Polls that would never make it into print or on TV regularly show up online, just for the amusement of readers. Finally, good polls should be digested for what they are worth -- they are just snapshots of public opinion at one time, not a prediction about November.
With polling driving much of the election coverage this year, as it already does, becoming an informed poll watcher is more vital than ever. The bottom line: Choose a few solid polls to follow and ignore the rest. Look for agreement among those polls and be wary of outliers -- that is, polls that disagree with the pack. Because even the best pollsters are wrong sometimes.