Citizens are misinformed — often badly so. It’s not just that they lack good information — which would merely make them uninformed — they have plenty of bad information that leads them to believe untrue things. Or more likely the other way around: They believe untrue things, and that leads them to collect — even invent — bad information to flesh out what they already believe.
This was vividly illustrated by a 1991 study that found that the more people watched TV during the first Gulf War, the less they knew about fundamental issues and facts, even as they were more likely to support the war. Wanting to believe that the U.S. was involved in a noble cause, for example, only 13 percent knew that when Iraq first threatened to invade Kuwait, the U.S. said it would take no action, while 65 percent falsely “knew” that the U.S. said it would support Kuwait militarily.
But the problem is hardly limited to this one example, or to issues of war and peace more generally. Misinformation in public life isn’t the exception, it’s the rule, and researchers have been grappling with that fact, and its implications, for some time now. A new study published in Social Science Quarterly employs a “knowledge distortion index” and weighs two competing explanations for how misinformation takes hold — one more top-down, the other more bottom-up — using three Washington state initiatives from the 2006 general election to trace how distorted factual beliefs emerge and shape votes in this particular sort of political environment.
The study, “How Voters Become Misinformed: An Investigation of the Emergence and Consequences of False Factual Beliefs,” found that “voters’ values and partisanship had the strongest associations with distorted beliefs, which then influenced voting choices. Self-reported levels of exposure to media and campaign messages played a surprisingly limited role,” despite the presence of significantly mistaken “facts,” which were used to help construct the knowledge distortion index.
“Two of the competing theories on how people analyze political issues and develop factual beliefs are heuristics and cultural cognition,” the study’s lead author, Justin Reedy, told Salon. “Both of these theories recognize that citizens can develop distorted factual beliefs because of their political views, but they disagree about how those distortions might happen. Heuristics researchers generally think that citizens have limited attention for politics and try to process information quickly and efficiently.”
This is the more top-down approach, as we’ll soon see.
“People who are fairly politically knowledgeable can figure out whether political information and factual claims match up with their own ideology or not — and therefore whether they should accept or reject those,” Reedy explained. “Cultural cognition researchers, however, see political opinions as driven by deep-seated values about how the world works, and not contingent on someone’s political knowledge.”
Dan Kahan of Yale Law School is the figure most associated with the cultural cognition approach. He found the study useful. “I think it worked,” he told Salon. “It adds information.” He also found the broader project of studying the initiative process promising. “The opportunities to observe how people form their views will probably really be enhanced in many cases where there’s some kind of a high-profile referendum,” he said, “and where you can be confident that members of the electorate are engaged by it.”
“The two theories differ on the importance of media and campaign messages, too,” Reedy continued. “Heuristics theory argues that citizens need to get at least some information from the media or from a campaign itself, like endorsements from political parties or key politicians, to help them align their views with their ideology.” This is the sort of thing that campaign workers everywhere fervently believe. But they, too, could be misinformed. “Cultural cognitive theory, though, argues that citizens will get enough cues about nearly any issue in the public sphere to help them align their views on that issue with their underlying values.”
Finally, Reedy said, “The last distinction between the two models is on policy preferences: Heuristics researchers would argue that once a citizen has developed a factual belief, whether distorted or not, that belief will become an important factor in their decision on a public policy issue. Cultural cognition, however, sees a citizen’s core values as the key in them deciding on a policy issue — the distorted factual beliefs are just another phenomenon that happens along the way.”
Before discussing how the two models measured up, we need a better understanding of what went into the study, which involved a combination of new and tried-and-true approaches. On the “new” side, the knowledge distortion index — developed by the same team in an earlier study — is a particularly promising tool. “My colleagues and I thought it would be useful to be able to quantify the way that someone’s factual beliefs about politics could be distorted. Other researchers had done similar research on distorted factual beliefs, but we wanted to create some kind of index that helped show how a person’s factual beliefs were systematically distorted in a partisan or ideological direction,” Reedy explained. “That was the idea behind the knowledge distortion index – to not just show that some people had the facts around a political topic wrong, but to quantify how those factual beliefs might be incorrect in a systematic, ideologically driven way.”
Just imagine if pollsters routinely adopted the knowledge distortion index in covering any public issue. Imagine not just seeing cross-tabs showing the difference between liberals vs. conservatives or Democrats vs. Republicans, but also seeing how people’s opinions varied according to the balance of mistaken factual beliefs. Simply having the relevant false beliefs for any issue identified in the polling process would be an eye-opening experience. There have been a handful of polls showing how birthers differ from non-birthers in their views, and those have been rather illuminating in themselves. But that’s just a single piece of misinformation on a single — though broadly significant — subject. Imagine if it simply became routine for pollsters to measure how distorted people’s “knowledge” was in the course of eliciting their opinions.
“Political debate and policymaking are hard enough, but if people from opposing ideological camps come in with their own sets of facts, that makes it really tough to have a vibrant debate that leads to good public policy,” Reedy said. “That’s a big part of why we created the knowledge distortion index and have been studying these issues, is to try to help figure out how to combat the problem of ideological distortion of political knowledge.”
In this study, eight different items — gleaned from “surveying campaign websites, news reports, and commentary” — were used to create an index specific to each of the three initiatives. The first, “Landowner Compensation Policy, would have rolled back land-use regulations by forcing the state to reimburse landowners for expenses incurred from those regulations,” the paper explained. The second, “Renewable Energy Mandate, required a proportion of the state’s energy to come from renewable sources.” The third, “Estate Tax Repeal,” was self-explanatory. Only the second was approved by the voters. A distortion index item for the Landowner Compensation Policy, for example, was “Washington landowners can be forced to leave their land unused if it provides habitat for species that are not even endangered” — a false statement that 45.2 percent of respondents nonetheless identified as “true.” While all three scales were balanced so that “liberal” and “conservative” distortion scores were equally possible, conservative distortions predominated in all three cases, though only modestly in two of them.
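The paper doesn’t publish its scoring procedure, but the logic of an index like this can be sketched in a few lines of code. The item names, directions, and unit weights below are hypothetical illustrations, not the study’s actual items or weighting; the point is only to show how individually wrong answers become a single, directional distortion score.

```python
# Hypothetical sketch of a knowledge distortion index. Item keys and
# weights are illustrative; the study built issue-specific indexes from
# eight true/false items per initiative, balanced so "liberal" and
# "conservative" distortion scores were equally possible.

# Each item maps to (correct answer, ideological direction that a
# wrong answer would tend to support).
ITEMS = {
    "habitat_claim": (False, "conservative"),          # e.g. the false habitat statement
    "compensation_cost_claim": (False, "liberal"),     # hypothetical item
    "regulation_scope_claim": (True, "conservative"),  # hypothetical item
}

def distortion_score(responses, items=ITEMS):
    """Net ideological tilt of a respondent's factual errors.

    +1 for each error in the conservative direction, -1 for each in the
    liberal direction; 0 can mean accurate beliefs or offsetting errors.
    """
    score = 0
    for item, answer in responses.items():
        correct, direction = items[item]
        if answer != correct:
            score += 1 if direction == "conservative" else -1
    return score
```

Under this toy scheme, a respondent who marks the habitat statement “true” but answers the other two items correctly would score +1, a one-item distortion in the conservative direction.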
These are not your typical hot-button culture-war issues, nor are they dry, purely technical questions, or issues so specific as to defy comparison. They represented the broad middle range of issues that make up a significant portion of the public debates that Americans have carried on in the public square since the earliest days of the republic. While it wouldn’t be warranted to assume they’re representative of all public questions, they do make a good place to start. In the study itself, the authors noted future research possibilities:
Reflecting on the technical nature of the ballot measures in the present study, we believe it would advance this line of research to assess the importance of values, belief distortion, and knowledge in shaping voters’ beliefs on issues that are more heavily values-based or culturally contested (Lakoff, 2002), such as gay rights or abortion. Voters may be less likely to hold incorrect factual beliefs on those higher-profile issues simply because there is much more information about those issues present in the public sphere. On the other hand, the more obvious connection between values and policy for such issues may result in an even greater distortion of empirical beliefs to fit with the disparate values held by opposing sides.
In short, this current study has established a useful baseline for future studies.
In addition to measuring knowledge distortion for each item, the study also measured value orientations to compare with knowledge distortion — another promising new idea. The authors used a combination of responses (agree/disagree) to two statements for each value orientation. Two initiatives involved a single value orientation. For the Landowner Compensation Policy, the statements concerned government’s role in regulating land use. For the Renewable Energy Mandate, the statements concerned how society should approach the production of clean energy. For the Estate Tax Repeal, one set of statements concerned the primacy of property rights, while a second set concerned commitment to public education, which is the primary beneficiary of Washington’s estate tax.
This approach “offered more concrete measures of respondents’ issue-relevant value orientations than could be obtained by left-right ideology,” the paper noted, “or abstract cultural orientation measures” — referring to the two-factor framework (hierarchy/egalitarianism and individualism/communitarianism) used by Kahan and others in the cultural cognition tradition.
Reedy acknowledged that this represented a potential weakness in testing the cultural cognition hypothesis generally. “This is a good point,” he told Salon. The study relied on survey questions they were able to get included in a poll that was run by the University of Washington Department of Political Science, he explained. “So we were a bit limited on the space we had available in the survey,” and thus “didn’t have enough room to include more general measures of cultural orientation like Dan Kahan typically uses.”
But Kahan told Salon that he didn’t regard this as a significant problem. “To me, cultural cognition is a research program” concerned with how “people are forming their understandings about facts in relation to evidence in a particular way in political life and related domains,” he said. “Now, operationalizing it, there are different ways to do that,” which can include whatever tools happen to be available. Identifying it too narrowly with the two-factor model as an alternative to single-factor left/right model is “kind of missing the point,” Kahan said.
“Cultural cognition says there are these kind of affinities — you can’t directly observe them, there are different ways we can measure them — and makes a claim about how it is that they influence people’s information processing,” Kahan explained. He’s even used partisan affiliation himself, but that didn’t mean he wasn’t doing cultural cognition work. “I just see myself as using an alternative way to measure what the motivating affinities are.” Sometimes the affinities may be simpler; other times, more complex. So he had no problem with the value scales Reedy and his co-authors used.
In contrast to these recent innovations, the study also measured political knowledge, using the standard, decades-old approach of asking people to identify a mix of public officials and the parties in control of various bodies at the state and federal levels.
With those three measures explained — knowledge distortion, value orientation and political knowledge — we’re set to appreciate the study’s results.
“We set out to test these two competing views in a couple of different ways,” Reedy said. “First, we looked at whether someone’s political knowledge had a moderating effect on how much their values shaped their distorted factual beliefs — that is, whether people at higher levels of political knowledge were more likely to have values-based knowledge distortion than, say, people at lower levels of political knowledge. Our results here were a bit mixed. On one of the issues, more knowledgeable voters were indeed more likely to develop distorted factual beliefs.” This was the Landowner Compensation Policy. In this case, “those with greater political knowledge had more distorted empirical beliefs than did their low-knowledge counterparts,” the study reported.
“However, the other two issues fit more with cultural cognition theory, in that overall political knowledge was not important in the development of distorted factual beliefs,” Reedy said.
The second test produced more unified results. “We also tested whether people needed to be exposed to news media messages and campaign messages in order to pick up these distorted factual beliefs about political topics,” Reedy explained. “We found no connections between self-reported exposure to media and campaign messages and knowledge distortion, which gives more support to the cultural cognition view, that people are able to connect their values with a political issue regardless of media messages about that issue.”
This doesn’t necessarily mean that there were no such effects, however. As the study itself explains, “the most straightforward explanations for these findings are either that message effects are too small for samples such as ours to detect or that we lacked sufficiently sensitive message exposure measures. We relied on self-reported measures of media and campaign exposure, which are subject to errors of varying magnitude.”
But these sorts of caveats are always involved in scientific research, as even Kahan pointed out. Studies add weight to one view or another, nothing more. “It’s not like a study is kind of a definitive contest, or a duel, some position is going to shoot the other from a pace of however many yards,” he said.
“Third,” Reedy said, “we looked at the connection between people’s distorted factual beliefs and their views on the public policy issues related to those. We ran statistical tests to see the effects of the typical factors on someone’s opinion on a policy issue — things like their demographics and political values — and also the impact of knowledge distortion. We found that knowledge distortion did indeed have an independent effect on one’s policy preferences, which is more in keeping with the heuristic view of opinion formation.”
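The idea of an “independent effect” here is the ordinary regression notion: does the distortion measure still predict vote choice once values and other controls are in the model? A minimal sketch, on simulated data (the coefficients, sample size, and linear-probability setup below are assumptions for illustration, not the study’s actual model or data):

```python
# Illustrative sketch: does a distortion measure predict vote choice
# after controlling for values? Simulated data, linear probability
# model fit by ordinary least squares -- not the paper's actual model.
import numpy as np

rng = np.random.default_rng(0)
n = 2000

values = rng.normal(size=n)                       # issue-relevant value orientation
distortion = 0.5 * values + rng.normal(size=n)    # distortion correlates with values
# Simulated vote: values matter, but distortion also has its own effect.
vote = (0.8 * values + 0.6 * distortion + rng.normal(size=n) > 0).astype(float)

def ols(X, y):
    """Least-squares coefficients, with an intercept column prepended."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

b_reduced = ols(values.reshape(-1, 1), vote)              # values only
b_full = ols(np.column_stack([values, distortion]), vote) # values + distortion

# A nonzero distortion coefficient in the full model is the kind of
# "independent effect" the study reports (here it is built into the
# simulation, so the sketch only shows the test's logic).
print(round(float(b_full[2]), 3))
```

In the study, the analogous coefficient on knowledge distortion remained significant alongside demographics and values, which is what licenses the claim that distorted beliefs shape votes rather than merely shadowing values.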
This inference is open to dispute, however. Kahan himself was untroubled by the results, and the study itself said that “those with more knowledge more consistently expressed their value orientations through their votes,” which is one of the most important findings that Kahan has consistently stressed as validating his approach: those more engaged and more knowledgeable politically tend to be more polarized on issues like global warming, for example.
What is certain is that a growing community of researchers, and those who follow them, are developing an increasingly textured feel for the phenomenon of knowledge distortion — a phenomenon that not too long ago wasn’t even acknowledged to exist. Reedy himself expressed a similar sentiment. “Our strongest results, I think, are just confirming that this values-based political knowledge distortion is happening and it’s having an independent effect on one’s vote choice,” he concluded.
As the study itself said:
Whatever future refinements may be made to the values-based distortion model, the unsettling evidence remains that many voters are systematically misinformed on political issues, and those erroneous factual beliefs appear to influence how they mark their ballot on election day.
This is a disturbingly serious problem for a political system that purports to not only reflect the “will of the people,” but that also respects reality as a basic matter of course. “Facts are stubborn things,” John Adams observed — a true man of the Enlightenment. “Facts are stupid things,” Ronald Reagan famously misquoted him. It’s painfully obvious whose world we’re living in now. It’s a good deal less obvious how to escape. But thanks to folks like Reedy and Kahan, we’ve at least got a chance to start working on that.