Why Google only tells you what you already know

Researchers find that when you search the Web, you're likely to find information that confirms what you'd been thinking all along.

Published January 29, 2008 10:49PM (EST)

"True Enough: Learning to Live in a Post-Fact Society" is a book near and dear to my heart. That's because, wouldn't you know, it's my book! I wrote it, and will be discussing it here a bit in advance of its publication in March.

The book examines a question that's long captivated me, a child of the Internet: Is digital technology advancing truth in the world, or is it distorting it?

By truth, I mean sets of observable, objective, empirical "facts." You might argue -- and many do -- that wide access to information has the capacity to create a more knowledgeable, more tolerant, more rational society.

On the other hand, there's the "9/11 Truth" movement, Swift Boating, and the rumor that Barack Obama is a Muslim -- evidence for the theory that digitization inevitably abets rumor, propaganda, and myth at the expense of truth.

After studying many surveys, psychological and sociological experiments, and economic theories, I come down, in "True Enough," on the negative side: The Internet, as wonderful as it is, isn't as wonderful as we wish -- technology, as my book's jacket flap says, is prompting "the cultural ascendancy of belief over fact." More on this idea later, I promise.

What I want to share today is a new study that supports my thesis. Researchers at the University of New South Wales in Australia set out to determine whether common cognitive biases affect people who are searching on the Web. The chief such bias is what they call the anchoring effect -- the idea that our prior beliefs affect how we process new information.

When you're searching Google -- whether for information on 9/11, the war in Iraq, or the best way to treat athlete's foot -- do you do so with an open mind?

Do you alter your views based on what you find? Or are you more likely to examine what you find through the lens of your previously held views -- and to use the results to reinforce your views, even if they don't naturally jibe?

The new study -- by Enrico Coiera and Annie Lau, and published in the Journal of the American Medical Informatics Association -- suggests it's the latter.

Coiera and Lau focused on health information. They asked two sets of people -- a group of health professionals (doctors and nurses) and another of ordinary folks -- a series of health trivia questions.

The professionals got questions like "Is there evidence for increased risk of sudden infant death syndrome (SIDS) in siblings of a baby who died of SIDS?" and "Is there evidence for increased breast and cervical cancer risk after in-vitro fertilization treatment?" (The correct answers: yes and no.)

The ordinary folks got questions like "Is it likely that we can get AIDS from a mosquito bite?" (no) and "Can you catch Hepatitis B from kissing on the cheek?" (also no).

People were asked to answer the questions and to record their level of confidence in their answers. Then they were asked to do a search of online health databases to find the correct answers to the questions; they were given a chance to change their answers if they found any new information that contradicted their original views.

Both the health professionals and the regular people were reluctant to change their views, Coiera and Lau found. The result is pretty remarkable: people who answered a question incorrectly before doing a Web search were more likely to still be wrong after the search than those who had answered correctly -- if searching alone set people straight, both groups would have converged on the right answers.

For the health professionals, the researchers also found that the more confident people were in their answers, the more likely they were to stick to those answers after conducting a search. In other words, the strength of their prior beliefs affected their openness to new information.

In a press release, the researchers say their findings suggest risks for people who turn to the Web for health information. "Even if people read the right material, they are stubborn to changing their views," Coiera says. "This means that providing people with the right information on its own may not be enough."

But really, the results would seem to indicate dangers for all of us, no matter what we're searching for. Google's high-flying mission is to "organize the world's information and make it universally accessible and useful." But universal access doesn't necessarily translate into universal understanding.


By Farhad Manjoo

Farhad Manjoo is a Salon staff writer and the author of True Enough: Learning to Live in a Post-Fact Society.
