Facts change, people don't

Science and technology keep advancing. But too many of us stubbornly cling to outdated ways of looking at the world

Published September 25, 2012 3:42PM (EDT)

Excerpted from "The Half Life of Facts: Why Everything We Know Has an Expiration Date"

Alan Kay, a pioneering computer scientist, defined technology as “anything that was invented after you were born.” For many of us, this definition of technology captures the whiz-bang innovations of the Web browser and the iPad: anything that appeared recently and is different from what we are used to. In this way, we fail to notice all the older but equally important technologies around us, which can include everything from the pencil to window glass.

But factual inertia in general, even within a single life span, is all around us. Ever speak with a longtime New Yorker and ask for subway directions? You’ll be saddled with information about taking the IND, BMT and IRT, when you were hoping for something that would mention a numbered or lettered train. These mysterious acronyms are the names of the agencies — Independent Subway, Brooklyn-Manhattan Transit, Interborough Rapid Transit — that formerly ran the subways in New York City. Despite the unification of these competing systems, which began in the 1940s, many people still refer to them by their former names. Even if facts are changing at one rate, we might only be assimilating them at another.

Adhering to something we know (or at least knew), even in the face of change, is often the rule rather than the exception. On Jan. 13, 1920, the New York Times ridiculed the ideas of Robert H. Goddard. Goddard, a physicist and pioneer in the field of rocketry, was at the time sponsored by the Smithsonian. Nonetheless, the Gray Lady argued in an editorial that the idea that any sort of rocket could ever work in the vacuum of space was essentially foolishness and a blatant disregard for a high school understanding of physics. The editors even went into reasonable detail to debunk Goddard.

Luckily, the Times was willing to print a correction. The only hitch: They printed it the day after Apollo 11’s launch in 1969. Three days before humans first walked on the Moon, they recanted their editorial with this bit of understatement:

Further investigation and experimentation have confirmed the findings of Isaac Newton in the 17th century and it is now definitely established that a rocket can function in a vacuum as well as in an atmosphere. The Times regrets the error.

Why do we believe in wrong, outdated facts? There are lots of reasons. Kathryn Schulz, in her book "Being Wrong," explores reason after reason why we make errors. Sometimes it has to do with our desire to believe a certain type of truth. Other times it has to do with being contrary. (Schulz notes one surefire way of adhering to a certain viewpoint: Have a close relative take the opposite position.) But oftentimes it is simply due to a certain amount of what I dub factual inertia: the tendency to adhere to out-of-date information well after it has lost its truth.

In the 1840s, Ignaz Semmelweis was a noted physician with a keen eye. While working as a young obstetrician in the hospitals of Vienna, he noticed a curious difference between mothers who delivered in his division of the hospital and those who delivered at home or with midwives in the other part of the hospital. Those whose babies were delivered by the hospital’s physicians had a much higher incidence of childbed fever, a disease that often kills a woman shortly after childbirth, than did the women delivering with midwives. Specifically, Semmelweis realized that the parts of the hospital where the obstetricians did not also perform autopsies had rates of childbed fever as low as those of home deliveries.

Ignaz Semmelweis argued that the doctors — who weren’t just performing autopsies in addition to deliveries, but were actually going directly from the morgue to the delivery room — were somehow spreading something from the cadavers to the women giving birth, leading to their deaths.

Semmelweis made a simple suggestion: Doctors performing deliveries should wash their hands with a solution of chlorinated lime beforehand. And this worked. It lowered the incidence of childbed fever to one-tenth its original level.

However, rather than being lauded for an idea that saved lives at essentially no cost, Semmelweis was ostracized. In the mid-19th century there was no germ theory; instead, the dominant paradigm was a theory of biology that blamed disease on imbalances of “humors.” If you’ve ever noted that someone is in a “good humor,” that phrase is a vestige of this bygone medical idea. So the medical establishment for the most part ignored Semmelweis. This quite likely drove him mad, and he spent his final years in an asylum.

This tendency to ignore information simply because it does not fit within one’s worldview is now known as the Semmelweis reflex, or the Semmelweis effect. It is related to its converse, confirmation bias, in which you take in only information that accords with your worldview.

The Semmelweis reflex and confirmation bias are important aspects of our factual inertia. Even when we are confronted with facts that should cause us to update our understanding of the way the world works, we often neglect to do so. We persist in adding to our personal store of knowledge only those facts that jibe with what we already know, rather than assimilating new facts irrespective of how they fit into our worldview. This is akin to Daniel Kahneman’s idea of theory-induced blindness: “an adherence to a belief about how the world works that prevents you from seeing how the world really works.”

In general, these biases are useful. They let us quickly fill in gaps in what we don’t know or help us extrapolate from a bit of information so we can make quick decisions. When it comes to what we can literally see, our ancestors no doubt did this quite often. For example, they could expect the top of a tree to look like the tops of other trees they had seen, even if it was obscured from view. And if it didn’t look right, it could still fit into their mental worldview (for example, it looked strange because there was a monkey up there). But when it comes to properly evaluating truth and facts, we often bump up against this sort of bias.

Whichever bias we are subject to, factual inertia permeates our entire lives.

Excerpted from "THE HALF-LIFE OF FACTS: WHY EVERYTHING WE KNOW HAS AN EXPIRATION DATE" by Samuel Arbesman. Published by Current, an imprint of Penguin Group (USA). Copyright © Samuel Arbesman, 2012.


By Samuel Arbesman
