The Half-Life of Facts
By Samuel Arbesman
When we start something new, we set a baseline, even if only subconsciously, and it’s easy to keep treating that baseline as the normal, “correct” state of affairs. But we mustn’t let it guide all of our thinking, because the result can be catastrophic.
Of course, shifting baseline syndrome can even affect us in smaller, more subtle ways. Alan Kay, a pioneering computer scientist, defined technology as “anything that was invented after you were born.” For many of us, this definition of technology captures the whiz-bang innovations of the Web browser and the iPad: anything that appeared recently and is different from what we are used to. In this way, we fail to notice all the older but equally important technologies around us, which can include everything from the pencil to window glass.
But factual inertia in general, even within a single life span, is all around us. Ever speak with a longtime New Yorker and ask for
subway directions? You’ll be saddled with information about taking the IND, BMT, and IRT, when you were hoping for something that would mention a numbered or lettered train. These mysterious acronyms are the names of the agencies—Independent Subway, Brooklyn-Manhattan Transit, Interborough Rapid Transit—that formerly ran the subways in New York City. Despite the unification of these competing systems, which began in the 1940s, many people still refer to them by their former names. Even if facts are changing at one rate, we might only be assimilating them at another.
Adhering to something we know (or at least knew), even in the face of change, is often the rule rather than the exception. On January 13, 1920, the New York Times ridiculed the ideas of Robert H. Goddard. Goddard, a physicist and pioneer in the field of rocketry, was at the time sponsored by the Smithsonian. Nonetheless, the Gray Lady argued in an editorial that the idea that any sort of rocket could ever work in the vacuum of space was essentially foolishness and a blatant disregard for a high school understanding of physics. The editors even went into reasonable detail in order to debunk Goddard.
Luckily, the Times was willing to print a correction. The only hitch: They printed it the day after Apollo 11’s launch in 1969. Three days before humans first walked on the moon, they recanted their editorial with this bit of understatement:
Further investigation and experimentation have confirmed the findings of Isaac Newton in the 17th century and it is now definitely established that a rocket can function in a vacuum as well as in an atmosphere. The Times regrets the error.
Why do we believe in wrong, outdated facts? There are lots of reasons. Kathryn Schulz, in her book Being Wrong, explores reason after reason why we make errors. Sometimes it has to do with our desire to believe a certain type of truth. Other times it has to do with being contrary (Schulz notes one surefire way of adhering to a certain viewpoint: Have a close relative take the opposite position). But oftentimes it is simply due to a certain amount of what I dub factual inertia: the tendency to adhere to out-of-date information well after it has lost its truth.
Factual inertia takes many forms, and these are described by the relatively recent field of evolutionary psychology. Evolutionary psychology, far from sweeping our biases under the rug, embraces them, and even tries to understand the evolutionary benefit that might have accrued to what may be viewed as deficits.
So what forms can factual inertia take? Look to the lyrics of Bradley Wray.
In December 2009, Bradley Wray was preparing his high school students for a test in his Advanced Placement psychology class. Wray developed a moderately catchy song for his students that would help them review the material and posted a video of it online.
What was the topic of this song? Cognitive bias. There is a whole set of psychological quirks we are saddled with as part of our evolutionary baggage. While these quirks might have helped us on the savannah to figure out how the seasons change and where food might be year after year, they are not always the most useful in our interconnected, highly complex, and fast-moving world. These quirks are known as cognitive biases, and there are lots of them, creating a publishing cottage industry devoted to chronicling them.
As sung by Wray, here are a couple (the lyrics are far from Grammy quality):
I’m biased because I put you in a category in which you may or may not belong
Representativeness Bias: don’t stereotype this song….
I’m biased because I take credit for success, but no blame for failure.
Self-Serving Bias: my success and your failure.
These biases are found throughout our lives. Many people are familiar with self-serving bias, even if they might not realize it: it happens all the time in sports. In hockey or soccer, if the team wins, the goal scorer is lauded. But if the team loses? The goalie gets the short end of the deal. The other players are the beneficiaries of a certain amount of self-serving bias—praise for success, without the burden of failure—at least that’s how the media portray it, even if the players themselves are not subject to this cognitive bias. Well over a hundred of these biases have been cataloged.
. . .
In the 1840s, Ignaz Semmelweis was a noted physician with a keen eye. While he was a young obstetrician working in the hospitals of Vienna, he noticed a curious difference between mothers who delivered in his division of the hospital and those who delivered at home or with midwives in the other part of the hospital. Women whose babies were delivered by the hospital’s physicians had a much higher incidence of childbed fever, a disease that often caused women to die shortly after childbirth, than women who delivered with midwives. Specifically, Semmelweis realized that the parts of the hospital where obstetricians did not also perform autopsies had rates of childbed fever as low as those of home deliveries.
Ignaz Semmelweis argued that the doctors—who weren’t just performing autopsies in addition to deliveries but were actually going directly from the morgue to the delivery room—were somehow spreading something from the cadavers to the women giving birth, leading to their deaths.
Semmelweis made a simple suggestion: Doctors performing deliveries should wash their hands with a solution of chlorinated lime beforehand. And this worked. It lowered the cases of childbed fever to one tenth the original amount.
However, rather than being lauded for an idea that saved lives at essentially no cost, Semmelweis found that his ideas failed to gain traction. Some of this was due to the fact that there was no germ theory at the time to explain the spread of the disease; those who had a stake in the prevailing theories therefore refused to recognize that Semmelweis was making important points. And fawning discussions of Semmelweis leave it at that: he was ignored by the medical establishment because his ideas did not fit the then-current worldview. However, it seems that the main reasons his theories did not catch on were more complicated and perhaps even self-inflicted, ranging from his reluctance to publish to the paucity of some of his evidence.
Whatever the true reasons for the failure of his ideas, this tendency to ignore information simply because it does not fit within one’s worldview is troubling, and it is the tendency most often associated with the story of Semmelweis. It is related to the converse situation, confirmation bias, in which we absorb only the information that adheres to our worldview.
These biases are important aspects of our factual inertia. Even if we are confronted with facts that should cause us to update our understanding of the way the world works, we often neglect to do so. We persist in only adding facts to our personal store of knowledge that jibe with what we already know, rather than assimilate new facts irrespective of how they fit into our worldview. This is akin to Daniel Kahneman’s idea of theory-induced blindness: “an adherence to a belief about how the world works that prevents you from seeing how the world really works.”
In general, these biases are useful. They let us quickly fill in gaps in what we don’t know and help us extrapolate from a bit of information so we can make quick decisions. When it comes to what we can literally see, our ancestors no doubt did this quite often. For example, we could expect the top of a tree to look like the tops of other trees we had seen before, even if it was obscured from view. And if it didn’t look quite right, we could still fit the discrepancy into our mental picture of the world (perhaps it looked strange because there was a monkey up there). But when it comes to properly evaluating truth and facts, we often bump up against this sort of bias.
Confirmation bias is only one of many cognitive biases, and it is related to another problem of our mental machinery: change blindness. This refers to a quirk of our visual-processing system: when we concentrate on one thing or task very intently, we ignore everything else, even things that are important or, at the very least, surprising. A series of seminal experiments in this field was conducted by Christopher Chabris and Daniel Simons, professors at Union College and the University of Illinois, respectively. You’ve probably seen their experiments, in the form of fun little videos online.
In one, subjects are shown a video of individuals in a gymnasium. The people in the video begin passing basketballs to one another and the subjects are supposed to keep track of the types of passes (such as bounce passes) or who passes to whom, since the players have different colored jerseys.
Then something intriguing happens. Partway through the video, a woman dressed in a full-body gorilla suit walks among the basketball players. She stops in the center, beats her chest in true gorilla style, and continues walking through the players. Of course, this is surprising and strange and all kinds of adjectives that describe something very different from a normal group of people passing basketballs to one another.
But here’s the startling thing: 50 percent of the observers of this video miss the gorilla entirely. This change blindness, also known as inattentional blindness, is a quirk of our information-processing system. When looking for one thing, we completely ignore everything else around us.
This bug is turned into a feature by magicians, who exploit our change blindness through the use of misdirection. A magician gets you to concentrate on his left hand while his right hand is doing all the important sleight of hand. This kind of thing can fool even trained magicians who are trying to learn the illusion.
One common way that magicians learn a new trick is through an instructional video. The magician will show you the trick through the eyes of the spectator, then explain it, show it again, and then show it from a different perspective, or at least more slowly. An illusion I once observed involved the use of a thumb tip—a false rubber thumb that can be used to conceal various objects, such as handkerchiefs. After the magician showed the trick, he informed the viewer that, to make it easy to follow, he had used a bright red thumb tip.
Upon hearing this, I was shocked. I had been so focused on all the other aspects of the illusion, and the magician’s use of misdirection, that I had entirely missed what was right before my eyes: a ridiculous red artificial thumb. I had been a victim of change blindness.
Change blindness in the world of facts and knowledge is also a problem. Sometimes we are exposed to new facts and simply filter them out. But more often we have to go out of our way in order to learn something new. Our blindness is not a failure to see the new fact; it’s a failure to see that the facts in our minds have the potential to be out-of-date at all. It’s a lot easier to keep on quoting a fact you learned a few years ago, after having read it in a magazine, than to decide it’s time to take a closer look at the current ten largest cities in the United States, for example, and notice that they are far different from what we learned when we were younger.
But whichever bias we are subject to, factual inertia permeates our entire lives.
. . .
A clear example of how we often neglect to respond to change is the way we write the date or the year on documents.
Have you ever written the wrong year during the first weeks of January? This happens in everything from homework assignments to legal documents. This can even happen in the extreme. On May 24, 2011, President Barack Obama visited the United Kingdom. While making a stop at Westminster Abbey, Obama decided to sign the guestbook. He wrote a very nice little note about the special relationship that the United States shares with Great Britain. The only problem was that he dated his signature May 24, 2008. Perhaps he hadn’t had to write the date since he won election three years earlier. Either way, the inability to respond to change is not an issue only for the everyday; it reaches all the way to the top. Thankfully, even the law has taken into account our foibles and our inability to always update our facts. In courts, intent is what matters, and not unconscious muscle memory, so if you do this on a legal document, you’re generally fine.
I decided to conduct a simple experiment to actually get a handle on people’s factual inertia. To do this, I used a Web site created by Amazon called Mechanical Turk. The label Mechanical Turk derives from a well-known hoax from the eighteenth and nineteenth centuries. The Turk was a complex device that was displayed all throughout Europe. While appearing to be a chess-playing automaton, the Turk actually had a person in a hidden compartment, controlling the machine.
In homage to this, Amazon named its online labor market—a clearinghouse for simple tasks that humans can easily perform but computers cannot—Mechanical Turk. These tasks include things like labeling photographs as they are posted, and Turkers, as the laborers are called, will often solve these problems for pennies. Mechanical Turk has recently become a wonderful test bed for social science experiments, due to its large supply of subjects, the low wages required, and the fast turnaround time for running an experiment. While its workers are certainly not a perfectly representative sample of humanity, they are much better than most traditional experimental populations, which generally consist of college undergraduates.