Dreams of Earth and Sky
Freeman Dyson
Cognitive illusions are the main theme of Kahneman's book. A cognitive illusion is a false belief that we intuitively accept as true. The illusion of validity is a false belief in the reliability of our own judgment. The interviewers in Kahneman's Israeli army story sincerely believed that they could predict the performance of recruits after talking with them for fifteen minutes. Even after the interviewers had seen the statistical evidence that their belief was an illusion, they still could not help believing it. Kahneman confesses that he himself still experiences the illusion of validity, after fifty years of warning other people against it. He cannot escape the illusion that his own intuitive judgments are trustworthy.
An episode from my own past is curiously similar to Kahneman’s experience in the Israeli army. I was a statistician before I became a scientist. At the age of twenty I was doing statistical analysis of the operations of the British Bomber Command in World War II. The command was then seven years old, like the State of Israel in 1955. All its institutions were under construction. It consisted of six bomber groups that were evolving toward operational autonomy. Air Vice-Marshal Sir Ralph Cochrane was the commander of 5 Group, the most independent and effective of the groups. Our bombers were then taking heavy losses, the main cause of loss being the German night fighters.
Cochrane said the bombers were too slow, and the reason they were too slow was that they carried heavy gun turrets that increased
their aerodynamic drag and lowered their operational ceiling. Because the bombers flew at night, they were normally painted black. Being a flamboyant character, Cochrane announced that he would like to take a Lancaster bomber, rip out the gun turrets and all the associated dead weight, ground the two gunners, and paint the whole thing white. Then he would fly it over Germany, and fly so high and so fast that nobody could shoot him down. Our commander in chief did not approve of this suggestion, and the white Lancaster never flew.
The reason why our commander in chief was unwilling to rip out gun turrets, even on an experimental basis, was that he was blinded by the illusion of validity. This was ten years before Kahneman discovered it and gave it its name, but the illusion of validity was already doing its deadly work. All of us at Bomber Command shared the illusion. We saw every bomber crew as a tightly knit team of seven, with the gunners playing an essential role defending their comrades against fighter attack, while the pilot flew an irregular corkscrew to defend them against flak. An essential part of the illusion was the belief that the team learned by experience. As they became more skillful and more closely bonded, their chances of survival would improve.
When I was collecting the data in the spring of 1944, the chance of a crew reaching the end of a thirty-operation tour was about 25 percent. The illusion that experience would help them to survive was essential to their morale. After all, they could see in every squadron a few revered and experienced old-timer crews who had completed one tour and had volunteered to return for a second tour. It was obvious to everyone that the old-timers survived because they were more skillful. Nobody wanted to believe that the old-timers survived only because they were lucky.
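As a rough illustration of what that figure implies, if we assume a constant and independent risk on every operation (an assumption of mine, not a statement of the actual records), then
$$
p^{30} = 0.25 \quad\Rightarrow\quad p = 0.25^{1/30} \approx 0.955,
$$
so the loss rate would have been roughly four and a half percent on each operation, the same odds facing every crew every time they flew.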
At the time Cochrane made his suggestion of flying the white Lancaster,
I had the job of examining the statistics of bomber losses. I did a careful analysis of the correlation between the experience of the crews and their loss rates, subdividing the data into many small packages so as to eliminate effects of weather and geography. My results were as conclusive as those of Kahneman. There was no effect of experience on loss rate. So far as I could tell, whether a crew lived or died was purely a matter of chance. Their belief in the lifesaving effect of experience was an illusion.
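For readers who want to see the shape of such an analysis, here is a minimal sketch in Python using entirely synthetic sortie records; the strata, the experience threshold, and the 4.5 percent loss rate are illustrative assumptions, not the actual Bomber Command data.

```python
import random
from collections import defaultdict

random.seed(1)

# Synthetic sortie records: each has a stratum (a small "package" of similar
# weather and target geography), the crew's prior experience in operations,
# and whether the aircraft was lost. The loss risk here is deliberately
# independent of experience, mirroring the finding described in the text.
sorties = [
    {
        "stratum": (weather, region),
        "experience": random.randint(0, 29),
        "lost": random.random() < 0.045,   # ~4.5% loss per operation (assumed)
    }
    for weather in ("clear", "cloud")
    for region in ("Ruhr", "Berlin", "coast")
    for _ in range(2000)
]

# Within each package, compare loss rates of novice and veteran crews,
# so that weather and geography cannot masquerade as an experience effect.
by_stratum = defaultdict(lambda: {"novice": [0, 0], "veteran": [0, 0]})
for s in sorties:
    group = "veteran" if s["experience"] >= 15 else "novice"
    cell = by_stratum[s["stratum"]][group]
    cell[0] += s["lost"]   # losses in this package and experience group
    cell[1] += 1           # sorties flown by this group in this package

for stratum, groups in sorted(by_stratum.items()):
    rates = {g: lost / n for g, (lost, n) in groups.items()}
    print(stratum, {g: f"{r:.1%}" for g, r in rates.items()})
```

If experience mattered, the veteran loss rate would sit consistently below the novice rate within each package; when chance alone is at work, the two rates differ only by sampling noise.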
The demonstration that experience had no effect on losses should have given powerful support to Cochrane’s idea of ripping out the gun turrets. But nothing of the kind happened. As Kahneman found out later, the illusion of validity does not disappear just because facts prove it to be false. Everyone at Bomber Command, from the commander in chief to the flying crews, continued to believe in the illusion. The crews continued to die, experienced and inexperienced alike, until Germany was overrun and the war finally ended.
Another theme of Kahneman’s book, proclaimed in the title, is the existence in our brains of two independent systems for organizing knowledge. Kahneman calls them System 1 and System 2. System 1 is amazingly fast, allowing us to recognize faces and understand speech in a fraction of a second. It must have evolved from the ancient little brains that allowed our agile mammalian ancestors to survive in a world of big reptilian predators. Survival in the jungle requires a brain that makes quick decisions based on limited information. Intuition is the name we give to judgments based on the quick action of System 1. It makes judgments and takes action without waiting for our conscious awareness to catch up with it. The most remarkable fact about System 1 is that it has immediate access to a vast store of memories that it uses as a basis for judgment. The memories that are most accessible are those associated with strong emotions, with fear and pain and hatred. The resulting judgments are
often wrong, but in the world of the jungle it is safer to be wrong and quick than to be right and slow.
System 2 is the slow process of forming judgments based on conscious thinking and critical examination of evidence. It appraises the actions of System 1. It gives us a chance to correct mistakes and revise opinions. It probably evolved more recently than System 1, after our primate ancestors became arboreal and had the leisure to think things over. An ape in a tree is not so much concerned with predators as with the acquisition and defense of territory. System 2 enables a family group to make plans and coordinate activities. After we became human, System 2 enabled us to create art and culture.
The question then arises: Why do we not abandon the error-prone System 1 and let the more reliable System 2 rule our lives? Kahneman gives a simple answer to this question: System 2 is lazy. To activate System 2 requires mental effort. Mental effort is costly in time and also in calories. Precise measurements of blood chemistry show that consumption of glucose increases when System 2 is active. Thinking is hard work, and our daily lives are organized so as to economize on thinking. Many of our intellectual tools, such as mathematics and rhetoric and logic, are convenient substitutes for thinking. So long as we are engaged in the routine skills of calculating and talking and writing, we are not thinking, and System 1 is in charge. We only make the mental effort to activate System 2 after we have exhausted the possible alternatives.
System 1 is much more vulnerable to illusions, but System 2 is not immune to them. Kahneman uses the phrase “availability bias” to mean a biased judgment based on a memory that happens to be quickly available. It does not wait to examine a bigger sample of less cogent memories. A striking example of availability bias is the fact that sharks save the lives of swimmers. Careful analysis of deaths in
the ocean near San Diego shows that on average, the death of each swimmer killed by a shark saves the lives of ten others. Every time a swimmer is killed, the number of deaths by drowning goes down for a few years and then returns to the normal level. The effect occurs because reports of death by shark attack are remembered more vividly than reports of drownings. System 1 is strongly biased, paying more prompt attention to sharks than to riptides that occur more frequently and may be equally lethal. In this case, System 2 probably shares the same bias. Memories of shark attacks are tied to strong emotions and are therefore more available to both systems.
Kahneman is a psychologist who won a Nobel Prize in Economics. His great achievement was to turn psychology into a quantitative science. He made our mental processes subject to precise measurement and exact calculation, by studying in detail how we deal with dollars and cents. By making psychology quantitative, he incidentally achieved a powerful new understanding of economics. A large part of his book is devoted to stories illustrating the various illusions to which supposedly rational people succumb. Each story describes an experiment, examining the behavior of students or citizens who are confronted with choices under controlled conditions. The subjects make decisions that can be precisely measured and recorded. The majority of the decisions are numerical, concerned with payments of money or calculations of probability. The stories demonstrate how far our behavior differs from the behavior of the mythical “rational actor” who obeys the rules of classical economics.
A typical example of a Kahneman experiment is the coffee mug experiment, designed to measure a form of bias that he calls the “endowment effect.” The endowment effect is our tendency to value an object more highly when we own it than when someone else owns it. Coffee mugs are intended to be useful as well as elegant, so that people
who own them become personally attached to them. A simple version of the experiment has two groups of people, sellers and buyers, picked at random from a population of students. Each seller is given a mug and invited to sell it to a buyer. The buyers are given nothing and are invited to use their own money to buy a mug from a seller. The average prices offered in a typical experiment were: sellers $7.12, buyers $2.87. Because the price gap was so large, few mugs were actually sold.
The experiment convincingly demolished the central dogma of classical economics. The central dogma says that in a free market, buyers and sellers will agree on a price that both sides regard as fair. The dogma is true for professional traders trading stocks in a stock market. It is untrue for nonprofessional buyers and sellers because of the endowment effect. Trading that should be profitable to both sides does not occur, because most people do not think like traders.
Our failure to think like traders has important practical consequences, for good and for evil. The main consequence of the endowment effect is to give stability to our lives and institutions. Stability is good when a society is peaceful and prosperous. Stability is evil when a society is poor and oppressed. The endowment effect works for good in the German city of Munich. I once rented a home there for a year, a few miles from the city center. Across the street from our home was a real farm with potato fields and pigs and sheep. The local children, including ours, went out to the fields after dark, made little fires in the ground, and roasted potatoes. In a free-market economy, the farm would have been sold to a developer and converted into a housing development. The farmer and the developer would both have made a handsome profit. But in Munich, people were not thinking like traders. There was no free market in land. The city valued the farm as public open space, allowing city dwellers to walk over grass all the way to the city center, and allowing our children to roast potatoes at night. The endowment effect allowed the farm to survive.
In poor agrarian societies, such as Ireland in the nineteenth century or much of Africa today, the endowment effect works for evil because it perpetuates poverty. For the Irish landowner and the African village chief, possessions bring status and political power. They do not think like traders, because status and political power are more valuable than money. They will not trade their superior status for money, even when they are heavily in debt. The endowment effect keeps the peasants poor and drives those of them who think like traders to emigrate.
At the end of his book, Kahneman asks the question: What practical benefit can we derive from an understanding of our irrational mental processes? We know that our judgments are heavily biased by inherited illusions, which helped us to survive in a snake-infested jungle but have nothing to do with logic. We also know that, even when we become aware of the bias and the illusions, the illusions do not disappear. What use is it to know that we are deluded, if the knowledge does not dispel the delusions?
Kahneman answers this question by saying that he hopes to change our behavior by changing our vocabulary. If the names that he invented for various common biases and illusions, “illusion of validity,” “availability bias,” “endowment effect,” and others that I have no space to describe here, become part of our everyday vocabulary, then he hopes to see the illusions lose their power to deceive us. If we use these names every day to criticize our friends’ mistaken judgments and to confess our own, then perhaps we will learn to overcome our illusions. Perhaps our children and grandchildren will grow up using the new vocabulary and will automatically correct their congenital biases when making judgments. If this miracle happens, then future generations will owe a big debt to Kahneman for giving them a clearer vision.
One thing that is notably absent from Kahneman’s book is the
name of Sigmund Freud. In thirty-two pages of endnotes there is not a single reference to his writings. This omission is certainly no accident. Freud was a dominating figure in the field of psychology for the first half of the twentieth century, and a fallen tyrant for the second half of the century. In the article on Freud in Wikipedia, we find quotes from the Nobel Prize–winning immunologist Peter Medawar—psychoanalysis is the “most stupendous intellectual confidence trick of the twentieth century”—and from Frederick Crews:
Step by step, we are learning that Freud has been the most overrated figure in the entire history of science and medicine—one who wrought immense harm through the propagation of false etiologies, mistaken diagnoses, and fruitless lines of enquiry.
In these quotes, emotions are running high. Freud is now hated as passionately as he was once loved. Kahneman evidently shares the prevalent repudiation of Freud and of his legacy of writings.
Freud wrote two books, The Psychopathology of Everyday Life in 1901 and The Ego and the Id in 1923, which come close to preempting two of the main themes of Kahneman's book. The psychopathology book describes the many mistakes of judgment and of action that arise from emotional bias operating below the level of consciousness. These "Freudian slips" are examples of availability bias, caused by memories associated with strong emotions. The Ego and the Id describes two levels of the mind that are similar to the System 2 and System 1 of Kahneman, the ego being usually conscious and rational, the id usually unconscious and irrational.