Of course, the errors we make are very numerous. In the words of one psychologist, we can fall short, overreach, skitter off the edge, miss by a mile, take our eyes off the prize, or throw the baby out with the bathwater. And we can exaggerate our accomplishments, diminish our defects, and do the reverse with those of others. Many of these errors may serve self-deceptive functions, but not all. Sometimes when we take our eyes off the prize, we have only been momentarily distracted; sometimes when we miss by a mile, we have only (badly) miscalculated. At other times, it is precisely our intention to throw out the baby with the bathwater or to miss by a mile. So in principle we have to scrutinize our biases to see which ones serve the usual goal of self-enhancement, or in some other fashion the deception of others, and which ones subserve the function of rational calculation in our direct self-interest.

DENIAL AND PROJECTION

Denial and projection are fundamental psychological processes—the deletion (or negation) of reality and the creation of new reality. The one virtually requires the other. Projecting reality may require deleting some, while denial tends to create a hole in reality that needs to be filled. For example, denial of personal malfeasance may by necessity require projection onto someone else. Once years ago while driving I took a corner too sharply and my one-year-old baby fell over in the backseat and started to cry. I heard myself harshly berating her nine-year-old sister (my stepdaughter) for not supporting her—as if she should know by now that I like to take my corners on two wheels. The very harshness of my voice served to signal that something was amiss. Surely the child’s responsibility in this misdemeanor was, at most, 10 percent, the remaining 90 percent lying with me, but since I was denying my own portion, she had to endure a tenfold increase in hers. It is as if there is a “responsibility equation” such that decrease of one portion must necessarily be matched by an increase elsewhere.
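
A minimal way to make this hypothetical “responsibility equation” explicit (the notation is mine; nothing this tidy is claimed in the text): if total responsibility for an incident is conserved at 100 percent, then whatever share one party denies must reappear in the other’s:

$$R_{\text{me}} + R_{\text{child}} = 100\%, \qquad \Delta R_{\text{child}} = -\,\Delta R_{\text{me}}$$

Denying my 90 percent share pushed hers from 10 percent to 100 percent, the tenfold increase just described.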

A rather more serious example of denial and projection concerns 9/11. Any major disaster has multiple causes and multiple responsible parties. There’s nothing wrong with assigning the lion’s share of cause and responsibility to Osama bin Laden and his men, but what about creating a larger picture that looks back over time and includes us (US citizens) in the model, not so much directly causing it as failing to prevent it? If we were capable of self-criticism, what would we admit to? How did we, however indirectly, contribute to this disaster? Surely through repeated inattention to airline safety (see Chapter 9) but also in our foreign policy.

This final admission is often hardest to make and is almost never made publicly, but sensible societies can sometimes use it to guide behavior after the fact in a useful way. It is easy for personal biases to affect one’s answer here, but I will set out what seem to me to be obvious questions. To wit, are there no legitimate grievances against the United States and its reckless and sometimes genocidal (Cambodia, Central America) foreign policy in the past fifty years? Is there any chance that our blind backing of Israel (like all our “client states”: right or wrong, you’re our boys) has unleashed some legitimate anger elsewhere, among, say, Palestinians, Lebanese, Syrians, and those who identify with them or with justice itself? In other words, is 9/11 a signal to us that perhaps we should look at our foreign policy more critically, and from the viewpoint of multiple others, not just the usual favored few? One need not mention this in public but can start to make small adjustments in private. Again, the larger message is that exterminating one’s enemies is not the only useful counterresponse to their actions, but it becomes so if one’s own responsibility is completely denied and self-criticism aborted.

DENIAL IS SELF-REINFORCING

Denial is also self-reinforcing—once you make that first denial, you tend to commit to it: you will deny, deny the denial, deny that, and so on. In the voice-recognition experiments, not only do deniers deny their own voice, they also deny the denial. A person decides that an article on which he is a coauthor is not fraudulent. To do so, he must deny the first wave of incoming evidence, as he duly does. Then comes the second wave. Cave in? Admit fault and cut his losses? Not too likely. Not when he can deny once more and perhaps cite new evidence in support of denial—evidence to which he becomes attached in the next round. He is doubling down at each turn—double or nothing—and as nothing is what he would have gotten at the very beginning, with no cost, he is tempted to justify each prior mistake by doubling down again. Denial leads to denial, with potential costs mounting at each turn.
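
To see how fast the stakes can mount, consider a back-of-the-envelope sketch (the numbers are purely illustrative, not drawn from any study): suppose admitting fault at the outset costs nothing, exposure after a single denial costs some amount $c$, and each further round of denial roughly doubles the eventual cost of exposure. Then the potential cost after $n$ rounds of denial is

$$C(0) = 0, \qquad C(n) = 2^{\,n-1}\,c \quad \text{for } n \ge 1,$$

so by the fifth round the denier is defending a position sixteen times costlier than the one he first refused to surrender, which is exactly why each successive denial gets harder to walk back.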

In trading stocks, the three most important rules are “cut your losses, cut your losses, and cut your losses.” This is difficult advice to follow, because there is natural resistance. Benefits are nice; we like to enjoy them. But to do so, we must sell a stock after it has risen in value; only then can we enjoy the profit. At the same time, we are risk averse: loss feels bad and is to be avoided. One way to avoid a loss is to hold the stock after it has fallen; the loss is only on paper, and the stock may soon rebound. Of course, as it sinks lower, one may wish to hold it longer still. This style of trading eventually puts one in a most unenviable position, holding a portfolio of losers. Indeed, this is exactly what happens: people trading on their own accounts tend to sell their good stocks, buy less good ones, and hold on to their bad ones. Instead, “cut your losses, cut your losses, cut your losses.”

YOUR AGGRESSION, MY SELF-DEFENSE

One of the most common cases of denial coupled with projection concerns aggression—who is responsible for the fight? By adding one earlier action by the other party, we can always push causality back one link, and memory is notoriously weak when it comes to chronological order.

An analogy can be found in animal species that have evolved to create the illusion that they are oriented 180 degrees in the opposite direction and are moving backward instead of forward. For example, a beetle has its very long antennae slung underneath its body so they protrude out the back end, creating the illusion of a head. When attacked, usually at the apparent “head” end (that is, the tail), it rushes straight forward, exactly the opposite of what is expected, helping it to escape. Likewise, there are fish with two large, false eyespots on the rear end of their body, creating the illusion that the head is located there. These fish feed at the bottom, moving slowly backward, but again, when attacked at the apparent “head” end, they take off rapidly in the opposite direction. What is notable here is that the opposite of the truth (180 degrees) is more plausible than a smaller deviation from the truth (say, a 20-degree difference in angle of motion). And so also in human arguments. Is this an unprovoked attack, or a defensive response to an unprovoked attack? Is causation going in this direction, or 180 degrees opposite? “Mommy, he started it.” “Mommy, she did.”

COGNITIVE DISSONANCE AND SELF-JUSTIFICATION

Cognitive dissonance refers to an internal psychological contradiction that is experienced as a state of tension or discomfort, ranging from minor pangs to deep anguish; people will often act to reduce it. The individual is seen to hold two cognitions (ideas, attitudes, or beliefs) that are inconsistent: “Smoking will kill you, and I smoke two packs a day.” The contradiction could be resolved by giving up cigarettes or by rationalizing their use: “They relax me, and they prevent weight gain.” Most people jump to the latter task and start generating self-justification rather than face the much more difficult (if healthier) choice. But sometimes there is only one choice, because the cost has already been suffered: you can rationalize it or live with the truth.

Take a classic case. Subjects were split into two groups: one of people who had to endure a painful or embarrassing test to join a group, the other of people who had only to pay a modest fee. Each was then asked to evaluate the group based on a tape of a group discussion arranged to be as dull and near-incoherent as possible. Those who suffered the higher cost evaluated the group more positively than did those who paid the small entry fee. And the effect is strong. The low-cost people rated the discussion as dull and worthless and the people as unappealing and boring. This is roughly how the tape was designed to appear. By contrast, those who paid the high cost (reading sexually explicit material aloud in an embarrassing situation) claimed to find the discussion interesting and exciting and the people attractive and sharp.

How does that make sense? According to the prevailing orthodoxy, less pain, more gain, and the mind should measure accordingly. What we find is: more pain, more post-hoc rationalization to increase the apparent benefit of the pain. The cost is already gone, and you cannot get it back, but you can create an illusion that the cost was not so great or the return benefit greater. You can choose, in effect, to get that cost back psychologically, and that is exactly what most people do. This particular experiment has been replicated many times with the same result. But it is still not quite clear why this makes sense. Certainly it works in the service of consistency: since you suffered a larger cost, it must have been for a larger benefit. People can be surprisingly unconscious of this effect in their own behavior. Even when the experiment is fully explained and the evidence of individual bias demonstrated, people see that the general result is true but claim that it does not apply to them. They take an internal view of their own behavior, in which lack of consciousness of the manipulating factor means it is not a manipulating factor.

The need to reduce cognitive dissonance also strongly affects our reaction to new information. We like our biases confirmed, and we are willing to manipulate and ignore incoming information to bring about that blessed state. This is so regular and strong as to have a name: the confirmation bias. In the words of one British politician, “I will look at any additional evidence to confirm the opinion to which I have already come.”

So powerful is our tendency to rationalize that negative evidence is often immediately greeted with criticism, distortion, and dismissal, so that not much dissonance need be suffered, nor change of opinion required. President Franklin Roosevelt uprooted more than a hundred thousand Japanese Americans and interned them for the remainder of World War II, all based on anticipation of possible disloyalty for which no evidence was ever produced, except the following classic from a US general: “The very fact that no sabotage has taken place is a disturbing and confirming indication that such action will be taken.”

Supplying a balanced set of information to those with divergent views on a subject, as we saw earlier in the case of capital punishment, does not necessarily bring the two sides closer together; quite the contrary. Facts that run counter to one’s biases have a way of arousing those biases. This can leave those with strong biases both the least informed and the most certain in their ignorance. In one experiment, people were fed politically congenial misinformation followed by an immediate correction. Many believed the misinformation more strongly after the correction.

One important occasion for cognitive dissonance reduction is a decision that can no longer be changed: the decision is rationalized after the fact. When women are asked to rank a set of household appliances in terms of attractiveness and are then offered a choice between two appliances they have ranked as equally attractive, they later rank the one they chose as more attractive than the one they rejected, apparently on the basis of ownership alone. A very simple study of how people value choices more strongly after committing to them looked at people buying tickets at a racetrack: those who had just placed their bets were much more confident they had made a good choice than were those still waiting in line to place the same bets. One upshot of this effect is that people like items more when they cannot return them than when they can, despite the fact that they say they like having the option to return items.

A bizarre and extreme case of cognitive dissonance reduction occurs in men sentenced to life imprisonment without the possibility of parole for a crime, let us say a spousal murder committed with repeated knife blows. Surprisingly few will admit that the initial act was a mistake. Quite the contrary: they may be aggressive in its defense. “I would do it again in a second; she deserved everything she got.” It is difficult for them to resist reliving the crime, fantasizing again about the victim’s terror, pain, unanswered screams for help, and so on. They are justifying something with horribly negative consequences (now for themselves as well) that they cannot change. Their fate is instead to relive the pleasures of the original mistake, over and over again.

SOCIAL EFFECTS OF COGNITIVE DISSONANCE REDUCTION

The tendency of cognitive dissonance resolution to drive different individuals apart has been described in terms of a pyramid. Two individuals can begin very close on a subject—at the top of a pyramid, so to speak—but as contradictory forces of cognitive dissonance come into play and self-justification ensues, they may slide down the pyramid in different directions, emerging far apart at the bottom. As two experts on the subject put it:

We make an early, apparently inconsequential decision and then we justify it to reduce the ambiguity of the choice. This starts a process of entrapment—action, justification, further action—that increases our intensity and commitment and may take us far from our original intentions or principles.

As we saw in Chapter 5, this process may be an important force driving married couples toward divorce rather than reconciliation. What determines the degree to which any given individual is prone to move down the pyramid when given the choice is a very important (unanswered) question.

A novel implication of cognitive dissonance concerns the best way to turn a possible foe into a friend. One might think that giving a gift to another person would be the best way to start a relationship of mutual giving and cooperation. But it is the other way around: getting the other person to give you a gift is often the better way of inducing positive feelings toward you, if for no other reason than to justify the initial gift. This has been shown experimentally: subjects cajoled into giving a person a gift later rate that person more highly than do those not so cajoled. The following folk expression, from more than two hundred years ago, captures the counterintuitive form of the argument (given reciprocal altruism):

He that has once done you a kindness will be more ready to do you another, than he whom you yourself have obliged.