And, speaking of language, here is an analogy. In general, people want to be able to make a distinction between saying “you” in the singular versus plural form. Most languages allow them to do this. In Spanish, for example, the words usted and ustedes mark the difference. In Hebrew, the distinction is between ata and atem. In Persian, the word shoma means the singular you, and shomaha is used for the plural case. In Greek, the distinction is between esi and esis. In Italian, it’s lei and voi.
Unfortunately, English is not one of those languages that differentiates between the singular and plural form of “you.” Since there is no formal solution to the problem, many strategies crop up colloquially. Perhaps the most famous of these is the Southern term you all (shortened to y’all)—a collective pronoun if there ever was one. It even shows up in the possessive form (“Is that y’all’s house?”). In sections of New York such as Brooklyn and the Bronx, the plural term youse (often corrupted to yiz) handles the task. In parts of Canada, including Ontario and Manitoba, youse is also the plural pronoun of choice (e.g., “Would youse like a drink before dinner?”). In western Pennsylvania, the term you’uns is used. In Scotland, the variant youse’uns has been reported.
In general, a shortfall in the English language seems to go against the grain and has given rise to a host of regional solutions, none of which is officially sanctioned. What is interesting is that in all of these cases the use of an unofficial plural form is tied to socioeconomic status. Educated or professional classes are far less likely to remedy the linguistic problem with a colorful neologism. A former graduate student of mine came from a working-class background and was the first person in his family to attend university. One day he was complaining to me about faculty policy and commented, “Youse never give the first-year students . . .” I stopped him and pointed out his use of the language, offering an abbreviated version of my “Lose the Youse” speech. I suggested that, in terms of his success, sounding educated might at times be as important as being educated. He responded by saying, “I can’t believe I said that. I didn’t even hear it. Everybody I know back home says youse. I want to leave that behind. Please point it out to me the next time I do it.” To the best of my recollection, he never did, at least around me.
The point of this story and the whole digression into plural pronouns is this: the mind seems to want a plural form of “you” for the sake of communication. When its native language lets it down, the mind will create and/or embrace any reasonable solution. But the activity is strictly optional. The compelling linguistic force that draws us to use “youse,” if you will, or some plural form where none exists, can be resisted. The force to alter language is obviously not as powerful as the one that compels many other forms of Caveman Logic. In the case of language, a regional speech pattern can be overcome or avoided altogether by exposure to social incentives for using so-called correct English. That’s all it takes. My student was able to will himself out of two decades of socialization by “youse” users in order to sound more like the educated man he was becoming. Avoiding most of the mental mistakes we describe in this book may require more work than that. Many people have been socialized to commit these mistakes. Both the mental errors and the beliefs they lead to feel extremely natural. Worse yet, their absence may feel quite unnatural. Nevertheless, it can be done. There are those who do not regularly engage in these mental missteps, and there is no reason to believe that such persons have different cognitive architecture than the rest of us. Moreover, they do not all come from “enlightened” backgrounds. Turning again to my former student, it was clear that he wanted to move beyond where he had been. That simple bit of motivation may be a crucial prerequisite for change.
THE BETTER ANGELS OF OUR NATURE
“The better angels of our nature.” What a wonderful phrase. It appears in the final paragraph of Lincoln’s first inaugural address in 1861. If Lincoln took it from an earlier source, the reference has escaped my attention. The phrase has been borrowed countless times since, often as part of sermons or commencement addresses. It is plainly inspirational and often appears along with the words “appeal to.” Special people or special circumstances are considered appeals to the “better angels of our nature.” I have always understood the phrase to mean that under certain conditions we should go beyond what comes easily. Whether in thought or action, we should dig more deeply into the information or—if you will—into ourselves and think or act in a way that is somehow more highly evolved or enlightened. By definition this will not come easily. To some it will come not at all.
It will certainly take more work and it may not always receive much in the way of social support. But despite the difficulty and the lack of consensus, we will somehow know that this path is better: something we can be proud of. Some might describe it as higher or purer. The phrase “better angels” suggests that not all the angels that inspire us are created equal. Some of those alternative angels may be “worse,” even if their call is to a well-traveled path that comes more naturally.
Nowhere is it suggested that we must summon the better angels of our nature all the time. But it is the hope of such an appeal that we can rally this extra energy when it really matters. At least we know it is an available option. We know that sometimes it really is OK to decline those default settings or shortcuts with which natural selection has imbued our minds. In other words, using the language of cognitive psychology, it is sometimes OK to use the algorithm and forget the heuristic. It is sometimes OK to think about what we’re really seeing and discount or second-guess the conclusion that is forcing its way into our consciousness like a mindless brute. Even though the same brute is doing his work on the minds of our friends and family, we can resist the social pressure and consider the better angels of our nature. Doing what comes naturally, seeing those faces in the clouds, interpreting that quarter on the sidewalk as a “sign” from the universe—those are default Pleistocene settings. They are not our better angels. They are what natural selection, that ruthless efficiency expert, has trip-wired our minds to do.
It would be easier to face the world without that Stone Age mess rattling around in our heads, but we can still express our “better angels.” We can acknowledge the default settings in our minds, but not relinquish control to them on a reflexive basis. We have enough cognitive flexibility to act, much of the time, as if we had evolved to a higher level. We may still see the face in the pizza when we look quickly, but we do not have to act on that perceptual impulse. We do not have to form prayer vigils outside the pizzeria. We do not have to beseech the Holy Pepperoni to intervene in our lives and make them better. We do not have to vote for politicians who include Pepperoni references in their campaign speeches. We do not have to fall under the leadership of pizza-prophets and kill those who do not see the same face we do, or perhaps see a different face in a jelly donut.
JUST IMAGINE
We have examined the roots of Caveman Logic, we have looked at its manifestations in our present society, and we have considered strategies for overcoming its worst effects. The simple question remains: Will we succeed?
Not surprisingly, others have addressed this issue. They have examined the superstitious, the irrational, and the dangerously wrongheaded things humans continue to believe and do. Most authors like myself are implicitly optimistic in the sense that they offer suggestions for improvement. If we didn’t believe in the possibility of enlightenment and change, we wouldn’t spend thousands of hours writing our books. By analogy, I do not believe I am holding a red card in front of a color-blind person, repeatedly shouting, “See it, damn you!” I am not concerned with the absence of an ability, but with its overapplication. I am asking people to be more circumspect in their use of mental tendencies that come far too easily to them.
But optimism isn’t the only attitude you’ll find. When asked to speculate directly on the irrational aspect of human nature, other psychologists and philosophers have sounded downright pessimistic. David Hume wrote of irrationality, “Though this inclination may at intervals receive a check from sense and learning, it can never be thoroughly extirpated from human nature.”3 Other authors, Carl Sagan4 among them, decry the fact that we have never been more educated and informed, yet we seem hell-bent for another Dark Age of fear and superstition.
What if Sagan is right? Isn’t it a liberal edict that everyone is entitled to his or her belief system? Aren’t ignorance and stupidity God-given rights, so to speak? Perhaps, but as such ignorance moves into the mainstream and has implications that threaten all of us, isn’t it time to stand up and say so? As neuroscientist and author Sam Harris writes, “Half of the American population believes that the universe is 6,000 years old. They are wrong about this. Declaring them so is not ‘irreligious intolerance.’ It is intellectual honesty.”5
Moreover, doesn’t such ignorance become more egregious as alternative knowledge becomes more readily available? Few of us would condemn a man in 1308 for professing the superstitious misinformation of his age. But how do we react to persons among us professing those same beliefs seven hundred years later? Or demanding that their beliefs be taught to their children in public institutions? If it is still too soon to be sure about what is correct, can we foresee a time in the future when the weight of evidence will be so overwhelming that it will be OK to say, “This is right, and this—although some clung tenaciously to it for a while—is wrong”? A time when, other than teaching earlier beliefs as part of the history of human folly, we can simply say no to continuing to offer them equal time in our public institutions?
When Shakespeare wrote, “The fault, dear Brutus, is not in our stars, but in ourselves,” he was creating what might have been a mantra to keep us from magical thinking and overactive agency detectors. Unfortunately, although his quotation is still widely known, its sentiments have been just as widely ignored.
Try to imagine a world where 95 percent of the population is voluntarily atheistic. Where rampant belief in supernatural agency is not socially supported. Where the names of supernatural agents are not written on government-issued monetary units. Where political leaders do not invoke the names of deities at the conclusion of speeches or use the blessing of such deities in justifying state-sanctioned wars.
Imagine a world in which natural disasters, accidents, illness, and death are viewed as unfortunate but normal occurrences in a lawful universe, and not as evidence of supernatural vengeance or intervention.
Imagine a world where people routinely accept the fact that some important events lie outside their control. They make no attempt to invent supernatural agents who can be persuaded to intervene on their behalf.
Imagine a world where the large majority of humans lead moral lives because of personal or social codes, not because they fear supernatural retribution.
Imagine a world where one’s identity is not so “tribal” that hatred and violence against “out-group” members (on the basis of religion or nationality) can be readily triggered by pandering preachers or politicians.
Imagine a world in which humans accept that they have a finite life span. Although it is longer at present than it has ever been before, it is nevertheless measured in decades, not eternities. That after death, one’s body ceases to function and begins to decompose. That while one may be lovingly remembered for past deeds, the days as an active agent, mentally and physically, will be over.
Can you imagine any of this? Certainly, it is a task for the imagination because very little of it is true today. Some of us may be capable of some of these things some of the time, but few of us can do it all routinely. And if you can, you will find little institutional support. That is a shame. I view each of these things as stepping stones toward human improvement. Collectively, they would represent forward movement in the evolution of our species toward behavior and understanding that are truly worthy of our name, Homo sapiens. At present, we are a pale imitation of that name.
Certainly, such a world would be very different from the one in which most of us live. Some, like Carl Sagan, are optimistic enough to believe that with some combination of improved education and social support, humans can be dragged out of the Dark Age into which their present culture seems to be descending. But others are more pessimistic. They question whether we, as a species, are capable of even the six entries listed above. Are they a reachable goal for our species, or are they the sole domain of a rarefied, enlightened few, who are unlikely to succeed in sharing their wisdom with others?
Perhaps someday the majority of humans will not see patterns that are not there or turn to supernatural agents when control seems out of their reach. But that day is not yet on the horizon. Arguably it would require such fundamental changes that we may be looking at a different species, in the same way that Australopithecus was a different hominin from the Neanderthals. Someone will have to name this new, less superstitious, more clearheaded human species. Unfortunately, the name Homo sapiens will already have been squandered.
This is a pessimistic view, and yet it may be realistic. To the extent that my shopping list of criteria for human improvement seems unimaginable, you may already share this pessimism. Would a change of this magnitude require reeducation or physical change? Could it be accomplished at the level of memes or would selection pressure on cognitive architecture be required for such perceptual and cognitive change? Certainly, it is not as simple as electing new leadership. As deplorable as the present state of affairs may appear to some, it resulted from human choices. Likewise, the emergence of fundamentalist religion as a major factor in world politics in the twenty-first century was not imposed on us by extraterrestrials; it reflects fundamental aspects of human nature. In that sense, there has been little change in the past two thousand years.