Wired for Culture: Origins of the Human Social Mind

Author: Mark Pagel

Natural selection and Paley’s argument were the inspirations for Richard Dawkins’ metaphor and book The Blind Watchmaker, an account of how natural selection “blindly” produces complex organs and other traits from random mutations. Our discussion of social learning should make us realize that Paley’s argument is also wrong when applied to our cultural world, or at least it is not nearly as right as we would like to think. This is because the great sorting power of social learning is also, in principle, a blind watchmaker, or nearly so, and in this case it really is building watches, not complicated organisms. Like genes mutating, our capacity to be “inventive” is just a way of generating varieties that social learning can sort through. So long as we have a way to generate a variety of outcomes for cultural evolution to act on, and an ability to recognize a good outcome when we see it, social learning can blindly do the rest, even if the mechanism that generates the variety—our so-called inventiveness—is random. Even our ability to recognize good outcomes need not be very good; in fact, it hardly needs to be better than random for good ideas to spread. Social learning and the cultural evolution it drives are responsible for our cars and toasters, our comfortable sofas and beds, pencils and paper, trains, alarm clocks, breakfast cereal, anything made of metal, computers and space shuttles—in short, just about everything in our daily lives—and yet few of us “comprehend their construction.”

The Oxford chemist Peter Atkins in The Creation elegantly summed up evolution by natural selection, saying, “A great deal of the universe does not need any explanation… Once molecules have learnt to compete, and to create other molecules in their own image, elephants, and things resembling elephants, will in due course be found roaming the countryside.” To paraphrase Atkins, once ideas have to compete in our minds, things like toasters, computers, and space shuttles will in due course just appear—they are inevitable and do not need any explanation. Still, in practice, we would expect to evolve to be better than random both at generating innovations and at recognizing good outcomes, because we must compete with others, and especially with other groups. But the message of social learning is that we can be far less inventive than we give ourselves credit for and yet still expect toasters and computers to appear if we wait long enough. Even Einstein purportedly said, “I have no special talents. I am only passionately curious.” We can grant Einstein this conceit, and still realize that because of social learning, being sneaky or at least shrewd at using what others have invented might have tempered our inventiveness.
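The claim above, that blind invention plus copying with only slightly-better-than-chance recognition is enough for good designs to accumulate, can be sketched as a toy simulation. Everything here (the function name, the parameter values, the quality scores) is an illustrative assumption of mine, not anything from the book:

```python
import random

def simulate(pop_size=200, rounds=300, innovate_p=0.05,
             discernment=0.55, seed=42):
    """Blind-watchmaker sketch: agents hold 'designs' scored in [0, 1).
    Each round an agent either invents a random new design, or looks at
    a random other agent's design and keeps the better of the two only
    slightly more often than a coin flip would (discernment = 0.55)."""
    rng = random.Random(seed)
    designs = [rng.random() for _ in range(pop_size)]
    start = sum(designs) / pop_size
    for _ in range(rounds):
        for i in range(pop_size):
            if rng.random() < innovate_p:
                designs[i] = rng.random()  # blind invention
            else:
                other = designs[rng.randrange(pop_size)]
                better = max(designs[i], other)
                worse = min(designs[i], other)
                # recognize the better design barely more often
                # than chance (0.5 would be pure coin-flipping)
                designs[i] = better if rng.random() < discernment else worse
    return start, sum(designs) / pop_size

start, end = simulate()
print(f"mean design quality: {start:.2f} -> {end:.2f}")
```

At a discernment of exactly 0.5 the copying is pure chance and the population's mean quality just drifts; nudging it even slightly above 0.5 makes mean quality climb steadily, which is the point of the passage.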

I was once in the Australian Outback and met, through an interpreter, an Aboriginal man called Sammy. Sammy was showing me how his tribe made a hafted knife by heating a plant over fire to extract a sap or pitch that they used to glue sharp pieces of stone to a wooden handle. Once glued, it was further secured by wrapping it with a long fiber from a local plant. The interpreter said that Sammy was happy to answer questions. I had noticed that the way Sammy notched the handle to hold the blade didn’t seem likely to produce a strong bond. I asked him why he didn’t cut a slit in the wooden handle into which to insert the blade before gluing it. He replied that his tribe had always done it his way. I don’t tell this story to suggest that I had a better alternative, or that I had somehow spotted a flaw in a piece of cultural technology that his tribe might have used for millennia. There was probably a very good reason why they produced the tool as they did. My point is that Sammy didn’t seem to know what that reason was (or he simply couldn’t be bothered to tell this ignorant interloper one of his tribal secrets).

There is no reason to think that Sammy is unusual. Social learning provides a niche or role for innovators in society, but it might be a small one, because innovation isn’t easy and the rest of us can get by just fine copying them. That niche will stay the same size even as population sizes increase because social learning means a little bit of creativity goes a long way. It is a sad commentary, but most of us today might be little more than glorified karaoke singers in most aspects of our lives. Indeed, it is possible today to be almost entirely lacking in any sort of ingenuity and yet still get on in society just fine. Most of us spend nearly all of our lives doing things others have taught us and using things others have made. The people who made the things we use almost certainly had little understanding of what they were doing when they made them, having inherited most of the information from their ancestors or predecessors.

So complicated are machines like NASA’s space shuttle or the software operating systems of computers that no one knows how they work. Entire teams of people are needed to keep them going, and even then they rely on protocols, checklists, and sets of instructions carefully worked out by others before them. In 2002, NASA even went scavenging on eBay to find a large quantity of old medical equipment containing the Intel Corporation’s 8086 computer chip. This is the chip that IBM used in its personal computer in 1981. It is millions of times slower and less efficient than modern computer chips, so why did NASA want it so badly? NASA’s computer software for testing the space shuttle’s critical booster rockets had originally been written for the 8086 chip, and the shuttle fleet would have to be grounded unless this software could be run. Once such critical software has been written, cleared of bugs and other errors, and rigorously tested to see if it works, engineers are almost superstitiously fearful of altering it or trying it out on a different chip for fear it might fail. These pieces of software are so complicated that no one can simply rewrite or alter them, and no one can be sure they will work with 100 percent reliability on a new machine. In fact, NASA put out plans in 2002 for a $20 million project finally to upgrade this system.

It has probably been like this for tens of thousands of years. In The Evolution of Technology, George Basalla rejects the heroic-innovator view of technological change in favor of a Darwinian view of gradual change. He documents how well-known technologies, including transistors, steam engines, hammers, and chopping tools, all trace their history back through many small successive changes, or combine elements of other technologies. And it is true: how many truly innovative thoughts have you had, thoughts that might have made a difference in the history of cultural evolution? Or if that sets the bar too high, how many thoughts have you had that others would wish to copy, like a better way to shape a hand ax, or how to weave, or make a better spear or soufflé? How many people do you know who have?

Our individual capacity for inventiveness, even in the sciences, might be far less developed than we like to think. In the entire history of science and natural philosophy, the list of people whose ideas have profoundly shaped our lives is short. The remarkable ability of social learning to sift ideas has meant that a few great innovators can go a long way, so that most of us simply aren’t very good at it. The historian of science David Edgerton has written: “It is not sufficiently recognized that creation, scientific or otherwise, is a tragic business. Most inventions meet nothing but indifference, even from experts. Patents are little more than a melancholy archive of failure. Most ideas of every sort are rejected, as would be clear if there was a repository for abandoned drafts, rejected manuscripts, unperformed plays and unfilmed treatments.”

The Romans are not known as great innovators, but they were skillful copiers of art and culture, and often to their benefit. The second-century BC Greek historian Polybius in his World History recounts how the outcome of the First Punic War between Carthage (roughly present-day Tunisia) and Rome might have been determined by a fortuitous accident that put Carthaginian technology into Roman hands. Polybius describes a Carthaginian ship in hot pursuit of Roman ships:

On this occasion the Carthaginians put to sea to attack them as they were crossing the straits, and one of their decked ships advanced too far in its eagerness to overtake them and running aground fell into the hands of the Romans. This ship they now used as a model, and built their whole fleet on its pattern; so that it is evident that if this had not occurred they would have been entirely prevented from carrying out their design by lack of practical knowledge.

With their fleet of new ships modeled on those of the Carthaginians, the Romans went on to defeat them and occupy Sicily.

Imitation is hard-wired into our brains and available to us from infancy. Try sticking your tongue out at a baby and you might be surprised that it returns the gesture. But consider what even this simple act of imitation requires. The baby’s eyes have to transduce the light rays that bounce off your tongue into electrical signals that get sent via its optic nerve to its brain. Then somewhere in the brain and by means that no one yet understands, that information about your tongue causes a new set of electrical signals to get sent down nerves to the baby’s mouth and tongue, where they cause precise muscular movements that rely on coordinating the actions of many muscles. The process is not “linear”—the baby’s brain has made up a set of instructions that cannot be derived from a simple alteration of the input from its eyes. And yet, the baby does all this on its first go and without practice. In fact, there are some suggestions that humans are hyper-imitators. We seem to imitate so precisely that we sometimes imitate actions that are not strictly necessary to the task to be accomplished. The machinery to produce such imitation must be complicated and expensive to own and maintain, and this tells us that imitation and copying have played an important role in our species’ survival and prosperity.

Before we leave this topic it might be useful to point out that any shortcomings we have at being inventive or innovative are likely to be magnified in our modern world. This is because it is not necessary for the numbers of innovators in society to keep pace with increases in the size of the population—many people can happily get by copying just one good innovator. This effect is enhanced by language and writing, both of which transmit ideas and innovations well beyond those who came up with them. And this raises a serious question about the kind of dispositions and temperaments that our modern world will encourage. As our societies become ever more connected and “globalized” it will become increasingly easy for most of us not to innovate at all—to become intellectually lazy and docile, at least in matters of inventiveness. The irony is that this might be happening at a time when more innovation is needed than ever before just to maintain the levels of prosperity many of us already enjoy, and to raise them for those who have, up to now, been less fortunate.

BIG BRAINS AND THE SOCIAL ARMS RACE

IF INDIVIDUAL inventiveness per se hasn’t played the role in shaping our intelligence that we might have thought, a capability that is difficult for others to copy by social learning might have. That capability might turn out to be our social intelligence. To see why, we need to introduce the idea of a “moving” or “unpredictable target.” We use our brains, as do all animals, to confront our environment, to exploit animals and plants, and to compete with each other. Finding food, fighting off disease, managing the vagaries of weather, and avoiding being eaten by other species are all things Darwin would have called “hostile forces of nature.” But many of these hostile forces are predictable or slowly changing, and so natural selection can respond by giving organisms ways to defend themselves against them. A plant’s spiny surface, or the toxins it produces in its leaves to discourage grazing animals, can normally be neutralized by the grazers evolving some sort of defense. Maybe an animal acquires a particular gut enzyme that breaks down the toxin or develops a tough lining of the mouth. The plant will then normally respond by changing its spines or toxin to thwart the grazing animal.

In other instances, animals must use their brains to outwit other animals, but the process can still be quite predictable. No matter, for example, how many crocodiles are in a river, if a gazelle gets thirsty enough, it will go to the river for a drink. Here outwitting becomes a matter of adopting a strategy that takes advantage of the other animal’s predictable behavior. For the crocodile, it is to wait motionless for long periods of time. For the gazelle, it is to pick areas of the river where crocodiles are more easily spotted, such as shallows. These strategies and counterstrategies can normally be achieved by following rules, and this is probably what governs the behavior of most animals—rules, schema, or “algorithms” that the brain can follow in a contingent way. The brain is programmed to do one thing in one circumstance and a different thing in another. It is the sort of behavior that we can program into robots. And it is the sort of behavior that, when the circumstances the algorithm evolved for change ever so slightly, can cause animals to do things that seem stupid to us. A good example is that of dogs straining at a leash to get to something just out of their reach. Natural selection has equipped dogs with an algorithm to charge at things they want. It never considered they might be on a leash.
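The contingent, rule-following behavior described above really can be programmed in a few lines. The function names, inputs, and thresholds below are all invented for illustration, not taken from the book; the point is how little machinery a fixed contingency requires, and how it fails when circumstances change:

```python
def crocodile_policy(prey_at_waterline: bool) -> str:
    # one hard-wired contingency: lunge when prey appears, otherwise wait
    return "lunge" if prey_at_waterline else "wait motionless"

def gazelle_policy(stretch_of_river: str, thirst: float) -> str:
    # prefer stretches where crocodiles are easy to spot, but the rule
    # has a predictable weakness: strong enough thirst overrides caution
    if thirst > 0.9:
        return "drink"
    return "drink" if stretch_of_river == "shallows" else "move on"

def dog_policy(sees_desired_thing: bool, on_leash: bool = False) -> str:
    # the evolved rule never "considered" a leash, so the input is
    # ignored and the dog charges (and strains pointlessly) regardless
    return "charge" if sees_desired_thing else "idle"

print(gazelle_policy("deep bend", thirst=0.5))   # caution wins here
print(gazelle_policy("deep bend", thirst=0.95))  # thirst overrides caution
print(dog_policy(True, on_leash=True))           # leash changes nothing
```

Note that `dog_policy` deliberately never reads its `on_leash` argument: the circumstance the rule evolved for has changed, but the algorithm cannot know that, which is exactly the leash-straining failure the passage describes.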

Carnivore species have to outwit their herbivore prey, and in general they have larger brains for the size of their bodies than their prey do. Being a carnivore means getting up each morning having to think hard about how to find the next meal, and in particular having to outwit the animals you seek to kill and eat. By comparison, a grazing animal such as an antelope, upon awakening, normally finds its next meal in front of its nose. That, and the relatively larger numbers of grazing animals in a typical herd compared to their predators, means that each individual grazing animal need not think as hard about avoiding predators as the predators have to think about catching them. Indeed, this is one of the chief reasons that grazing animals form into herds, so much so that they have even been described as “selfish herds.” As we might expect from this asymmetry, fossil skulls of carnivores and their herbivore prey show that throughout history the carnivores have had larger brains relative to the size of their bodies. But the fossils also show what looks like an arms race—when one move is met by a countermove—between the brains of these two groups. Over a period of around 60 million years, herbivore brain size has increased, and these increases have been met by increases in carnivore brain size, with carnivores always maintaining their relative edge. The arms race has been competitive enough that an average herbivore today has a larger brain for its body size than an average carnivore did 40 to 60 million years ago.
