Brain Bugs

HOW THE BRAIN’S FLAWS SHAPE OUR LIVES

Dean Buonomano

W. W. NORTON & COMPANY
NEW YORK
LONDON

Copyright © 2011 by Dean Buonomano

All rights reserved

For information about permission to reproduce selections from this book, write to Permissions, W. W. Norton & Company, Inc., 500 Fifth Avenue, New York, NY 10110

Library of Congress Cataloging-in-Publication Data

Buonomano, Dean.
Brain bugs: how the brain’s flaws shape our lives / Dean Buonomano.—1st ed.
p. cm.
Includes bibliographical references and index.
ISBN: 978-0-393-08195-4
1. Brain—Physiology. 2. Memory—Physiological aspects. I. Title.
QP376.B86 2011
612.8’2—dc23

2011014934

W. W. Norton & Company, Inc.
500 Fifth Avenue, New York, N.Y. 10110
www.wwnorton.com

W. W. Norton & Company Ltd.
Castle House, 75/76 Wells Street, London W1T 3QT

To my parents, Lisa, and Ana

Introduction

It has been just so in all my inventions. The first step is an intuition—and comes with a burst, then difficulties arise. This thing gives out and then that—“Bugs”—as such little faults and difficulties are called.

—Thomas Edison

The human brain is the most complex device in the known universe, yet it is an imperfect one. And, ultimately, who we are as individuals and as a society is defined not only by the astonishing capabilities of the brain but also by its flaws and limitations. Consider that our memory can be unreliable and biased, which at best leads us to forget names and numbers, but at worst results in innocent people spending their lives in prison as a result of faulty eyewitness testimony. Consider our susceptibility to advertising, and that one of the most successful marketing campaigns in history contributed to an estimated 100 million deaths in the twentieth century; the tragic success of cigarette advertising reveals the degree to which our desires and habits can be shaped by marketing.1 Our actions and decisions are influenced by a host of arbitrary and irrelevant factors; for example, the words used to pose a question can bias our answers, and voting locations can sway how we vote.2 We often succumb to the lure of instant gratification at the expense of our long-term well-being, and our irrepressible tendency to engage in supernatural beliefs often leads us astray. Even our fears are only tenuously related to what we should fear.

The outcome of these facts is that what we presume to be rational decisions are often anything but. Simply put, our brain is inherently well suited for some tasks, but ill suited for others. Unfortunately, the brain’s weaknesses include recognizing which tasks are which, so for the most part we remain blissfully ignorant of the extent to which our lives are governed by the brain’s bugs.

The brain is an incomprehensibly complex biological computer, responsible for every action we have taken and every decision, thought, and feeling we’ve ever had. This is probably a concept that most people do not find comforting. Indeed, the fact that the mind emerges from the brain is something not all brains have come to accept. But our reticence to acknowledge that our humanity derives solely from the physical brain should not come as a surprise. The brain was not designed to understand itself any more than a calculator was designed to surf the Web.

The brain was designed to acquire data from the external world through our sensory organs; to analyze, store, and process this information; and to generate outputs—actions and behaviors—that optimize our chances of survival and reproduction. But as with any other computational device, the brain has bugs and limitations.

For convenience, rather than scientific rigor, I borrow the term “bugs” from the computer lexicon to refer to the full range of limitations, flaws, foibles, and biases of the human brain.3 The consequences of computer bugs range from annoying glitches in screen graphics to the computer’s freezing or the “blue screen of death.” Occasionally computer bugs can have fatal consequences, as in cases where poorly written software has allowed lethal doses of radiation to be delivered to patients during cancer therapy. The consequences of the brain’s bugs can be equally wide ranging: from simple illusions, to annoying memory glitches, to irrational decisions whose effects can just as likely be innocuous as fatal.

If there is a bug in your favorite software program, or an important feature is absent, there is always hope that the issue will be remedied in the next version, but animals and humans have no such luxury; there are no instant-fix patches, updates, or upgrades when it comes to the brain. If it were possible, what would be at the top of your brain upgrade list? When one asks a classroom of undergraduate students this question, invariably the answer is to have a better memory for the names, numbers, and facts that they are bombarded with (although a significant contingent of students ingenuously opts for mind reading). We have all struggled, at some point, to come up with the name of somebody we know, and the phrase “You know…what’s his name?” may be among the most used in any language. But complaining that you have a bad memory for names or numbers is a bit like whining about your smartphone functioning poorly underwater. The fact of the matter is that your brain was simply not built to store unrelated bits of information, such as lists of names and numbers.

Think back to someone you met only once in your life—perhaps someone you sat next to on an airplane. If that person told you his name and profession, do you think you would be equally likely to remember both these pieces of information, or more likely to remember one over the other? In other words, are you an equal opportunity forgetter, or for some reason are you more likely to forget names than professions? A number of studies have answered this question by showing volunteers pictures of faces along with the surname and profession of each person depicted. When the same pictures were shown again during the test phase, subjects were more likely to remember people’s professions than their names. One might venture that this is because the professions were simpler to remember for some reason; perhaps they are more commonly used words—a factor known to facilitate recall. As a clever control, however, some of the words were used as either names or professions; for instance, Baker/baker or Farmer/farmer could have served as the name or the occupation. Still, people were much more likely to remember that someone was a baker than that he was Mr. Baker.4

As another example of the quirks of human memory, read the following list of words:

candy, tooth, sour, sugar, good, taste, nice, soda, chocolate, heart, cake, honey, eat, pie

Now read them again and take a few moments to try to memorize them.

Which of the following words was on the list: tofu, sweet, syrup, pterodactyl?

Even if you were astute enough to realize that none of these four words was on the list, there is little doubt that “sweet” and “syrup” gave you more of a mental pause than “tofu” and “pterodactyl.”5 The reason is obvious: “sweet” and “syrup” are related to most of the words on the list. Our propensity to confuse concepts that are closely associated with each other is not limited to sweets, but holds for names as well. People mistakenly call each other by the wrong name all the time. But the errors are not random; people have been known to call their current boyfriend or girlfriend by their ex’s name, and I suspect my mother is not the only harried parent to have inadvertently called one child by the other’s name (and my only sibling is a sister). We also confuse names that sound alike: during the 2008 presidential campaign more than one person, including a presidential candidate, accidentally referred to Osama bin Laden as Barack Obama.6
Why would it be harder to remember that the person you met on the plane is named Baker than that he is a baker? Why are we prone to confuse words and names that are closely associated with each other? We will see that the answer to both these questions is a direct consequence of the associative architecture of the human brain.
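
The intuition behind that associative account is easy to make concrete. Below is a minimal spreading-activation sketch in Python; the word list comes from the exercise above, but the links and their weights are invented purely for illustration, not taken from any experiment or model in the book. A probe word accumulates activation from every studied word it is associated with, so a strongly connected lure such as “sweet” ends up far more active than an unrelated word such as “tofu.”

```python
# Toy spreading-activation model of associative memory.
# The studied words are from the exercise above; the links and
# weights below are invented for illustration only.
links = {
    ("candy", "sweet"): 0.9, ("sugar", "sweet"): 0.9,
    ("honey", "sweet"): 0.8, ("taste", "sweet"): 0.6,
    ("sour", "sweet"):  0.5, ("honey", "syrup"): 0.7,
    ("sugar", "syrup"): 0.5, ("pie", "syrup"):   0.4,
}

studied = ["candy", "tooth", "sour", "sugar", "good", "taste", "nice",
           "soda", "chocolate", "heart", "cake", "honey", "eat", "pie"]

def activation(word, studied_words):
    """Sum association strengths from every studied word to `word`."""
    return sum(links.get((s, word), 0.0) + links.get((word, s), 0.0)
               for s in studied_words)

for probe in ["sweet", "syrup", "tofu", "pterodactyl"]:
    print(f"{probe:12s} activation = {activation(probe, studied):.1f}")

# "sweet" and "syrup" accumulate activation from many studied words --
# the toy analog of their false familiarity -- while "tofu" and
# "pterodactyl" receive essentially none.
```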

BUGS VERSUS FEATURES

Like a sundial and a wristwatch that share nothing except their purpose for being, digital computers and brains share little other than the fact that they are both information processing devices. Even when a digital computer and biological computer are working on the same problem, as occurs when a computer and a human play chess (generally to the consternation of the latter), the computations being performed have little in common. One performs a massive brute force analysis of millions of possible moves, while the other relies on its ability to recognize patterns to guide a deliberate analysis of a few dozen.

Digital computers and brains are adept at performing entirely different types of computations. Foremost among the brain’s computational strengths—and a notorious weakness of current computer technology—is pattern recognition. Our superiority in this regard is well illustrated by the nature of our interactions with digital computers. If you’ve been online in the last decade, at some point your computer probably politely asked you to transcribe some distorted letters or words shown in a box on the screen. The point of this exercise, in many ways, could not have been more profound: it was to ensure that you are a human being; more precisely, that you are not an automated “Web robot”—a computer program put to work by a human with the nefarious goal of sending spam, cracking into personal accounts, hoarding concert tickets, or carrying out a multitude of other wicked schemes. This simple test is called a CAPTCHA, for Completely Automated Public Turing test to tell Computers and Humans Apart.7
The Turing test refers to a game devised by the cryptographer extraordinaire and a father of computer science, Alan Turing. In the 1940s, a time when a digital computer occupied an entire room and had less number-crunching power than a cappuccino machine today, Turing was not only pondering whether computers would be able to think, but also wondering about how we would know it if they did. He proposed a test, a simple game that involved a human interrogator carrying on a conversation with a hidden partner that was either another human or a computer. Turing argued that if a machine could successfully pass itself off as a human, it then would have achieved the ability to think.

Computers cannot yet think or even match our ability to recognize patterns—which is why CAPTCHAs remain an effective means to filter out the Web robots. Whether you are recognizing the voice of your grandmother on the phone, the face of a cousin you have not seen in a decade, or simply transcribing some warped letters on a computer screen, your brain represents the most advanced pattern recognition technology on the face of the earth. Computers, however, are rapidly gaining ground, so we may not hold this distinction for much longer. The next generation of CAPTCHAs will probably engage different facets of our pattern recognition skills, such as extracting meaning and three-dimensional perspective from photographs.8

The brain’s ability to make sense of the “blooming, buzzing confusion” of information impinging on our sensory organs is impressive. A three-year-old understands that the word “nose,” in any voice, represents that thing on people’s faces that adults occasionally claim to steal. A child’s ability to comprehend speech exceeds that of current speech recognition software. Although speech recognition programs are used for automated telephone services, they still struggle with unconstrained vocabularies and independent speakers. These programs generally trip up when presented with similar-sounding sentences such as “I helped recognize speech” and “I helped wreck a nice beach.” In contrast, if there is a flaw in our pattern recognition ability, it may be that we are too good at it; with a bit of coaxing we see patterns where there are none—whether it be the mysterious apparition of the Virgin Mary on a water-stained church wall or our willingness to impose meaning on the inkblots of a Rorschach test.

Imagine for a moment that you had to develop a test with the reverse goal of a CAPTCHA: one that humans would fail, but a Web robot, android, replicator, or whatever your non-carbon-based computational device of choice may be, would pass. Such a test is, of course, depressingly simple to devise. It could consist of asking for the natural logarithm of the product of two random numbers, and if the answer was not forthcoming within a few milliseconds, the human will have been unmasked. There are a multitude of simple tests that could be devised to weed out the humans. By and large, these tests could revolve around a simple observation: while pattern recognition is something the human brain excels at, math is not. This was obvious to Alan Turing even in the 1940s. As he was pondering whether computers would be able to think, he did not waste much time considering the converse question: would humans ever be able to manipulate numbers like a digital computer? He knew there was an inherent asymmetry; someday computers might be able to match the brain’s ability to think and feel, but the brain would never match the numerical prowess of digital computers: “If the man were to try and pretend to be the machine he would clearly make a very poor showing. He would be given away at once by slowness and inaccuracy in arithmetic.”9
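
Such a reverse test is simple enough to sketch. The snippet below is a hypothetical illustration, not anything Turing proposed or any real system uses; the 5-millisecond deadline and the challenge format are invented parameters. It poses the natural-logarithm challenge described above and unmasks any responder who is slow or inexact.

```python
import math
import random
import time

def reverse_captcha(answer_fn, deadline_ms=5.0):
    """Return True if the responder behaves like a machine.

    Poses ln(a * b) for two random integers and demands a fast,
    accurate answer; a human fails on speed, accuracy, or both.
    (Hypothetical sketch; the 5 ms deadline is invented.)
    """
    a, b = random.randint(2, 10**6), random.randint(2, 10**6)
    start = time.perf_counter()
    answer = answer_fn(a, b)
    elapsed_ms = (time.perf_counter() - start) * 1000
    correct = math.isclose(answer, math.log(a * b), rel_tol=1e-9)
    return correct and elapsed_ms < deadline_ms

# A "Web robot" passes instantly...
print(reverse_captcha(lambda a, b: math.log(a * b)))  # True

# ...while a slow, approximate, human-like responder is unmasked.
def human(a, b):
    time.sleep(0.5)  # half a second of painful mental arithmetic
    return float(round(math.log(a * b)))

print(reverse_captcha(human))  # False
```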

 

Let’s do some mental addition:

What is one thousand plus forty?

Now add another thousand to that,

and thirty more,

plus one thousand,

plus twenty,

plus a thousand,

and finally an additional ten.

The majority of people arrive at 5000, as opposed to the correct answer of 4100. We are not particularly good at mentally keeping track of decimal places, and this particular sequence induces most people to carry over a 1 to the wrong decimal place.
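
A running total makes the trap visible. This short check of the arithmetic above (nothing here beyond the numbers already given) prints the sum after each step; the final “plus ten” lands in the tens place, taking the total from 4090 to 4100, yet after a string of thousands it feels as if it should tip the sum to 5000.

```python
# Running total for the mental-addition exercise above.
steps = [1000, 40, 1000, 30, 1000, 20, 1000, 10]

total = 0
for step in steps:
    total += step
    print(f"+ {step:4d} -> {total}")

assert total == 4100  # not the 5000 most people report
```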

Most of us can find a face in a crowd faster than we can come up with the answer to 8 × 7. The truth is—to put it bluntly—we suck at numerical calculations. It is paradoxical that virtually every human brain on the planet can master a language, yet struggles to mentally multiply 57 × 73. By virtually any objective measure, the latter task is astronomically easier. Of course, with practice we can, and do, improve our ability to perform mental calculations, but no amount of practice will ever allow the most gifted human to calculate natural logarithms with the same speed and ease that any adolescent can recognize the distorted letters in a CAPTCHA.

We are approximate animals, and numerical calculations are digital in nature: each integer corresponds to a discrete numerical quantity, whether it’s 1 or 1729. The discrete nature of the progression of integers stands in contrast to, say, the nebulous transition between orange and red. In his book The Number Sense, the French neuroscientist Stanislas Dehaene stresses that although humans and animals have an inherent feeling of quantity (some animals can be trained to determine the number of objects in a scene), it is distinctly nondigital.10 We can represent the numbers 42 and 43 with symbols, but we do not really have a sense of “forty-twoness” versus “forty-threeness” as we have of “catness” versus “dogness.”11 We may have an inherent sense of the quantities one through three, but beyond that things get hazy—you may be able to tell at a glance whether Homer Simpson has two or three strands of hair, but you’ll probably have to count to find out whether he has four or five fingers.12 Given the importance of numbers in the modern world—for keeping track of age, money, and baseball statistics—it may come as a surprise that some hunter-gatherer languages do not seem to have words for numbers larger than 2. In these “one-two-many” languages, quantities larger than 2 simply fall into the “many” category. Evolutionarily speaking, there was surely more pressure to recognize patterns than to keep track of and manipulate numbers. It is more important to recognize at a glance that there are some snakes on the ground than to determine how many there are—here, of course, the “one-two-many” system works just fine, as one potentially poisonous snake is one too many.
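
The “one-two-many” scheme is precise enough to state in code. As a minimal sketch (the cutoff at 2 comes from the passage above; the function name is merely illustrative), such a counting system collapses every quantity beyond two into a single category:

```python
def one_two_many(n: int) -> str:
    """Map a count onto a 'one-two-many' number system."""
    if n <= 0:
        raise ValueError("expected a positive count")
    return {1: "one", 2: "two"}.get(n, "many")

# Three snakes and thirty snakes get the same word -- and for the
# purpose of fleeing, as the text notes, that is all the precision needed.
for n in (1, 2, 3, 30):
    print(n, "->", one_two_many(n))
```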
