The End of Absence: Reclaiming What We've Lost in a World of Constant Connection

• • • • •

 

The more we learn about human memory, the less it looks like a computer’s. As Henry Molaison’s experience first showed us, our memories are not straightforward “recall” systems, but strange and morphing webs.[17]
Take, for example, your memory of this word:

Inglenook

 

An “inglenook” is a cozy corner by the fire. It calls up a pleasant scenario, and the word itself is one of the more beautiful words in the English language, which is why I’ve selected it as something we may want to have stored in our heads. How might the brain accomplish this?

Assuming you are looking at this text (and not listening to it), the first step will be light bouncing off the paper or tablet that you’re holding and striking your retinas; the resulting signals then travel along your optic nerves, out the back of your eyeballs, and on to the primary visual cortex at the rear of your head. There, individual neurons will fire (like dots of color in a pointillist painting or perhaps a Lite-Brite toy) to correspond with the specific look of the word: “Inglenook.” Many neurons, firing together, create a composite image of the word. This sensory information (the composite image) then travels through a series of cortical regions toward the frontal part of the brain and from there to the hippocampus, which integrates that image and various other sensory inputs into a single idea: “Inglenook.” The original firing of neurons associated with “Inglenook” may result in a moment of fluttering understanding in your consciousness, an idea that, in itself, lives for only a matter of seconds—this is the now of thought, the working memory that does our active thinking. But that brief firing seems to leave behind a chemical change, which has been termed “long-term potentiation.” The neurons that have recently fired remain primed to fire again for a matter of minutes. So if a writer decides to fire your “Inglenook” neurons six times in a row (as I now have), the neurons your brain has associated with that word will become more and more likely to produce real synaptic growth—in other words, you might remember it for longer than it takes you to read this page. (Literal repetition isn’t necessary for memories to be formed, of course; if a singular event is important enough, you’ll rehearse it to yourself several times and burn it into your mind that way.)

If your hippocampus, along with other parts of your frontal cortex, decides that “Inglenook” is worth holding on to (and I hope it will), then the word and its meaning will become part of your long-term memory. But the various components of “Inglenook” (its sound, its look, and all the associations you have already made with the experience of reading about “Inglenook”) will be stored in a complex series of systems around your brain, not in a single folder.

Next week you may find yourself with an accidental time snack, waiting for the kettle to boil, and in that moment, perhaps the word Inglenook will float back into your consciousness (because you’ll be thinking about a cozy place by the fire in which to enjoy that tea). But when it does so, the sound—“Inglenook”—will come from one part of your brain, while the look of the word—“Inglenook”—will float in from another; the way you feel about this book will be recalled from some other region; and so on. These various scraps of information will be reassembled—by what means, we know not—to create the complete idea of “Inglenook.” And (with so many moving parts, it’s inevitable) each time you reconstruct “Inglenook,” its meaning will have altered slightly; something will be added, something taken away.[18]

Our memories, as the psychologist Charles Fernyhough recently wrote in Time magazine, “are created in the present, rather than being faithful records of the past.” Or as one of the world’s leading memory experts, Eric Kandel, has put it: “Every time you recall a memory, it becomes sensitive to disruption. Often that is used to incorporate new information into it.”

The same notion came up again when I had a conversation with Nelson Cowan, Curators’ Professor of Psychology at the University of Missouri, and a specialist in memory, who quoted Jorge Luis Borges for me:

Memory changes things. Every time we remember something, after the first time, we’re not remembering the event, but the first memory of the event. Then the experience of the second memory and so on.

 

“He got that basically right,” Cowan told me. “There’s a process called ‘reconsolidation,’ whereby every retrieval of memory involves thinking about it in a new way. We edit the past in light of what we know now. But we remain utterly unaware that we’ve changed it.”

Memory is a lived, morphing experience, then, not some static file from which we withdraw the same data time and again. Static memories are the domain of computers and phone books, which, says Cowan, “really bear no similarity to the kind of memory that humans have.” He seemed provoked by the comparison, in fact, and this struck me because so many other academics I’d spoken with had happily called their computers “my off-loaded memory,” without considering in the moment how very different the two systems are. Perhaps we’re keen to associate ourselves with computer memories because the computer’s genius is so evident to us, while the genius of our own brain’s construction remains so shrouded. I complained to Cowan that current descriptions of human memory—all those electrical impulses traveling about, “creating” impressions—hardly explain what’s actually happening in my head. And he said only, “You’d be surprised how little we know.”

What we do know is that human memory appears to be a deeply inventive act. Every time you encounter the word Inglenook from now on, you may think that you recall this moment. But you will not.

• • • • •

 

Charlie Kaufman’s film Eternal Sunshine of the Spotless Mind—a fantasy in which heartbroken lovers erase each other from their memories—was based on very real research by McGill University’s star neuroscientist Karim Nader, whose work on the nature of “reconsolidation” has shown us how dramatically vulnerable our memories become each time we call them up. As far back as 2000 (four years before the Sunshine film came out), Nader was able to show that reactivated fear-based memories (i.e., memories we’re actively thinking about) can be altered and even blocked from being “re-stored” in our memory banks with the introduction of certain protein synthesis inhibitors, which disrupt the process of memory consolidation. In other words, it’s the content that’s pulled into our working memory (the material we are actively ruminating on) that’s dynamic and changeable. Today, this understanding grounds our treatment of post-traumatic stress victims (rape survivors, war veterans); like the lovers in Sunshine, victims of trauma have the chance to rewire their brains.

I wonder if such measures may become more and more appealing as the high fidelity of computers keeps us from forgetting that which our minds might have otherwise dropped into the abyss. Steve Whittaker, a psychology professor at the University of California, Santa Cruz, has written on the problem of forgetting in a digital age. Interviews with Sunshine-esque lonely hearts convinced him that the omnipresent digital residue of today’s relationships—a forgotten e-mail from three years ago, a tagged Facebook photo on someone else’s wall—could make the standard “putting her out of your mind” quite impossible. In a 2013 paper (coauthored with Corina Sas of Lancaster University), he proposes a piece of “Pandora’s box” software that would automatically scoop up all digital records of a relationship and wipe them from the tablet of human and computer memories both.[19] Again, we find ourselves so enmeshed that we must lean on more technology to aid us through a technologically derived problem.

• • • • •

 

When I was twenty-one—a third-year English major—I was asked for the first time to accomplish the stultifying job of memorizing a stretch of poetry. To my parents, it was shocking that I’d made it that far without learning by heart a few lines of Shakespeare or Browning. (My mother can still, at sixty-six, recite Thomas Hardy’s “The Darkling Thrush.”) But my peers and I—the first to use calculators in class, the first to think digitally—never needed to bother.

That changed when I took Dr. Danielson’s seminar on Paradise Lost. On day one, and much to our horror, we were tasked with learning by heart the first twenty-six lines of Milton’s epic. Each week, Dr. Danielson had us stand and recite the labyrinthine lines en masse. He told us, “This will give you something to run over in your head when you’re standing at a bus stop. You’ll always have a poem.” A dozen years later, my remembrance of those lines is spotty, sure, and cuts off after line sixteen, but it’s clearer than any other piece of literature I read at school, or since. Here’s what’s still coded in the synapses that know to fire:

Of man’s first disobedience and the fruit
Of that Forbidden Tree, Whose mortal taste
Brought Death into the world, and all our Woe,
. . . [something] . . . Till one greater Man
Restore us, and regain the blissful Seat,
Sing heavenly muse, that on the secret top
Of Oreb, or of Sinai, didst inspire
. . . [something] . . . who first taught the chosen Seed,
In the Beginning how the Heavens and Earth
Rose out of Chaos. Or, if Sion Hill
Delight thee more, and Siloa’s Brook that flowed
Fast by the Oracle of God, I thence
Invoke thy aid to my adventurous Song,
That with no middle flight intends to soar
Above the Aonian Mount, while it pursues
Things unattempted yet in Prose or Rhyme.

[something something . . . ]

 

This patchwork of poetry is embarrassing, is meager, but it is mine. In less time than it took to tap out those lines, I might have called up the entire ten-thousand-line epic on my laptop—and without the errors. So why do I nonetheless love having it there, broken and resting in the attic of my head?

I wanted to know why my old professor had placed it in my brain in the first place. So I tracked down Dr. Danielson and invited him out for coffee. I recognized him immediately when he entered the café, but he walked up to a different man, a younger one, thinking it was I. (Teachers will usually remember only the first couple of classes they teach with any clarity; the rest of us blur into a mass of personalities, no matter how distinctive we may feel and no matter how large an impact that teacher may have had on us.)

Dr. Danielson has been teaching Paradise Lost for more than thirty years now, and every class has been made to memorize the epic’s opening. I am only one of hundreds, then, walking around with snatches of Milton because of Dr. Danielson. I asked him whether he’d seen a difference in students’ reactions to the task over the course of the Internet’s advent.

“If only I’d taken notes on that. . . .” He smiled. “I do think my students today are just as capable of memorizing those lines, but the difference is that they feel they’re less capable of memorizing now. It doesn’t occur to them that they’re able to do something like that, in the same way that a person who’s never trained for a marathon can’t imagine running one. But every year I get the same comments from students at the end of the term—they’ll say they didn’t want to do the memory work and that they are so glad they were forced to do it. They will tell me that the memorization was the most empowering part of the course. This is never done anymore, I suppose. It’s become very typical that, like you, a student will never be asked to memorize poetry.”

“What is it that you think it does?”

“It’s this idea of ‘formation,’” he began. “Memorizing something literally informs your mind. It creates neural pathways, yes? You literally internalize it, download something into your brain. You are programming yourself.”

It’s telling, I think, that when justifying the exercise of the human mind, we so often resort to computer terms such as “download” and “program.” I asked him about the moral behind such programming; if we’re programming our minds, then the question becomes with what, after all. And here Danielson turned away from technological metaphors and toward a religious one that he learned from an old pastor:

There’s a slightly corny saying that has a lot to it: Sow a thought, reap an action; sow an action, reap a habit; sow a habit, reap a character; sow a character, reap a destiny. And I believe that memorizing something is the sowing of a thought.

 

Here my professor’s conversation seemed to be marrying a certain moralism to the work of neuroplasticity researchers (you are what you do). He told me memorization was the act of making something “a property of yourself,” and he meant this in both senses: The memorized content is owned by the memorizer and also becomes a component of that person’s makeup. “It makes it part of my lived experience in an ongoing way.”
