Before we began The Experiment, my own (admittedly equally unscientific) hunch was at odds with Bill’s and Tapscott’s. The idea that constant track-jumping could conduce to a smooth train of thought for anybody, under any circumstances, seemed like a stretch.
Post-Experiment, I know it is. I watched as my kids awoke slowly from the state of cognitus interruptus that had characterized many of their waking hours, to become more focused, logical thinkers. I watched as their attention spans sputtered and took off, allowing them to read for hours—not minutes—at a stretch; to practice their instruments intensively; to hold longer and more complex conversations with adults and among themselves; to improve their capacity to think beyond the present moment, even if that only translated into remembering to wash out tights for tomorrow morning.
I’m not saying anybody suddenly went from Hannah Montana to Homer. They didn’t develop an unquenchable thirst for their set texts, or learn to love their trigonometry worksheets. In fact, they probably did no more homework during The Experiment than they had done before—Sussy swears she did much less and, though her grades improved significantly, this may be true. But they all completed their schoolwork far more efficiently, far more quickly, and with visibly greater focus.
I can’t say what went on in the “actual wiring” inside their heads once they were forcibly encouraged to monotask. Luckily, I don’t have to. Any number of major studies has shown that all that spin about “different brains” has been greatly exaggerated, like Mark Twain’s death or the redemptive power of bleached teeth. In fact, it turns out that multitaskers are not ahead of the cognitive curve at all—not even in those skill areas where one would expect otherwise. It’s true that our kids’ brains are being changed by the media they habitually interact with, and that many of those changes are as yet dimly understood. It’s also true that bookish people like me, who need cocoonlike isolation in order to work effectively, have our own wiring issues. But the fact is no one’s brain is different enough to make constant interruptions, distractions, and task-switching an optimal environment in which to function.
No one’s.
 
 
As a fifty-two-year-old, post-reproductive female, I have a “different” brain too. Well, hello. Half the time I can’t even remember where I left my last partner, let alone my reading glasses. As we all know, the prevailing cultural mythology for people my age—especially women my age—is all about memory loss, vagueness, and diminution of brain function. At least, I’m pretty sure it is . . . Hang on. Isn’t it?
LOLishly enough, the latest neuroscience suggests that people at my stage of the game have particularly agile neural ability. (Think Barack Obama, not Menopause: The Musical.) True, we are somewhat slower at acquiring new information. But our ability to process, organize, and contextualize that information is unparalleled—and it shows in our “wiring,” aka our neural structures. Midlife brains are marked by a proliferation of glial cells (that’s Greek for “glue”) and experience optimal convergence between right and left hemispheres. The cumulative effect, notes one neuroscientist, is that “our brains graduate from a dial-up modem pace to high-speed DSL.” No wonder we occasionally exceed our personal download allowance!
The superior cognitive function we experience at midlife is one reason people tend not to vote for world leaders who are in their twenties and thirties. It also explains why younger air traffic controllers are consistently outperformed by their more experienced elders. A recent study by researchers at MIT and the University of Illinois found that middle-aged workers’ reaction time, memory, and attentional ability were significantly worse than those of younger colleagues when both groups were tested in isolation and under laboratory conditions. But when they were tested in real-life conditions, the elderly tortoises absolutely hammered the upstart hares.3
You and I might call the midlife advantage “wisdom.” Neuroscientists locate it within actual structures in the brain. It all adds up to the same thing: When it comes to sorting through and weighing up multiple bits of information, midlife heads do it better and faster.
While we’re on the subject of good news for mother, midlife brains are also more efficient when it comes to controlling temperament. Contrary to prevailing stereotypes, males and females alike become less grumpy with age. We also tend to grow less impulsive, less labile in our moods, and less prone to extreme emotional responses. A study carried out at the University of California, Berkeley, assessed 123 women in their early twenties and again in subsequent decades. It found “likable personality traits”—such as the ability to remain objective, to tolerate ambiguity, to handle interpersonal relationships successfully—peaked for women in their fifties and sixties.4

Admittedly, on the day of the e. e. cummings assignment, you could be forgiven for thinking otherwise. It was one of those days when something definitely clicked in my head: somewhere between an aha! moment and a WTF?! moment.
That night, after I’d kissed everybody goodnight and switched their screens to sleep mode, I found myself thinking back to my own high school days. I’d usually do my homework in my bedroom, away from the depressingly familiar dialogue on my mother’s afternoon soap opera. Other kids tore handwritten pages from their spiral-bound notebooks, but not me. Those untidy curly edges always set my teeth on edge. I preferred to type my assignments, neatly double-spaced on onionskin, on my beloved orange Olympia portable. (Once a nerd, always a nerd.) Yet there was nothing particularly Dickensian about my bedroom. I had a telephone. It was white and had a bleached-blond troll doll glued to the receiver.
I also had a radio and a stereo, and in my senior year even a portable color TV. But using any of them while I worked would have been as unthinkable as singing karaoke in the middle of a school assembly. And there was no karaoke back then.
I did get bored sometimes. But most of my distraction strategies seemed to revolve around fire: lighting a stick of incense or a scented candle—hey, it was the seventies, okay?—or, in extremis, crawling through my bedroom window onto the roof to smoke. I also had a weird but obscurely satisfying habit of melting crayons on the light bulb of my desk lamp. That was the closest I ever came to something resembling multitasking. I listened to a ton of music, like all normal teenagers. But when I listened, I listened. Hard, and usually while studying the lyrics on the back cover of the LP.
Other kids I knew were pretty much the same. Some listened to the radio while they studied—something our parents and teachers frowned upon, it amuses me to remember—but that was about as “stimulating” as our media environment ever got.
These are anecdotal recollections, I realize. Yet the fact that there exist no hard data from this period on teen media use is evidence of how much has changed. Today, entire journals are devoted to the subject, and new articles and books appear as regularly as reruns of Seinfeld. We are so much more interested in how our kids interact with technology. Partly that’s because there is so much more technology. Partly it’s because there’s so much more fear. Thirty-five years ago, we didn’t know enough to know how much we didn’t know. Today we are beginning to.
You don’t need a Ph.D. in social psychology to tell you something’s up when you have to fight to make eye contact with your teenage children, or get them to sit down and eat a meal, or have the occasional grunt-free conversation. As parents, we comfort ourselves with the excuse that all this is normal, natural, age-appropriate stuff. But somewhere in the back of our beleaguered, Boomer-ish brains, we remember a time—perhaps even our own teen years—when it wasn’t. For many of us, that is exactly the opposite of a comforting thought. It’s a scary one. And it’s exactly that fear factor that creates such fertile ground for the growth of noxious “experts,” whether they are the cheerleaders, who insist we are moving toward a user-generated golden age waaay too cool for parents or other vestigial organisms to understand, or the doomsayers, who prophesy with equal confidence the collapse of civilization as we know it.
Among the former are authors Don Tapscott—the aforementioned “different brain” dude—and Steven Johnson, whose irresistibly titled Everything Bad Is Good for You argues that the new media only seem to be dumbing us down; in fact, they are making us smarter. True, our kids know fewer facts and less history, these authors acknowledge. They struggle to construct arguments and to maintain focus. But their ability as information hunter-gatherers, their visual acuity, and their narrative and creative intelligence leave an older generation for brain-dead. This is exactly the message our kids want us to hear. It’s also the one they themselves quite earnestly believe. To be fair, there is some compelling evidence to support the cheerleaders’ case—including the fact that IQ (as opposed to SAT scores, say) has been rising for decades.
In the opposite corner sit observers like Emory literature professor Mark Bauerlein, author of The Dumbest Generation (no prizes for guessing how he feels about multitasking); journalist Maggie Jackson, whose meticulously researched book Distracted argues that, in the age of the Internet, we all have ADHD; and clinical psychologist Michael Osit, whose 2008 book Generation Text thunders against a generation used to “instant everything.” The case the doomsayers argue is persuasive, fact-filled—and deeply depressing. Literacy as we know it is vanishing. Attention spans are anorexic. Narcissism is up—knowledge is down. The culture is coarsening and so is our cognitive edge.
The cheerleaders tell only the rah-rah side of the story. They fall over themselves in their eagerness to seize the new day. The doomsayers’ gaze, by contrast, is fixed determinedly in the rearview mirror. They see only a rapidly retreating landscape and devote most of their considerable rhetorical power to mourning its passing.
The question of which side to believe is so pre-Web 2.0. As critic Neil Postman was fond of remarking, “Information explosions blow things up.” That doesn’t mean they blow everything up. But some stuff, yes, inevitably—sometimes some pretty important stuff. The fireworks are amazing—but we pay a price for admission. Media give, and media take away.
History shows us that, with the turning of each new technological tide, there is always somebody who’ll forecast tsunami. Socrates was one of them. He feared that the written word—basically the Twitter of fourth-century Athens—would undermine education, and he warned that reading would cause people to “cease to exercise their memory and become forgetful.” (Yup. The exact same argument you and I are still making about the use of calculators in school.) Having too many facts at one’s fingertips “without proper instruction” was dangerous too, leading people to be “filled with the conceit of wisdom instead of real wisdom.”5 (The exact same argument you and I are still making about Google.)
Fifteenth-century Venetian man of letters Hieronimo Squarciafico thought the printing press was the devil. “Already abundance of books makes men less studious,” he fumed. “It destroys memory and enfeebles the mind by relieving it of too much work.”6
A German critic, writing at the dawn of the reading revolution that would sweep Europe and the New World in the early nineteenth century, prophesied a pandemic of “colds, headaches, weakening of the eyes, heat rashes, gout, and arthritis.”7
In Everything Bad Is Good for You, Steven Johnson cranks the culture jam even louder, imagining what today’s conservative critics—the ones who are convinced that Wikipedia is the devil’s workbench—might have said about printed books. That they are “tragically isolating.” That they understimulate the senses. That they suppress social interaction and breed intellectual passivity. (Imagine! You simply sit back and have the story dictated to you.) All true, of course . . . and all conveniently overlooked by Digital Immigrants such as you and me.
Most of us don’t think of books as “media” at all—which is both ridiculous and a reminder of how utterly embedded in our media ecology they have become. It’s sobering to realize that Socrates’ version of The Experiment would have been a six-month ban on reading or writing. Random! It occurs to me that I should really try to do without for a day or two, out of a sense of fair play to the Natives. The prospect frankly terrifies me. What the hell would I do? How would I keep up? (Wait a minute. Could this be the way Anni, Bill, and Sussy felt about relinquishing their Facebook accounts?) The concept of having dependency issues around literacy had never occurred to me, any more than the concept of having dependency issues around oxygen.
 
 
I’d made the appointment to speak to Bill’s Year 11 adviser almost impulsively. It was toward the end of the second term (and The Experiment). He was doing all right—overlooking the 27 percent on his last pre-calc exam—and he was happy enough with his teachers. But I still had concerns. I wasn’t sure his subject choices were the right ones, I explained to his adviser.
“Overall, Bill’s results have been excellent,” she reassured me. “His two maths units, his English, his human biology, his phys ed studies . . .”
“Yes,” I interrupted. “I know all that. It’s just . . . I’m afraid he’s going to graduate next year without . . . well, knowing anything.”
She looked at me quizzically. “Go on.”
“He hasn’t taken any geography or history since Year Nine. He doesn’t study a language. He doesn’t read politics, or law, or literature, or art history, or a social science of any kind. In English, so-called, they watch movies . . . And half the time they don’t even write assignments. They make PowerPoints, as a group project . . .” I trailed off. I was remembering the time Sussy’s teacher penalized her for using “too much language” in a slideshow presentation. If too much power corrupts, I’d reflected at the time, too much PowerPoint corrupts absolutely.
