I think the regime of competence principle operates on another scale as well: not in the forty hours it takes to complete your average video game, but on the hundred-year scale of electric speed. When cinema first became a mainstream diversion in the early 1900s, the minds of that era were not primed to master ten new technologies and dozens of new genres in the next decade; they had to adapt to the new conventions of moviegoing, learning a new visual language and a new kind of narrative engine. But as the new technologies started to roll out in shorter and shorter cycles, we grew more comfortable with the process of probing a new form of media, learning its idiosyncrasies and its distortions, its symbolic architecture and its rules of engagement. The mind adapts to adaptation. Eventually you get a generation that welcomes the challenge of new technologies, that embraces new genres with a flexibility that would have astonished the semi-panicked audiences that trembled through the first black-and-white films.
Technology manufacturers have an economic incentive to obey the regime of competence principle as well: if your new platform (an operating system, say, or a wireless communicator, or a TiVo-style personal video recorder) is too familiar, it will seem like old news to potential consumers; but if you push too far past the regime of competence, you'll lose your audience altogether. Release new technologies that challenge the mind without overtaxing it, and release them in shorter and shorter cycles, and the line that tracks our abilities to probe and master complex systems will steadily ascend, turning upward in a parabolic climb as the cycles of electric speed increase.
Project that data over a hundred years, and you will have a chart that looks remarkably like the Flynn Effect.
POP CULTURE'S race to the top over the past decades forces us to rethink our assumptions about the base tendencies of mass society: the Brave New World scenario, where we're fed a series of stupefying narcotics by media conglomerates interested solely in their lavish profits with no concern for the mental improvement of their consumers. As we've seen, the Sleeper Curve isn't the result of media titans doing charitable work; there's an economic incentive in producing more challenging culture, thanks to the technologies of repetition and meta-commentary. But the end result is the same: left to its own devices, following its own profit motives, the media ecosystem has been churning out popular culture that has grown steadily more complex over time. Imagine a version of Brave New World where soma and the feelies make you smarter, and you get the idea.
If the Sleeper Curve turns the conventional wisdom about mass culture on its head, it does something comparable to our own heads, and the truisms we like to spread about them. Almost every Chicken Little story about the declining standards of pop culture contains a buried blame-the-victim message: Junk culture thrives because people are naturally drawn to simple, childish pleasures. Children zone out in front of their TV shows or their video games because the mind seeks out mindlessness. This is the Slacker theory of brain function: the human brain desires above all else that the external world refrain from making it do too much work. Given their druthers, our brains would prefer to luxuriate among idle fantasies and mild amusements. And so, never being one to refuse a base appetite, the culture industry obliges. The result is a society where maturity, in Andrew Solomon's words, is a “process of mental atrophy.”
These are common enough sentiments, but they contain a bizarre set of assumptions if you think about them from a distance. Set aside for the time being the historical question of why IQs are climbing at an accelerating rate while half the population wastes away in mental atrophy. Start instead with the more basic question of why our brains would actively seek out atrophy in the first place.
The Brave New World critics like to talk a big game about the evils of media conglomerates, but their worldview also contains a strikingly pessimistic vision of the human mind. I think that dark assumption about our innate cravings for junk culture has it exactly backward. We know from neuroscience that the brain has dedicated systems that respond to, and seek out, new challenges and experiences. We are a problem-solving species, and when we confront situations where information needs to be filled in, or where a puzzle needs to be untangled, our minds compulsively ruminate on the problem until we've figured it out. When we encounter novel circumstances, when our environment changes in a surprising way, our brains lock in on the change and try to put it in context or decipher its underlying logic.
Parents can sometimes be appalled at the hypnotic effect that television has on toddlers; they see their otherwise vibrant and active children gazing silently, mouths agape, at the screen, and they assume the worst: the television is turning their child into a zombie. The same feeling arrives a few years later, when they see their grade-schoolers navigating through a video game world, oblivious to the reality that surrounds them. But these expressions are not signs of mental atrophy. They're signs of focus.
The toddler's brain is constantly scouring the world for novel stimuli, precisely because exploring and understanding new things and experiences is what learning is all about. In a house where most of the objects haven't moved since yesterday, and no new people have appeared on the scene, the puppet show on the television screen is the most surprising thing in the child's environment, the stimulus most in need of scrutiny and explanation. And so the child locks in. If you suddenly plunked down a real puppet show in the middle of the living room, no doubt the child would prefer to make sense of that. But in most ordinary household environments, the stimuli onscreen offer the most diversity and surprise. The child's brain locks into those images for good reason.
Think about it this way: if our brain really desired to atrophy in front of mindless entertainment, then the story of the last thirty years of video games, from Pong to The Sims, would be a story of games that grew increasingly simple over time. You'd never need a guidebook or a walk-through; you'd just fly through the world, a demigod untroubled by challenge and complexity. Game designers would furiously compete to come out with the simplest titles; every virtual space would usher you to the path of least resistance. Of course, exactly the opposite has occurred. The games have gotten more challenging at an astounding rate: from Pac-Man's single page of patterns to Grand Theft Auto III's 53,000-word walk-through in a mere two decades. The games are growing more challenging because there's an economic incentive to make them more challenging, and that economic incentive exists because our brains like to be challenged.
If our mental appetites draw us toward more complexity and not less, why do so many studies show that we're reading fewer books than we used to? Even if we accept the premise that television and games can offer genuine cognitive challenges, surely we have to admit that books challenge different, but equally important, faculties of the mind. And yet we're drifting away from the printed page at a steady rate. Isn't that a sign of our brains gravitating to lesser forms?
I believe the answer is no, for two related reasons. First, most studies of reading ignore the huge explosion of reading (not to mention writing) that has happened thanks to the rise of the Internet. Millions of people spend much of their day staring at words on a screen: browsing the Web, reading e-mail, chatting with friends, posting a new entry to one of those 8 million blogs. E-mail conversations or Web-based analyses of The Apprentice are not the same as literary novels, of course, but they are equally text-driven. While they suffer from a lack of narrative depth compared to novels, many online interactions do have the benefit of being genuinely two-way conversations: you're putting words together yourself, and not just digesting someone else's. Part of the compensation for reading less is the fact that we're writing more.
The fact that we are spending so much time online gets to the other, more crucial, objection: yes, we're spending less time reading literary fiction, but that's because we're spending less time doing everything we used to do before. In fact, the downward trend that strikes the most fear in the hearts of Madison Avenue and its clients is not the decline of literary reading; it's the decline of television watching. The most highly sought demographic in the country, twenty-something males, watches almost one-fifth less television than it did only five years ago. We're buying fewer CDs; we're going out to the movies less regularly. We're doing all these old activities less because about a dozen new activities have become bona fide mainstream pursuits in the past ten years: the Web, e-mail, games, DVDs, cable on-demand, text chat. We're reading less because there are only so many hours in the day, and we have all these new options to digest and explore. If reading were the only cultural pursuit to show declining numbers, there might be cause for alarm. But that decline is shared by all the old media forms across the board. As long as reading books remains part of our cultural diet, and as long as the new popular forms continue to offer their own cognitive rewards, we're not likely to descend into a culture of mental atrophy anytime soon.
NOW for the bad news. The story of the last thirty years of popular culture is the story of rising complexity and increased cognitive demands, an ascent that runs nicely parallel to, and may well explain, the upward track of our IQ scores. But there are hidden costs to the Sleeper Curve. It's crucial that we abandon the Brave New World scenario where mindless amusement always wins out over more challenging fare, that we do away once and for all with George Will's vision of an “increasingly infantilized society.” Pop culture is not a race to the bottom, and it's high time we accepted, even celebrated, that fact. But even the most salutary social development comes with peripheral effects that are less desirable.
The rise of the Internet has forestalled the death of the typographic universe (and its replacement by the society of the image) predicted by McLuhan and Postman. Thanks to e-mail and the Web, we're reading text as much as ever, and we're writing more. But it is true that a specific, historically crucial kind of reading has grown less common in this society: sitting down with a three-hundred-page book and following its argument or narrative without a great deal of distraction. We deal with text now in shorter bursts, following links across the Web, or sifting through a dozen e-mail messages. The breadth of information is wider in this world, and it is far more participatory. But there are certain types of experiences that cannot be readily conveyed in this more connective, abbreviated form. Complicated, sequential works of persuasion, where each premise builds on the previous one, and where an idea can take an entire chapter to develop, are not well suited to life on the computer screen. (Much less life on The O'Reilly Factor.) I can't imagine getting along without e-mail, and I derive great intellectual nourishment from posting to my weblog, but I would never attempt to convey the argument of this book in either of those forms. Postman gets it right:
To engage the written word means to follow a line of thought, which requires considerable powers of classifying, inference-making and reasoning…. In the eighteenth and nineteenth centuries, print put forward a definition of intelligence that gave priority to the objective, rational use of the mind and at the same time encouraged forms of public discourse with serious, logically ordered content. It is no accident that the Age of Reason was coexistent with the growth of a print culture, first in Europe and then in America.
Networked text has its own intellectual riches, of course: riffs, annotations, conversations; they all flourish in that ecosystem, and they all can be dazzlingly intelligent. But they nonetheless possess a different kind of intelligence from the intelligence delivered by reading a sustained argument for two hundred pages. You can convey attitudes and connections in the online world with ease; you can brainstorm with twenty strangers in a way that would have been unthinkable just ten years ago. But it is harder to transmit a fully fledged worldview. When you visit someone's weblog, you get a wonderful (and sometimes wonderfully intimate) sense of their voice. But when you immerse yourself in a book, you get a different sort of experience: you enter the author's mind, and peer out at the world through their eyes.
Something comparable happens in reading fiction as well. No cultural form in history has rivaled the novel's capacity to re-create the mental landscape of another consciousness, to project you into the first-person experience of other human beings. Movies and theater can make you feel as though you're part of the action, but the novel gives you an inner vista that is unparalleled: you are granted access not just to the events of another human's life, but to the precise way those events settle in his or her consciousness. (This is most true of the modernist classics: James, Eliot, Woolf, Conrad.) Reading Portrait of a Lady (once you've shed your MTV-era expectations about pacing and oriented yourself to James's byzantine syntax), you experience another person thinking and sensing with a clarity that can be almost uncanny. But that cognitive immersion requires a physical immersion for the effect to work: you have to commit to the book, spend long periods of time devoted to it. If you read only in short bites, the effect fades, like a moving image dissolving into a sequence of frozen pictures.