Present Shock: When Everything Happens Now

Extracting digital processes from our organic flow not only creates the space for opportune timing to occur, it also helps prevent us from making inopportune gaffes. Like the famous television sketch where Lucy frantically boxes chocolates on the assembly line, we attempt to answer messages and perform tasks at the rate they come at us. And like Lucy, we end up jamming the wrong things in the wrong places. Gmail gives us a few minutes to recall messages we may have sent in error, but this doesn’t stop us from replying hastily to messages that deserve our time, and creating more noise than signal.

Comments sections are filled with responses from people who type faster than they think and who post something simply because they know they will probably never have time to find the discussion again. The possibility that someone might actually link to or retweet a comment leads people to make still more, turning posting into just as much a compulsion as opening email messages. As our respected leaders post their most inane random thoughts to our Twitter streams, we begin to wonder why they have nothing better to do with their time, or with ours. When an actor with the pop culture simplicity of Ashton Kutcher begins to realize that his unthinking posts can reflect negatively on his public image,32 it should give the rest of us pause. But we would need to inhabit kairos for at least a moment or two in order to do that.

The result is a mess in which we race to make our world conform to the forced yes-or-no choices of the digiphrenic.

I once received an email from a college student in Tennessee who had been recruited by a political group to protest against leftist professors. Since I was scheduled to speak at the university in a month, she had studied my website for over ten minutes, trying to figure out if I was a leftist. After perusing several of my articles, she was still unable to determine exactly which side of the political spectrum I was on. Could I just make this easier for her and tell her whether or not I was a leftist, so that she knew whether to protest my upcoming speech? I pointed her to some of my writing on economics and explained that the Left/Right categorization may be a bit overdetermined in my case. She thanked me but asked me to please just give her a yes-or-no answer.

“Yes and no,” I replied, breaking the binary conventions of digital choice making. I didn’t hear back.

DO DRONE PILOTS DREAM OF ELECTRIC KILLS?

I was working on a story about Predator drones, the unmanned aircraft that the US Air Force flies over war zones in the Middle East and Central Asia collecting reconnaissance, taking out targets, and occasionally launching a few Hellfire missiles. The operators—pilots, as they’re called—sit behind computer terminals outfitted with joysticks, monitors, interactive maps, and other cockpit gear, remotely controlling drones on the other side of the world.

I was most interested in the ethical challenges posed by remote control warfare. Air Force commanders told me that within just a few years a majority if not all American fighter planes would be flown remotely by pilots who were thousands of miles away from the war zone. The simulation of flight, the resolution of the cameras, the instantaneousness of feedback, and the accuracy of the controls have rendered the pilot working through virtual reality just as effective as if he or she were in the cockpit. So why risk human troops along with the hardware? There’s no good reason, other than the nagging sense that on some level it’s not fair. What does it mean to fight a war where only one side’s troops are in jeopardy, and the other side may as well be playing a video game? Will our troops and our public become even more disconnected from the human consequences and collateral damage of our actions?

To my great surprise, I found out that the levels of clinical distress in drone crews were as high as, and sometimes even higher than, those of crews flying in real planes.33 These were not desensitized video-game players wantonly dropping ordnance on digital-screen pixels that may as well have been bugs. They were soul-searching, confused, guilt-ridden young men, painfully aware of the lives they were taking. Thirty-four percent experienced burnout, and over 25 percent exhibited clinical levels of distress. These responses occurred in spite of the Air Force’s efforts to select the most well-adjusted pilots for the job.

Air Force researchers blamed the high stress levels on the fact that the drone pilots all had combat experience and that the drone missions probably caused them to re-experience the stress of their real-world missions. After observing the way these pilots work and live, however, I’m not so sure it’s their prior combat experience so much as these young men’s concurrent life experience that makes drone missions so emotionally taxing. Combat is extraordinarily stressful, but at least battlefield combat pilots have one another when they get out of their planes. They are far from home, living the war 24/7 from their military base or aircraft carrier. By contrast, after the drone pilot finishes his mission, he gets in his car and drives home to his family in suburban Las Vegas. He passes the mashed potatoes to his wife and tries to talk about the challenges of elementary school with his second grader, while the video images of the Afghan targets he neutralized that afternoon still dance on his retinas.

In one respect, this is good news. It means that we can remain emotionally connected to the effects of our virtual activities. We can achieve a sort of sync with things that are very far away. In fact, the stress experienced by drone pilots was highest when they either killed or witnessed the killing of people they had observed over long periods of time. Even if they were blowing up a notorious terrorist, simply having spied on the person going through his daily routine, day after day, generated a kind of sympathetic response—a version of sync.

The stress, depression, and anxiety experienced by these soldiers, however, came from living two lives at once: the life of a soldier making kills by day and the one of a daddy hugging his toddler at night. Technology allows for this dual life, this ability to live in two different places—as two very different people—at the same time. The inability to reconcile these two identities and activities results in digiphrenia.

Drone pilots offer a stark example of the same present shock most of us experience to a lesser degree as we try to negotiate the contrast between the multiple identities and activities our digital technologies demand of us. Our computers have no problem functioning like this. When a computer gets a problem, or a series of problems, it allocates a portion of its memory to each part. More accurately, it breaks down all the tasks it needs to accomplish into buckets of similar tasks, and then allocates a portion of its memory—its processing resources—to each bucket. The different portions of memory then report back with their answers, and the chip puts them back together or outputs them or does whatever it needs to next.

People do not work like this. Yes, we do line up our tasks in a similar fashion. A great waiter may scan the dining room in order to strategize the most efficient way to serve everyone. So, instead of walking from the kitchen to the floor four separate times, he will take an order from one table, check on another’s meal while removing a plate, and then clear the dessert from a third table and return to the kitchen with the order and the dirty dishes. The human waiter strategizes a linear sequence.

The computer chip would break down the tasks differently, lifting the plates from both tables simultaneously with one part of its memory, taking the order from each person with another part (broken down into as many simultaneous order-taking sections as there are people), and checking on the meal with yet another. The human figures out the best order in which to do things one after the other, while the chip divides itself into separate waiters ideally suited to each separate task. The mistake so many of us make with digital technology is to imitate rather than simply exploit its multitasking capabilities. We try to maximize our efficiency by distributing our resources instead of lining them up. Because we can’t actually be more than one person at the same time, we experience digiphrenia instead of sync.
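The contrast between the waiter’s linear sequence and the chip’s parallel dispatch can be sketched in a few lines of code. This is purely an illustrative sketch, not anything from the original text; the task names and function names are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

tasks = ["take order", "check on meal", "clear dessert", "deliver order"]

def human_waiter(tasks):
    # The human strategizes one efficient linear sequence:
    # each task is finished before the next one begins.
    return [f"done: {t}" for t in tasks]

def chip_waiter(tasks):
    # The chip divides itself into separate "waiters":
    # one worker per task, all dispatched at once.
    with ThreadPoolExecutor(max_workers=len(tasks)) as pool:
        return list(pool.map(lambda t: f"done: {t}", tasks))

# Both approaches complete every task; only the scheduling differs.
print(human_waiter(tasks))
print(chip_waiter(tasks))
```

The point of the sketch is that the chip’s strategy only pays off when there really are independent workers available; a single human attention running `chip_waiter`-style dispatch is exactly the mismatch the passage describes.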

The first place we feel this is in our capacity to think and perform effectively. There have been more than enough studies done and books written about distraction and multitasking for us to accept, however begrudgingly, the basic fact that human beings cannot do more than one thing at a time.34 As Stanford cognitive scientist Clifford Nass has shown pretty conclusively, even the smartest university students who believe they are terrific at multitasking actually perform much worse than when they do one thing at a time. Their subjective experience is that they got more done even when they accomplished much less, and with much less accuracy. Other studies show that multitasking and interruptions hurt our ability to remember.

We do have the ability to do more than one thing at a time. For instance, we each have parts of our brain that deal with automatic functions like breathing and beating while our conscious attention focuses on a task like reading or writing. But the kind of plate spinning we associate with multitasking doesn’t really happen. We can’t be on the phone while watching TV; rather, we can hold the phone to our ear while our eyes look at the TV set and then switch our awareness back and forth between the two activities. This allows us to enjoy the many multifaceted, multisensory pleasures of life—from listening to a baseball game while washing the car to sitting in the tub while enjoying a book. In either case, though, we stop focusing on washing the car in order to hear about the grand slam and pause reading in order to decide whether to make the bath water hotter.

It’s much more difficult, and counterproductive, to attempt to engage in two active tasks at once. We cannot write a letter while reconciling the checkbook or—as the rising accident toll indicates—drive while sending text messages. Yet the more we use the Internet to conduct our work and lives, the more compelled we are to adopt its processors’ underlying strategy. The more choices are on offer, the more windows remain open, and the more options lie waiting. Each open program is another mouth for our attention to feed.

This competition for our attention is fierce. Back in the mid-1990s, Wired magazine announced to the world that although digital real estate was infinite, human attention was finite; there are only so many “eyeball hours” in a day per human. This meant that the new market, the new scarcity, would be over human attention itself. Sticky websites were designed to keep eyeballs glued to particular spots on the Internet, while compelling sounds were composed to draw us to check on incoming-message traffic. In a world where attention is the new commodity, it is no surprise that diagnoses of the formerly obscure attention deficit disorder are now so commonplace as to be conferred by school guidance counselors. Since that Wired cover in 1997, Ritalin prescriptions have gone up tenfold.

Kids aren’t the only ones drugging up. College students and younger professionals now use Ritalin and another form of speed, Adderall, as “cognitive enhancers.”35 Just as professional athletes may use steroids to boost their performance, stockbrokers and finals takers can gain an edge over their competitors and move higher up on the curve. More than just keeping a person awake, these drugs are cognitive accelerators; in a sense, they speed up the pace at which a person can move between the windows. They push on the gas pedal of the mind, creating momentum to carry a person through from task to task, barreling over what may seem like gaps or discontinuities at a slower pace.

The deliberate style of cognition we normally associate with reading or contemplation gives way to the more superficial, rapid-fire, and compulsive activities of the net. If we get good enough at this, we may even become what James G. March calls a “fast learner,” capable of grasping the gist of ideas and processes almost instantaneously. The downside is that “fast learners tend to track noisy signals too closely and to confuse themselves by making changes before the effects of previous actions are clear.”36 It’s an approach that works better for bouncing from a Twitter stream to a blog comments field in order to parse the latest comments from a celebrity on his way to rehab than it does for trying to solve a genetic algorithm.

But it’s also the quiz-show approach now favored by public schools, whose classrooms reward the first hand up. Intelligence is equated with speed, and accomplishment with the volume of work finished. The elementary school where I live puts a leaf on the wall for every book a child reads, yet has no way of measuring or rewarding the depth of understanding or thought that took place—or didn’t. More, faster, is better. Kids compete with the clock when they take their tests, as if preparing for a workplace in which their boss will tell them “pencils down.” The test results, in turn, are used to determine school funding and teacher salaries. All children left behind.

The more we function on the level of fast learning, the less we are even attracted to working in the other way. It’s a bit like eating steak and potatoes after you’ve had your chocolate. And the market—always acting on us but especially so in the artificial realm of the Internet—benefits in the short term from our remaining in this state, clicking on more things, signing into more accounts, and consuming more bytes. Every second we spend with just one thing running or happening is a dozen market opportunities lost.

The digital realm, left to its own devices or those of the marketplace, would atomize us all into separate consuming individuals, each with our own iPhone and iTunes account, downloading our own streams of media. Our connections to one another remain valuable only to the extent they can be mediated. If we are kissing someone on the lips, then we are not poking anyone through our Facebook accounts.

Of course, many of us try to do both. I have a friend, a psychotherapist, who often conducts sessions with patients over the phone when circumstances prevent a live meeting. What concerns me most about these sessions is the freedom he feels to send me text messages during those sessions, telling me to check out something he just noticed online! After I recover from the distraction from my own workflow, I start to feel depressed.
