When McCann Erickson created the “Make Facetime” campaign for Dentyne brand-owner Cadbury in September 2008, they were aiming straight for the kisser. The idea that the ads could induce under-twenties to swap their technology for a stick of gum and a good old-fashioned chin-wag was always going to be hard to swallow . . . not to mention impossible to digest. (“I think most college kids would roll their eyes,” commented one sociologist drily.) But the fact that it was tried at all is interesting. So is the site itself, which includes a couple of not-entirely-user-friendly social-networking utilities—well, they stumped this Digital Immigrant—and something called the Smiley Chamber of Doom, which shows animated emoticons being maimed and tortured. (I’m down with that. ☺) Oh, and it really does cut out after three minutes. Which is kind of amazing and also kind of annoying—especially if you happen to be taking notes.
Closer to home, Dôme Coffee—the Australian-owned café chain—is running a series of magazine ads with a strikingly similar anti-tech/pro-talk theme. “One friend face to face beats 100 on Facebook,” reads a Confucius-like headline on a recent full-page ad. It features a photo of a cluttered lunch table and two broadly smiling women friends in their thirties ... staring at mobile phone screens.
(I show the ad to Suss. “See anything wrong with this picture?”
“Is it something about feminism?” she asks, warily.)
The information paradox—that the more data we have, the stupider we become—has a social corollary, too: that the more frantically we connect, one to another, the more disconnected our relationships become. We live in an age of frenzied “social networking,” where up to a quarter of us say we have no close confidante, where we are less likely than ever before to socialize with friends or family, where our social skills and even our capacity to experience empathy are undergoing documentable erosion.
Our, quote-unquote, family rooms are docking stations now. We have five hundred or six hundred “friends,” and no idea who our next-door neighbors are. We affiliate with “communities” based on trivia—a mutual appreciation of bacon, a shared distaste for slow walkers. And doing so in a spirit of heavy-handed irony hardly ennobles the enterprise. We have sold out social depth for social breadth and interactive quality for interactive quantity to become what playwright Richard Foreman calls “pancake people”: “spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”[2]
Or at least that’s one side of the argument. There are others who argue that our social connectivity is not fraying at all, but simply undergoing some much-needed rewiring. They point to the growth of online communities—from social-networking utilities such as the fascinatingly telegraphic Twitter, to the entire virtual worlds of Second Life and World of Warcraft. They show how new media are bringing families together with instantaneous digital contact via text, sound, image—or all three at once. (“Have you Skyped Grammy and Grandpa to say thank you for that birthday money yet?”) They remind us that for Digital Natives, time spent grooming one’s online relationships on Facebook or Twitter alone amounts to a sizable part-time job. That the happy confluence of wireless Internet and portable media means they are never alone, never out of touch. “Only connect” is what these people do.
So, are we really more connected and less alone than ever before? Perhaps the truth lies somewhere in the middle. But I don’t think so—and maybe that’s why it’s all so confusing. My own observations suggest that the truth lies at both extremes. “Information explosions blow things up,” remember? In this case, the land mine seems to have taken out the Via Media (literally, the middle way) altogether.
We are both much, much better connected, and in clear and present danger of forgetting how to relate. Well, I guess that’s why they call it a paradox.
Watching as my kids adjusted to the aftershocks of life without social media drove the point home again and again. Maybe a night on Facebook really has become the moral equivalent of standing around the piano singing show tunes. But while both experiences, and the skills and habits they call into play, are certifiably “social,” they also happen to be certifiably antithetical. Messaging, poking, posting, uploading, and gifting faux livestock to your friends can be absorbing, entertaining, even challenging. But “getting together” it ain’t. I knew that before, of course. But The Experiment just sort of turned up the volume—not just for me, but for all of us.
The impact on our relationships as a family was even more dramatic, as we found ourselves “tuning in” to one another in unexpected ways. We lingered more around the dinner table—and talked. We watched the fire together—and talked. We pulled out old photo albums—and talked. We played board games—and talked. We climbed into one another’s beds and read the paper—and talked. Are you getting my drift? Realizing that, to quote Anni, “There are people here. Let’s talk to them!” came as an epiphany to all of us, I think. For me, that provoked guilt and delight in almost equal measure. But hey ... isn’t that what being a parent is all about?
 
 
Conversation, studies show, is good for the brain. “No, duh!” as Sussy would say. But these days, it seems, some of us need convincing. According to UCLA neuroscientist Gary Small, talking to people face-to-face—as opposed to face-to-Facebook—provides “greater stimulation for our neural circuitry than mentally stimulating yet more passive activities,” including reading.[3]
A 2008 study found that subjects who’d spent ten minutes chatting with friends scored better on memory tests than those who’d spent the same amount of time watching TV or reading a book, and the same went for those who’d engaged in “intellectual activities,” in this case, solving puzzles.[4]
Think about it. More time spent in face-to-face conversations could mean your child remembers where he left his laptop charger.
Online chatting, on the other hand, has been linked to symptoms of loneliness, confusion, anxiety, depression, fatigue, and addiction. Says Small, “The anonymous and isolated nature of online communication does not provide the feedback that reinforces direct human interaction.”[5]
A study published in the journal CyberPsychology and Behavior found that shy people spent significantly more time on Facebook than more outgoing individuals—although they had fewer “friends”—and enjoyed it more too. The possibility of “a reliance of shy individuals on online communication tools” concerned the researchers, psychologists at the University of Windsor in Ontario, Canada.[6]
Half a world away, Japanese psychologist Tamaki Saito has coined the term hikikomori to describe a new breed of young social isolates. The Japanese Ministry of Health defines hikikomori as “individuals [80 percent are estimated to be male] who refuse to leave their parents’ house, and isolate themselves away from society and family in a single room for a period exceeding six months.” But the definition leaves out an important fact: Hikikomori are often, paradoxically, the most “connected” individuals in Japanese society.
Many hikikomori sleep by day and spend their nights reading manga, gaming, and surfing the Net, surfacing only to sneak into the kitchen for food while the family sleeps. An entire industry has sprung up to address the phenomenon—from parent support groups to online counseling services—but the epidemic continues to rage. In July 2009, Osaka police attributed to hikikomori a spate of street attacks by “apparently troubled people venting their frustration on total strangers.”[7]
One young man admitted he didn’t care who he killed. “I’d grown tired of life,” was his only defense. Experts believe hikikomori may turn to violence because of their lack of social skills. “Once they come to be considered weird, they prefer to be alone rather than feeling awkward among other people,” explains Toyama University academic Yasuhiko Higuchi. “They then commit an extreme crime after magnifying their stressful thoughts and having no one to talk to.”[8]
Many hikikomori have failed to form a proper relationship with their parents, he adds. Uh-oh.
Pre-Experiment, I mention the term hikikomori casually to Bill. He looks up briefly from his game—where a strapping youth is beating the cyber-crap out of a hulking avatar with incongruously girlish hair—and looks down again. “You’re pronouncing it wrong,” he mutters.
Turned out Bill knew all about hikikomori. In fact, he’d watched an anime series about them. “Really? Where’d you get that from?” I asked. “Um, Japan,” came the reply, the “duh” unspoken but implied. He’d downloaded the show from a file-sharing site. “So, what do you think of them, then?” I asked, in that annoying faux naive manner beloved of therapists, parole officers, and mothers.
“I think they’re cool,” he replied evenly. (Like most fifteen-year-old boys, he could spot a cautionary tale at twenty paces.)
“Are you kidding?” I sputtered. “They’re mentally ill! They have no life!”
He looked up once more, between body blows. Maybe he didn’t say, “It takes one to know one,” but it was there in his eyes.
I was still enjoying intermittent eye contact with my children (although Sussy, aka Thunder Thumbs, was developing an alarming facility for texting while doing just about anything: talking, eating, walking, and more than once, I swear, during REM sleep), but you didn’t need to be a detective with the Osaka police force to notice that our opportunities for sustained sharing had become increasingly nasty, short, and brutish. Their online world had become “the point”—of existence, I mean—and every other kind of interaction constituted a tangent. An interruption. I was conscious of how often I approached them with words like, “Can you just pause that for a moment and . . .” or, “After you sign out, would you . . .” or, “I don’t need you to log off, but . . .” It was as if life, real life, were a game they’d lost interest in after the first couple of levels.
Looking around our family room, at the children sitting frozen at their screens, I would be reminded of Swiss architect Max Frisch’s definition of technology: “The knack of so arranging the world that we don’t have to experience it.”
Relating socially, whether one to one or in groups, seems so fundamental to human nature. The notion that we might need to practice these skills—to practice being human, really—seems odd to me, and perhaps to you too. But neuroscientific evidence reminds us that the pathways in the brain that facilitate interpersonal skills, empathy, and sound social instincts are created, not born. In the case of individuals “who have been raised on technology, these interpersonal neural pathways are often left unstimulated and underdeveloped,” observes one expert.[9]
Despite their higher IQs and bulging thumb muscles, in other words, The Young and the Listless do show deficits in basic social skills such as empathic listening, and interpreting and responding to nonverbal cues in conversation.
Some observers have gone so far as to suggest technology may be driving us all toward a kind of social autism—wrapped safely but suffocatingly in our digital bubble wrap, uninterested in and/or threatened by the world outside, and supremely ill equipped to deal with it. Alarmist though it may sound, it’s not entirely far-fetched. In fact, recent research suggests there may even be a link between chronic technology use and clinical autism.
Whatever the cause, autism rates have skyrocketed during the digital age. Today, according to figures from the European Union Disabilities Commission, autism afflicts one in every fifty-eight children—an increase of up to 500 percent since records have been kept. Many theories have been advanced to explain the epidemic; almost all have been disproved. One that has not is the theory that started out as a parent’s hunch.
Michael Waldman, an economist at Cornell University, was devastated when his two-year-old son was diagnosed with autism spectrum disorder. But he was also skeptical. He’d noticed that since the birth of their second child a few months earlier, his son had been spending more and more time watching television. Privately, he wondered whether the boy’s socially phobic behavior was not a “disorder” at all, but simply an aggravated case of tuning in and . . . well, tuning out.
Waldman placed restrictions on the child’s media habits and had him retested. When his “condition” improved and then disappeared entirely, it seemed like a miracle. But economists, thankfully, don’t believe in miracles. Waldman cast about for a way to study his hunch about a link between autism and television viewing. And finally the answer came to him: rainfall data. Stay with me on this one.
Waldman reasoned that kids in rainier climates watch more TV—which is true, by the way—and therefore that regions with higher-than-average precipitation might also feature higher-than-average rates of autism spectrum disorder. He compared California, Oregon, and Washington—the rainiest states in the United States—with the rest of the nation, and he found his answer. There was more autism in these states. He then looked at only those families who had cable TV subscriptions in these high-precipitation regions, and the correlation was higher still.[10]
When Waldman’s study was published in the November 2008 issue of the prestigious Archives of Pediatrics & Adolescent Medicine, it provoked a perfect thunderstorm of abuse and criticism. But the data remain standing. A 2009 article in the Journal of Environmental Health concedes Waldman’s point about the link between rainfall and autism rates, but is more equivocal about causes. Perhaps the real culprit was not TV at all, but vitamin D deficiency, or increased exposure to household cleaners?[11]
