The Blind Giant
by Nick Harkaway

One of the other accusations made against digital technologies is that they foster depression, attention deficit disorders and other mental problems; but while computer games, online or not, have for years been scored in some magazines (and subsequently, of course, online) on a scale of ‘addictiveness’, and while some users report themselves as addicts, the notion of ‘Internet Addiction’ as an actual condition remains in doubt. There are various clinics that will help you to beat it, but the Diagnostic and Statistical Manual – the standard reference document of mental medicine – does not unequivocally acknowledge its existence. Many doctors argue that it is in fact a symptom of other, underlying disorders, and should be treated as such.

In 2007 Dr Vaughan Bell of King’s College, London, wrote in the Journal of Mental Health that although there were studies showing ‘pathological internet users’ reporting lower self-esteem, greater depression and suicidal impulses, loneliness and withdrawal, it was hard ‘to infer a direction of causality, and it is just as likely that anxious, lonely or depressed people might attempt to alleviate their distress by seeking online resources for entertainment, interaction, and sexual gratification.’ Bell also notes that while an initial study in 2001 found a small increase in loneliness, a follow-up found the reverse. It may also be that already extroverted people report greater involvement and connection from Net use, and introverted people report greater loneliness. In other words, the Internet on one reading is an exaggerator of pre-existing tendencies, rather than a push in a particular direction.
5

Regarding ADHD, Sir Ken Robinson, the renowned educationalist, expresses strong objections to medicating children on the scale presently common in the US and increasingly common in the UK: ‘Our children,’ Robinson said at an RSA talk in 2010, ‘are living in the most intensely stimulating period in the history of the earth … And we’re penalizing them now for getting distracted. From what? … Boring stuff.’ Moreover, children – and indeed adults – presently exist in a world where they are the subject of a sophisticated assault on their senses intended to attract their attention, that being, of course, the primary currency of many business models in the digital age. Attention has been studied, codified and tested; the desire to return to a given activity – be it playing Farmville or watching reality TV – is the yardstick by which many companies now measure success. In other words, until education and daily life, especially in the workplace, are enlivened by these kinds of considered, supercharged, attention-grabbing strategies, or until all parents are able to be ridiculously fascinating to children and teenagers (good luck), a certain amount of attention drift is going to be inevitable.

One of the most necessary skills in a time when information is all around us is the ability to pick one topic and follow it through the noise. That skill may well only be gained from having to do it; the question then becomes, of course, one of motivation – or parental engagement. And indeed, as Nicholas Carr concedes, there is evidence of some gains in this area; he points to a study that showed that British women searching online for medical information were able to make an accurate judgement in seconds of whether a given page contained useful information. Consider, for a moment, the general applicability of the ability to sort fluff from gold in the world we live in.

At the same time, what the detractors of digital media see in social networks and digital technologies is not what I see: Susan Greenfield’s description of ‘screen life’ is a lonely and empty one in which users of Twitter post statements about themselves in an endless quest for almost existential reassurance that they matter; they act only in order to record the action and share it. They emit trivia in an endless, pointless stream of self-cataloguing and sub-gossip, from the moment they brush their teeth in the morning to the moment they fall asleep again. They exist to see themselves reflected, and live in retreat from physical socializing, and from a sense of self. It’s the beginning of the nightmare world; and yes, if it’s emerging from our interaction with digital technology, that would represent a kind of modern version of Henry David Thoreau’s statement that most of us live lives of quiet desperation.

According to this picture, the arrival of a new medium has allowed us to stop being so quiet about our horror, and scream it instead into a strangely comforting void that echoes it back to us and tells us we are not alone. I don’t see, incidentally, that that would be so damning a statement of Twitter. If its sole function were to soften the crushing weight of human pointlessness, that would be fine by me. (I shouldn’t be flippant: if the picture is accurate, the relationship between human brain and machine in this context is actually an echo chamber causing a kind of atrophy, a shrinking of the self.)

There’s no suggestion, incidentally, that computer use is like smoking – that it is an inherently bad activity that will cause you to develop negative symptoms however you use it. Rather it is a question of balance. Anything that comes to occupy a dominant position in a person’s life may be problematic, be it a computer or an exercise regimen. In this case, the suggestion is that social media are displacing ‘real’ interaction, and that the gratification of interacting with a computer, be it browsing or playing games, is displacing more fulfilling human activity. The negative effects on a person’s life of a game like Everquest – a precursor to the ubiquitous World of Warcraft, which was sometimes referred to by players as ‘Evercrack’ for its compelling quality – seem to detractors to be a cause for concern just as much as anything more regularly thought of as addictive or damaging.

But that isn’t the character of, for example, Twitter as I’ve experienced it. First of all, the information that someone on the other side of the world is brushing her teeth holds no interest for most of us. Such a stream of drivel would likely be met with silence at best. More, though, while Twitter was originally conceived as a micro-blogging site – the input box still invites you to tell the world what you’re doing right now – it has become in the hands of its users something different. Far from being a collection of disconnected individuals yowling into the night, it’s a sea of communities, loose-knit but very engaged and very real. There is research that suggests that users of the Internet and social media sites are less alone than otherwise, and that Facebook, for example, is a tool for the maintenance of relationships rather than a replacement for them.
6
Twitter, in my life, is also a place to seek, receive and impart information and ideas regardless of frontiers.

If that last phrase sounds familiar, it’s because it comes from Article 19 of the Universal Declaration of Human Rights. Twitter, after all, was part of the revolutions in the Middle East in early 2011. It wasn’t the reason they happened, but it was a factor. The idea that it is primarily a vehicle for banality seems a little ungenerous.

I use Twitter as a research tool and as what my wife describes as my ‘office water cooler’: in breaks between bouts of work, I can bring up my Twitter page, send out a few goofy messages, discover what the wider world is thinking about (I follow the feeds of a large number of people, many of whom hold opinions I think are silly, wrong-headed or even just obnoxious) and exchange a few good-natured jibes with fellow writers, publishing folk or whoever happens to be around. I ask questions like ‘What was that military exercise where the US had that marine guy who handed the conventional forces their heads on a plate?’ and a few moments later receive the answer. In other words, I don’t feel a need to tell Twitter about the detail of my life; I do sometimes report on myself, but not endlessly or (I hope) tediously; I share ideas, seek answers, encourage others, occasionally assert dissent. I don’t need to be reflected – in fact, I want to encounter difference – although, yes, from time to time I’m glad of the reassurance that comes my way from my peers.

If I’m distracted by it, that distraction is generally a profitable and useful moment of relaxation before I re-enter the hugely enjoyable yet exhausting state of concentration that I go into for creative production. I want to know that there are other people in the world beyond the walls of my office, and talk to them before I try to finish a chapter. I am aware, however, that this is not typical, and that there are others who do use social media to avoid contemplation of a frightened, diminished self; but the same can be said of other activities more generally considered ‘wholesome’, such as sports, socializing or watching theatre. As with so many other human activities, whether social media are ‘bad for you’ seems to depend a great deal on the pre-existing circumstances of your life, your emotional and psychological well-being.

In Japan, there is a word: hikikomori. It means ‘withdrawal’, and signifies a group of people who have literally closed the door on the outside world and gone into seclusion. They are mostly but by no means entirely young men – according to the New York Times, 20 per cent of them are female – and they often communicate only via computers or phones. Hikikomori have issues of self-disgust, but also feel the world pressing in upon them, observing them (paranoia is often seen as an indicator of issues of insignificance and powerlessness). The American writer Michael Zielenziger has likened the problem to post-traumatic stress, while others have compared it to anorexia; the leading Japanese researcher, Dr Tamaki Saito, suggests that it begins with a desire to ‘stop growing up’. Interestingly, the Japanese do not link it causally with digital technology; instead, they see it as part of a wider malaise, and some argue that it is cultural rather than purely psychological – in other words, that it is not a mental illness, but a strange reaction to a situation that, if only in a certain light, makes a kind of sense.
7

It’s hard not to wonder about that in the context of the riots in London and the rest of the UK in the summer of 2011. Those ugly outpourings were described at the time as ‘pure criminality’, an explanation that carefully explains nothing. Depending on who you talk to, Japan is only just recovering from one or even two ‘lost decades’ resulting from an economic crash precipitated by an investment bubble – a national trauma. As the European economy dances along the edge of collapse and Britain’s banks look nervously at their continental liabilities, it’s not unreasonable to ask what form a sense of hopelessness might take in the United Kingdom after a twenty-year dip in prosperity. And would we (do we) blame our own hikikomori-equivalents on the Internet rather than looking more deeply?

The word people use for someone who chooses not to socialize much is ‘introverted’, or sometimes ‘introspective’. But introspection, the cognitive consideration of the self, is one of the things detractors of the digital realm are concerned may be slipping away in a tide of stimulation and chatter, a casualty of a dependence on seeing oneself reflected in the eyes of others. Certainly, it’s not inconceivable that individuals might suffer this, though I question whether society as a whole is likely to do so (unless society as a whole has always done so). Moreover, this is a perception which relies for its weight on the idea that we must be whole and complete as single individuals, and that our self-perception must come from within, from a kind of interrogation and self-scrutiny tied to the capacity for abstract thought derived from the relationship with text. It’s an idea that pervades the objections to social media and the world of hypertext (as opposed to static, printed text): that self is a private journey, and humans are individual and alone.

For what it’s worth, that idea probably also lies at the root of the individualism in our financial services which has recently produced some unpopular results. But it’s not necessarily an accurate perception of what it means to be human. That question requires a little closer examination, which we’ll come to. In the meantime, file under ‘uncertain’ the notion that any of us ever develops our sense of who we are without constant reference to the people around us.

That’s not to say that the use of digital technology doesn’t have consequences for how we work, and, indeed, how we think. A study by Assistant Professor Betsy Sparrow at Columbia University, reported in Science magazine, found that we have incorporated the possibility of Googling answers into our mental model of the world; our brains assume the possibility is there and don’t bother to store for longer-term retrieval facts that we know we can get through a search engine. Google has taken its place among our tools and become part of our way of thinking. However, the study also found that if we know in advance that we won’t be able to access those facts, we tend to remember them much better. It seems we still have the option – and the trick – of remembering information.
8

The issue with digital technology is not that interaction with it is inevitably good or bad, or has a particular effect, but rather that some effect is likely – just as it would be if you spent a great deal of time shooting basketball hoops, listening to music, or tasting fine wines – and the nature of that change depends greatly on the way you approach the interaction; in a very real sense, mood is a significant factor. If you choose to be passive or are predisposed to passivity, the technology can work with that, and you will practise passivity and get good at it. If, on the other hand, you approach your Internet use actively, you’ll get better at being intellectually and emotionally engaged. It seems that the introduction of digital technology into a given arena strengthens and makes obvious patterns that were already present.

So why do some people see digital technology – or perhaps any technology – as a malign force?

And, conversely, why do others become so irate at the very notion that anything to do with the Internet could be bad?
