The Blind Giant

Nick Harkaway

While we’re on the topic of Newton, it’s worth observing that even science is somewhat influenced by the dominant societal perspective, however much researchers and theorists themselves
seek to avoid it. Newton apparently drew the idea of gravity between planets from his study of the alchemical notion of the attraction of souls; he was culturally ready to consider gravity in that particular way before he began to do so. If he had developed his understanding of gravity from a different background, would he have seen its function differently, giving us a different view of how gravity interacts with the other forces at play in – and creating – space and time?

So as we look at the relationship between society and digital culture – a distinction that is in any case hard to make in anything other than a rhetorical way, digital culture being an aspect of society, not an invasion from an alternate reality – it’s worth remembering that they make one another, and that things attributed to the digital realm are very often just more blatant examples of things happening all over. The reason this book exists at all is that our modern world seems to be completely permeated with digital devices and digitally mediated services, and it can seem as if the machines are taking over, or at least as if we’re being changed for good or ill without really seeing how it is happening. I’d say that isn’t right: we’re changing, for sure – we always have, and we should hope we always will – and some of those changes are contingent in their form on technology.

But that’s not to say that technology is the root of what’s happening. It isn’t. We are, as part of a cycle of development and change. And that false separation of us from our technologies – whether those technologies are physical ones like the iPhone or satellite television, or mental and societal ones like investment banks and governments – lies at the heart of a lot of what makes us unhappy and afraid in our world. One of the great benefits of digital culture is the growing awareness that we are not separate from one another or from the institutions we have made to do things for us. We are our technology. We just have to reach out and take charge of it, which is vastly easier to do when you know
there are 200,000 people thinking very much the same. Twitter isn’t about letting your favourite movie star know that you daydream about him when you’re brushing your teeth. It’s about knowing what everyone else is thinking throughout the day and seeing your own opinion resonate – or not – with a large group. And from that knowledge can come a campaign to save a TV show, or a student protest, or a revolution.

Technology, used in the right way and with the right understanding, makes us more who we are than we have ever been. It has the potential to allow us, not to take back control of our lives and our selves, but to have that control in some degree for the first time ever. Hence, this is a moment of decision – a moment we have been moving towards for a while. We have to choose to take control.

4
The Plastic Brain

In 2011, Baroness Susan Greenfield told Britain’s House of Lords that she feared the immersion of children in ‘screen life’ could be detrimental to their development. She also expressed a more general concern as to what our relationship with digital technology was doing to us as a society and as individuals. As Professor Greenfield explains it – behind the title Baroness is another, more conventional one: she is Professor of Synaptic Pharmacology at Lincoln College, Oxford, and a specialist in the physiology of the brain – the structure of the individual human brain is determined by genes, environment and practice. Who you are accounts for a certain amount; then it’s where you are and what you do with yourself. The degree to which we can control the first two is questionable – but not so the third. That is directly affected by how you spend your time, and her fear was and is that the use of digital technology in our society is potentially harmful. One of her chief concerns is that we will become ‘all process’: that we will cease to connect events in a narrative and live from moment to moment, gratification to gratification. Another is that our social interactions will suffer, becoming performative – done in order to be reported – or inauthentic, geared to the screen and not the flesh.

In some ways it’s a familiar worry: when I was younger, it was suggested that television would turn us all into human lab rats endlessly pushing the ‘pleasure’ button. In others, it’s a far more serious notion, proposing that we may accidentally climb back
down the ladder of brain evolution to a new version of pre-literate culture and existence while we outsource our serious thinking to machines, remembering things by storing them, letting machines – or, rather, software running in machines – make administrative decisions about utilities, tell us what to buy and what to like, which political parties best represent our interests, who to talk to and who to be friends with. Professor Greenfield is at pains to say that her concerns are theoretical rather than based on strong research evidence, and indeed that research is precisely what she proposes the government should undertake.

In the same vein, Nicholas Carr (like Sven Birkerts in The Gutenberg Elegies) warns of the death of ‘deep reading’ – the focused, single-tasking, immersive style of reading he remembers from the days before the intrusion of the Internet. He feels that we are passing through a shift in the way we think, and mirrors the concerns expressed more gently by Dr Maryanne Wolf (Director of the Center for Reading and Language Research at Tufts University) in her book Proust and the Squid: that this shift in how we live and work will change the architecture of the brain itself, and thereby alter what it means to be human.

It sounds dramatic, but the brain is a versatile and even, to some extent, volatile organ. It does, even in adulthood, alter its shape to take on new skills and abilities at the cost of others. The phenomenon is called ‘neuroplasticity’, and it is actually – to a layman’s eye – remarkable. By way of example: the posterior hippocampus – the region associated with spatial memory and navigation – of a London taxi driver, seen in a magnetic resonance image, shows pronounced enlargement.[1] Taxi drivers learn the streets and the flow of traffic, and that learning is reflected in the actual physical structure of their brains. In fact, whenever you learn a new skill, the brain begins to devote resources to it. Practice may not make perfect, but it does increase your aptitude for a particular task by building up the area of the brain responsible for executing it.

Perhaps the most extreme example – if not in terms of neurophysiology then certainly of practical application of the brain’s adaptability – is an American man named Daniel Kish. Kish is something of a phenomenon himself: born with a cancer of the eye, he has been completely blind since before he was two. He functions to all intents and purposes as if he can see, however – riding a mountain bike, identifying different objects at a distance, moving with confidence through space – by using echolocation. Kish actually clicks his tongue and uses his hearing – his ears are biologically ordinary – to receive a signal telling him where he is and what is around him. He has learned to interpret this so accurately that he can weave through traffic on his bike. He cannot, obviously, use this skill to read printed text or perform any other task specifically geared towards perception using light. On the other hand, his perception is not restricted to the normal field of vision. He has also passed the skill on to a new generation of echolocators; it is not something specific to Kish, however remarkable he may appear. It’s an ability you can learn.[2]

Having said that, it is important not to overstate the extent of neuroplasticity. Steven Pinker, author and Johnstone Professor of Psychology at Harvard, points out in The Blank Slate that ‘most neuroscientists believe that these changes take place within a matrix of genetically organised structure.’ However impressive the flexibility of the brain, there are limits. ‘People born with variations on the typical plan have variations in the way their minds work … These gross features of the brain are almost certainly not sculpted by information coming in from the senses, which implies that differences in intelligence, scientific genius, sexual orientation, and impulsive violence are not entirely learned.’ The question is how far the smaller changes within the brain can take one’s identity before the brick wall of genetic structure is reached.

The issue for Carr, Greenfield and others is that we may unknowingly be moving away from the very development that
made us what we are. Reading is an act of cognition, a learned skill that is not native to the brain. We are not evolved to be readers. Rather, the brain reshapes itself to meet the demands of the reading skill, forming connections and practising it – just as you’d practise throwing and catching – until it is instinctive. You begin by spelling out words from letters, then ultimately recognize words as whole pieces, allowing you to move through sentences much faster. That moment of transition is the brain reaching a certain level of competence at the reading operation – or, rather, at the conventional reading operation, in which the reader consumes a text that is inert. Ostensibly, at least, traditional text cannot be re-edited on the go and contains no hypertextual connections that might distract you from concentrating on what is there and incorporating the information in it into your mind, or imagining the events in a fiction.

Text in the age of digital technology is somewhat different. It is filled with links to other texts, which the reader must either follow or ignore (a split-second decision-making process that, according to Carr, breaks – however briefly – the deep state of concentration at the core of the reading experience). Worse, the text is in competition with other media in the same environment – the device – so that email, phone calls and Twitter can interrupt the smooth uptake of what is on the page; and that’s not just an issue for anyone who wants to read a thriller without losing the thread. Reading – not in the cultural sense, necessarily, though that’s no doubt a part of it – has had a profound effect on us as individuals and therefore on our societies.

The evolution of
reading and writing, in concert with our own, seems to have triggered a subtle but vastly significant change in what it means to be human, allowing a greater sense of separation from one’s own knowledge and a greater sense of the individual self. Some thinkers suggest that written language defined a new age of the singular individual, where before our thought was more immediately experiential and our sense of self
was fuzzier, more identified with the group. Written and read thought allowed us to see ourselves and our ideas from the outside, and began the long journey to technological society. What, then, will happen to us if we abandon it?

Apart from anything else, a recent study by researchers at the University at Buffalo suggests that reading increases empathy – or even teaches it. On the one hand, the experiment is slightly alarming: reading Stephenie Meyer’s vampire novels causes you to identify more closely with words like ‘blood’, ‘fangs’ and ‘bitten’, which seems to imply that readers are empathizing with the indestructible and tortured undead; but I did that when I was fifteen and it doesn’t appear to have warped me too much. On the other hand, it seems that what is learned is the forming of an emotional connection in general, rather than the creation of a connection just with those characters.[3]
What isn’t clear – I suspect because it’s outside the scope of the study – is whether this is a consequence of reading specifically or of concentrating on a narrative in any form. Does this effect not occur with film or video game narratives? Perhaps not: those forms are apprehended directly through the senses rather than being taken in cognitively, so maybe there is a difference. Then again, perhaps there isn’t. But the spectral possibility that reducing the amount of simple, disconnected reading we do might also reduce our capacity to empathize is worth spending some time and government money to rule out.

This kind of concern – like many others in the digital debate – is familiar.
Plato records
Socrates inveighing against the notion of mass literacy, reportedly worried that if the population could read and write, they would cease to bother to remember. Their thinking might be jeopardized, too, as the new technology of writing created in them a kind of false consciousness, a simulated cognition derived from what they read rather than a real one
produced by consideration of the issues from first principles. The worry might seem outlandish – except that it’s exactly the same as the one we’re discussing now – but if some modern accounts of our brain’s history accurately depict what happened, then Socrates was absolutely right. He was even right to imagine that the nature of thinking would be fundamentally altered by literacy. But he was wrong – at least superficially – in his dire prediction of a society made ignorant by letters.
