The feeling I had looking at my first silverback gorilla in the wild was vertiginous. It was as if there was something I was meant to do, some reaction that was expected of me, and I didn’t know what it was or how to do it. My modern mind was simply saying, “Run away!” but all I could do was stand, trembling, and stare. The right moment for something seemed to slip away and fall into an unbridgeable gap between us and the gorilla, leaving us gawping helplessly on our side.
Douglas felt deep in his bones that when he looked into the eyes of a gorilla he was seeing a fully conscious creature. Jane Goodall said that Douglas wrote very movingly of the gorillas’ lives in the wild and did not ask, like Bentham would have, can they reason? He asked instead: can they suffer?
The charities that are concerned with the welfare of the mountain gorilla and the rhino were delighted to have Douglas as a supporter. Indeed, he became a patron of Save the Rhino International, an energetic organization dedicated to preserving this amazing creature which is in great peril from loss of habitat and, more especially, the depredations of poachers.*
Tirelessly he proselytised among his well-heeled mates and his millionaire technology contacts about both causes,*
and managed to persuade Bill Gates, no less, to give $10,000 towards saving the silverback gorillas. This was a much greater investment of effort (and of vulnerability) than merely writing a cheque, not that there was anything mere about Douglas’s cheques.
In 1995, at the behest of the founders of Save the Rhino International, David Stirling and Johnny Roberts, Douglas was even persuaded to put on a rhino costume and go, accompanied by his sister, Little Jane, for a sponsored climb up Mount Kilimanjaro in Tanzania. The rhinoceros has been around for forty million years, and we cannot invent any more of them when they are gone. This is true of all the animals on the CITES list of endangered species. Various kinds of macaws, gibbons, parrots, tortoises, snakes, tigers, rhinos—the list is far too long to be recounted here—will vanish from the world forever. (Inconceivably, there is a trade in stuffed baby tigers.) Douglas was dismayed that the disappearance of the white rhino is a real possibility, and for that he was willing to don the suit. This extraordinary contraption had been on an outing already in New York’s marathon. It had been designed by cartoonist Gerald Scarfe*
for a stage play about saving the rhinoceros. William Todd-Jones, the Welsh puppeteer and actor, wore it in that production, which is how he also came to be on the sponsored walk. He made friskier progress than Douglas. The insides of the suit reeked of sweat and Dettol; wearing it in the African heat was a torment.
Douglas was very game and toiled along for miles and miles, slathered in sunblock so potent that it must have been the pharmaceutical equivalent of tinfoil. But he was too large and pink for the tropics; the rhino suit weighed 30lbs and the heat could reach over 100°F. He did not reach the top. Kilimanjaro, he explained, is the tallest mountain in the world for, although it is “only” 19,340 feet, it erupts from the ground virtually at sea level whereas Everest starts from the already over-achieving foothills of the Himalayas. He was particularly delighted by the response they got from the children they met on the climb. They shrieked with the kind of happiness, Douglas records, “that we in the West are almost embarrassed by.”
Douglas’s immersion in life in all its variety led him to some serious thinking about its place in the universe. Cosmology is a beautiful subject that concerns itself, inter alia, with the origins of the universe and its ultimate fate. Like it or not, these are questions that are also addressed by the religions of the world. Many cosmologists are embarrassed by this overlap, perhaps because they fear they will be tarnished by the touch of woolly religious thinking, or maybe because they fear encroaching on areas not susceptible to scientific method. Yet these are the very questions on which popular sentiment most often pushes them to express an opinion. We have an appetite for such knowledge.
There is an argument running through the discipline that unavoidably lends itself to the use of that multiply freighted “God” word; it’s called the Anthropic Principle. The argument goes like this: the current state of the universe is the end product of an inconceivably long chain of causal connections. Every link in that chain happened because the conditions were right for it to do so—and not just right, but precisely right—fine-tuned to a degree of scarcely imaginable precision. For instance, if the acceleration of gravity were a different value by the tiniest margin, or ditto the charge on the electron, then our universe could not have evolved to the state we observe. Indeed if any of the fundamental constants*
recognized by science were changed even infinitesimally, our universe would be very different, or might not exist at all or, if it did, not for long enough to allow for complex evolutionary processes. Similarly, if our planet had an orbit just a smidge further from or a little nearer to the sun, it would have been unsuitable for the evolution of life. If there had not been a planetary collision (the odds against which were literally astronomical) to form our moon and stabilize the Earth’s axis of rotation, then it’s likely the planet would be sterile. And so on, through any number of benign coincidences. The world has to have the exact properties it has, otherwise we would not be around to observe it.
The Anthropic Principle can be expressed in varying ways, but in its strongest version it looks at the cosmos and says: it’s too much to expect that these perfect conditions for the emergence of life are a coincidence. Life seems to be the purpose of the universe. There must have been a Designer. Of course, the designer need not be identifiable with any parochially defined deity (no angry old geezers on thunderclouds yelling “Thou shalt not”). It could be something more abstract, like initial conditions, or the laws of physics. But however hard you try to finesse it away, there appears to be, according to the argument, an organizing principle at work that favours life, something that gives the universe meaning.
Douglas hated the Anthropic Principle with its comforting notion of putting life, and man, centre stage in the infinity of space. Only Zaphod Beeblebrox, who had laughed in the Total Perspective Vortex, could be that arrogant. Yet a number of extremely smart physicists and astronomers believe something along the lines of the Anthropic Principle, though it has always struck me as a bit like saying that if you win the lottery (i.e. you are the beneficiary of a hugely random process) then you have somehow called the lottery into being by persuading yourself that it has been organized just for you.
He and I would sometimes discuss it, and wonder at its wishfulness. “Whatever happened to Occam’s Razor?” we would sigh, as we stuffed food into our faces. “The Big Bang stands on its own—any further entity we posit beyond it is unknowably hidden by the event horizon and hardly in a position to take much interest in our affairs. Surely the Anthropic Principle is a confusion between outcome and purpose . . . Another bottle, do you think?” (We did gossip about trivia too.) A strong urge to believe a proposition does not after all constitute evidence for it. The fact that if the proposition were true it would fulfil needs both obvious and subtle does not make it the case. Consider the proposition that you have a million dollars in the bank.
Once I gave Douglas a copy of Before the Beginning by the astronomer Martin Rees. In that smashing book, Sir Martin suggests that there are sound theoretical reasons for believing that at the time of the Big Bang not just our universe, but an infinity of others, were created. Most of those literally innumerable universes would not have been viable, so the fact that we inhabit one we can observe becomes less surprising. His analogy was that it’s like coming across a tailor’s shop with a billion suits; finding one that fits perfectly is not so remarkable.
Douglas worried away at this idea. He considered the universe to be more wonderful than any inevitably anthropocentric religious account of it could possibly be. His thought experiment about the Anthropic Principle was one of his favourites, occasionally appearing in print and featuring frequently in his lectures.
What if, he suggested, a puddle on a rock were by some fluke to stir into consciousness? Gosh, it would think, as it looked about itself: how perfectly I conform to this environment. There’s not a molecule out of place. This rock suits me precisely. Strewth, it couldn’t be a better fit if it were designed for me. Can it be a coincidence? I think not. Somehow I must be the whole purpose of this rock, part of a vast mountain range on an enormous planet I see. What an important puddle I am.
Then as the sun comes out, the puddle starts to evaporate. But even as it shrinks it continues to congratulate itself on fitting into its habitat with uncanny precision. Eventually the puddle, to the very end convinced of its central role in the existence of the universe, disappears without ever waking up to the bigger picture.
Douglas’s didacticism was always leavened by great doses of humour. As we’ve noticed before, in other circumstances he would have been an inspired teacher.
THIRTEEN
The Digital Village
“The best way to predict the future is to invent it.”
ALAN KAY
“An assumption is something you don’t know you’re making.”
DOUGLAS ADAMS
In 1990, the year Prime Minister Margaret Thatcher resigned (if that’s not too passive a word), Pan with Faber & Faber published the delicious The Deeper Meaning of Liff by John Lloyd and Douglas Adams. Like their first book, it contained hilarious definitions that would otherwise have been hanging about on street corners getting into trouble. Unlike the first, this one included maps devised by Trevor Bounford, an experienced designer, who underwent some pain when George Sharp, Pan’s Art Director, commissioned him. Trevor recalls: “The brief was to make maps that were deliberately unhelpful, which we managed to achieve despite years of training in producing the opposite.”
Then in 1992, Mostly Harmless, the fifth and final novel in the Hitchhiker’s “trilogy,” was published. The anguish of its creation has already been described. There were spin-offs of various kinds, but this was the last new book that Douglas published in his lifetime.
He had done ironic detachment. He was clean out of cosmic jokes. His hair was going rather thin on top. He was married at last. He even had an office. Now he would discharge his existing contract, give up the painful writing business, and become a futurologist or a games designer/computer consultant. Douglas had moved on.
The world was changing, and at a vertiginous pace. “Packet switching” so that computers could talk to each other in standardized units of information had been around for decades, and so had the various precursor incarnations of the Internet (like ARPANET), but their use had been largely limited to academics, scientists and the military. Then in 1989 Tim Berners-Lee at CERN devised a universal language (HTML) and the World Wide Web was born. In 1994, the first really powerful browser became available from Netscape. “Are you on-line yet?” was the question on the lips of every young urban professional, with the United States, Japan and Germany leading the way. The number of users grew exponentially.
Communications technology not only changed how business was done, and hence how whole economies functioned, but it also enabled people who were geographically, economically and culturally remote from each other to be united by shared interests and form communities that had never been possible before in the history of the world. Revolution is an abused word, but the technological changes of the last two decades of the twentieth century were revolutionary.
Ever since he had gazed like a love-struck adolescent at his first Macintosh in the offices of Infocom in Boston in 1983, Douglas had seen much of this coming. The legend is that he bought the first Apple Mac to be sold in the UK—and the second too. Stephen Fry claims he acquired the third. Douglas had gone to Boston to work with the renowned games programmer, Steve Meretzky,* on the first Hitchhiker’s computer game and had seen the potential with remarkable prescience. Without any encouragement, he would go into lecture mode and—depending on their mindset—either bore or enthral his friends about the coming Cyber Age. He was passionately interested in all aspects of IT, and especially in the things it could do that the human mind could not.
Computers, for instance, have the ability to crunch through arithmetical calculations with much greater speed and accuracy than the human brain, and in staggering volume. This makes them ideal for sorting mountains of data according to instructions that must be drearily precise; otherwise, computers are just irritating machines that have no ability to construe the user’s intentions. But unless you take the view that in the end a quantitative change becomes a qualitative one, this capability is not different in kind from that of the human brain. Douglas was much more intrigued by those powers of the new technology that might represent an evolutionary step forward for the species. Famously he defined a computer by what it was not: not a television, not a typewriter, not a calculator, and certainly not a brochure (when linked to a website)—though it could certainly fulfil all those functions.*
What it is, he decided, is a modelling device. “Once we see that,” he wrote in The Salmon of Doubt, “we ought to realize that we can model anything in it. Not just things we are used to doing in the real world, but the things the real world prevents us from doing.”
Douglas wasn’t just talking about how a supercomputer could simulate the collision of galaxies, the flow of air over a wing, or the fission of an atom, all of which are for all practical purposes impossible to do in the real world with unaided brainpower. He was fascinated too by the computer’s ability to model the emergence of complexity by doing the same thing again and again very quickly. Douglas saw that the old paradigms of how we talk to each other were radically altered by IT. “One to many” communication abounds (telly, newspapers and so on) but the Internet for the first time makes “many to many” possible, allowing us to experience distributed intelligence. Even everyday equipment enables us to gain access to the resources of hundreds or thousands of other minds. What if, he wondered, we could carry a device with terabyte storage—just like the Hitchhiker’s Guide in fact—that was constantly updated in real time with information, experience, insights, reviews, jokes even, of a community of fellow owners of a similar device? What’s more, the device would interact with thousands of other data-storage devices.
Douglas’s favourite example of what carrying such a device would mean was that you could be driving along a remote road in Texas, or Surrey for that matter, when the gadget would talk to you. It would “know” where it was from GPS, and it would also have a detailed profile of your domestic possessions and an intelligent model of your interests. Having recorded what you had done in the past, the device could infer what you wanted in the present. By interrogating some retailer’s inventory management system, or perhaps via one of the community of other users, it would alert you to the fact that the missing copy in your otherwise complete collection of rare Beatles’ bootlegs happened to be in a store in the next unlikely little town.*
Such a device would enhance your life not just as some kind of super-enhanced Filofax remembering things for you, but more like a benign Familiar Spirit providing selected information of help and relevance and—perhaps most interesting in a fragmented society in which some find mediated conversation less anxiety-inducing than the real thing—access to company. Another beauty of such a device is that the more people use it, the better it will be.
Mind you, when it came to the nuts and bolts of computing, there were two schools of thought about Douglas’s competence: his and the rest. Later, when The Digital Village was up and running, he was to try the patience of some of his very techie colleagues. His intuition about computers was second to none, but he was a compulsive fiddler and easily distracted. His lateral-thinking mind, more like a picture gallery with many branches than a debating chamber, was not fortified against the repetitiveness and shocking literal-mindedness needed for programming. He would make an intuitive leap of the “Ah-ha, I see what it’s doing, the little bastard” variety and then be stymied because there is no substitute for following the manual with an undeviating and deeply tiresome attention to detail. He liked to play with new software until he understood not just its functionality but its architecture, and confessed to enjoying the new avenues of displacement activity that computers had opened up to authors. The story he often told against himself was that he would happily spend two days programming a macro in order to save himself ten seconds when he opened a document.
From his earliest affair with the computer back in the eighties, Douglas had been tempted by the Apple. He was convinced of the superiority of the Macintosh operating system over that of the sadly ubiquitous PC, and felt that he could doodle on a Mac creatively, even write music, in a way that was impossible, or at least very tricky, on a PC. He was forever trying to wean his friends away from the frustrations of IBM-compatible kit and could bang on about it for a considerable time. The language of computer allegiance is surprisingly theological. Passionate believers in the Apple are always “evangelical,” and people in the IT world have been known just to have the description “evangelist” on their business cards in the certain expectation of being understood. The undecided—Jim Lynn, one of The Digital Village’s C++ programmers, for instance—call themselves agnostic. Douglas himself had an email exchange about the relative virtues of the Apple versus the PC with the well-regarded computer editor on the Guardian, Jack Schofield, which had all the heat of Jesuits arguing heresy.
Douglas relished a good rant, and one of his most entertaining polemics concerned Microsoft Windows. The gist was that Windows is a host of different software services (word-processing, spreadsheet, connectivity and so on) that had all been designed by separate teams and then exported to a foreign country where some real clever-clogs had constructed an overarching bridge onto which all the functions would fit. That’s why the final assembly is so complicated, and so many useful options are hidden away in drop-down menus that are not always the obvious ones. But, he said, Apple had not started that way. Their departure point had been what the user actually wanted to do with the equipment.
Once Douglas met Bill Gates at a party given by Paul Allen (Gates and Allen being the two billionaire founders of Microsoft and of the IT revolution) and said to him: “You can’t run the world.” But they can—the software world at least.*
Douglas’s family was on the receiving end of his passion for Apple Macs, as part of his ever-present generosity. One Christmas they had gathered in Duncan Terrace—Little Jane, James, Sue, and Janet—for one of those blow-outs that leave you anchored to your seat for the rest of the day. But Douglas had a surprise for them. Each one was handed the end of a colour-coded string that they had to follow around the house, up the stairs, round corners, mischievously back again, until they came to the present at the end of it. And there, for each of them, was a brand new Apple iMac.
Ever since its famous Super Bowl ad in 1984 (directed by Ridley Scott), Apple had always positioned itself as the computer for those who dared to be different. The idea was that by all means you could run your budget on a PC, but you should write your symphony on a Mac. In the nineties, as part of this continuing campaign, Kanwal Sharma, Apple’s inspired marketing man, had devised and managed a “great minds” scheme whereby about one hundred high profile celebrities (Apple dislikes the word celebrity and prefers “visionaries”) let their names be published as Apple users. Douglas was delighted to join.
What the “macophiles” in this scheme had in common was that they were leaders in their fields who worked creatively with their computers and found that they were liberated by them rather than (as is often the case with PCs) driven to rage and intemperate effing and blinding. The celebrities did not get paid, but they did get some of the latest equipment and software. From time to time they were brought together at Apple’s expense for extraordinary conferences where they could talk about anything to each other, and indulge in what the management books call “out of the box” thinking. In return they were expected to sit down occasionally and talk with the Apple team. The generic title for the scheme was the Apple Masters. Apple got input from some of the brightest people around and reinforced their role as the creative person’s computer company. The celebrities were flattered to be on a list of the charismatically brainy and interesting, and they saved a lot of money on equipment and got to go to some stimulating meetings. Douglas adored being an Apple Master. There is no doubt, however, that he preferred their kit and would have used it regardless.
The Apple Masters were a wonderfully eclectic collection of people. Mountaineers (Sir Chris Bonington) could rub shoulders with actors like James Woods or Jennifer Jason Leigh, Richard Dawkins could chat with Richard Dreyfuss, Peter Cochrane (the futurologist employed by BT) could swap ideas with Nobel Prize-winners Murray Gell-Mann or Donald Glaser (the physicist who invented the bubble chamber). Douglas was thrilled—where else could he encounter Harrison Ford or Muhammad Ali? It’s hard to imagine other circumstances in which a British comic SF writer would meet Sinbad, the African American stand-up comedian, and discover they had a rapport.
Douglas thought highly of Kanwal Sharma and they struck up a friendship. In Kanwal’s opinion, Douglas was one of the most creative people he had ever come across, able to juggle many multidisciplinary subjects with enthusiasm and knowledge. He’d toy with ideas, said Kanwal, in a way that was quite childlike: “What would happen if I did this?” Kanwal recalls that Douglas had extraordinary presentational skills, and was brilliant with the Silicon Valley crowd. He remembers him on spectacular form in 1998 giving an address at a conference in San Jose about convergent technologies and the next generation of PDAs. Stop thinking in terms of tweaking existing gadgets, urged Douglas. Think instead of your P.E.T.—your Personal Electronic Thing—and what you would like it to do for you. Why would you want to use an ancient keyboard design to talk to it? Why not just talk? The audience of software engineers and technologists was delighted.
Though their relationship was largely telephonic, Douglas and Kanwal were close. Douglas’s phone bill—especially when he should have been chipping away at the word face of his current book—must have been considerable, for he had a circle of pals whom he would call frequently, sometimes every day, especially when he was in California for that last frustrating pass at the Hitchhiker’s movie. The phone was how he maintained his many friendships. If they were both feeling low, Kanwal recalls, they would cheer themselves up by competing over just how miserable they were. Once Douglas called Kanwal in Hong Kong. After they compared notes about angst and jet lag, Douglas asked Kanwal how he had enjoyed life in seat 3A. Through his contacts, Douglas had wangled Kanwal an upgrade and continued to do so whenever he could.