Modern Mind: An Intellectual History of the 20th Century
Peter Watson
Rodney Hilton, professor of history at Birmingham University, was like the others a member of the British Communist Party until the events in Hungary in 1956. His main interest was in the precursors of the working class – the peasants – and besides his own books on the subject, he was instrumental in the founding of two journals in the 1960s, the Journal of Peasant Studies in Britain and Peasant Studies in the United States.
37
Hilton’s aim was to show that peasants were not a passive class in Britain in the Middle Ages; they did not just accept their status but were continually trying to improve it. There was, Hilton argued, constant struggle, as the peasants tried to gain more land for themselves or have their rents reduced or abolished.
38
This was no ‘golden time,’ to use Harvey Kaye’s words, in his survey of the British group, when everyone was in his place and satisfied with it; instead there was always a form of peasant ‘class-consciousness’ that contributed to the eventual decline of the feudal-seigneurial regime in England.
39
This was a form of social evolution, Hilton’s point being that this struggle gave rise to agrarian capitalism, out of which industrial capitalism would emerge.
40

The next stage in the evolution was examined by Christopher Hill, Fellow and Tutor of Balliol from 1938, who devoted himself to the study of the English revolution. His argument was that just as the peasants had struggled to obtain greater power in mediaeval times, so the English revolution, traditionally presented as a constitutional, religious, and political revolution, was in fact the culmination of a class struggle in which capitalist merchants and farmers sought to seize power from the feudal aristocracy and monarchy. In other words, the motivation for the revolution was primarily economic.
41
He put it this way: ‘The English revolution of 1640–60 was a great social movement like the French Revolution of 1789. The state power protecting an old order that was essentially feudal was violently overthrown, power passed into the hands of a new class [the bourgeoisie], and so the freer development of capitalism was made possible…. Furthermore, the Civil War was a class war, in which the despotism of Charles I was defended by the reactionary forces of the established Church and conservative landlords. Parliament beat the king because it could appeal to the enthusiastic support of the trading and industrial classes in town and countryside, to the yeomen and progressive gentry, and to wider masses of the population whenever they were able by free discussion to understand what the struggle was really about.’
42
He added that the revolution also took some of its colour from recent developments in science and technology, very practical concerns that could in time be converted into new commercial outlets.

Like Hilton and Hill, E. P. Thompson also left the British Communist Party in 1956. Like them he remained convinced that English history was determined mainly by class struggle. In a long book, The Making of the English Working Class, one of his aims was to ‘rescue’ the working classes from ‘the enormous condescension of posterity’ and render visible such neglected people as weavers and artisans. In the process, he redefined the working classes as essentially a matter of experience. It was the experience – between 1790 and 1830 – of a declining and weakening position in the world. This, he said, was the essence of the industrial revolution for the working class in England – the loss of common rights by the landless, the increasing poverty of many trades brought about by the deliberate manipulation of employment to make it more precarious.
43
Part of the attraction in Thompson’s book lies in the fact that it is so vividly written and humane, but it was original in a social Darwinian sense too. Before 1790 the English working classes existed in many disparate forms; the experience of oppression and the progressive loss of rights, far from resulting in their extinction, proved to be a major unifying (and therefore strengthening) force.
44

The final element in this ‘Great Leap Forward’ of historical studies came in 1973 from the archaeologist Colin Renfrew, in Britain. Like the Annales school and the Marxists, and like archaeologists everywhere, he had an interest in la longue durée. But, again like the French and British historians, his main concern was less with dating as such than with a new understanding of history. Renfrew, then professor at Southampton University and now at Cambridge, entitled his book Before Civilisation: The Radiocarbon Revolution and Prehistoric Europe.
45
The title sold the book short, however, for Renfrew gave an admirably clear, albeit brief, history of the development of dating in archaeology in the twentieth century and of how – and this was his main point – it changed our grasp of the past, not just in terms of chronology but in the way we conceive of man’s early development.

He began with a crisp statement of the problem, so far as archaeology was concerned. Various early-twentieth-century studies by geologists in Switzerland and Sweden had confirmed that the last ice age endured for 600,000 years and ended 10,000 years ago. The problem with ancient human history therefore stemmed from the fact that the written records stretched back no further than about 3000 BC. What had happened between the end of the ice age and the birth of writing? Renfrew’s main aim was to take a look at archaeology in the wake of the various new dating mechanisms – dendrochronology, radiocarbon dating, and potassium-argon dating. Radiocarbon dating was first devised by Willard F. Libby in Chicago in 1949 (he won the Nobel Prize for Chemistry in 1960 for his innovation), but his insight was added to significantly by the American Journal of Science, which in 1959 started an annual radiocarbon supplement; this quickly became an independent publication, Radiocarbon, and provided an easily accessible forum for all the revised dates that were then being obtained from across the world. It was perhaps the biggest intrusion of science into a subject that had hitherto been regarded as an art or a humanity.
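The principle underlying the technique, which the narrative takes for granted, can be sketched in a line of algebra (an illustrative aside, not part of Watson’s text): living matter maintains a roughly constant proportion of the radioactive isotope carbon-14, which decays at a known rate once the organism dies, so the fraction surviving in a sample fixes the sample’s age.

% Illustrative sketch of the radiocarbon age relation, assuming the standard
% half-life of carbon-14 (about 5,730 years); the symbols are conventional,
% not Watson's. N_0 = carbon-14 content at death; N = content measured today;
% t = age of the sample.
\[
  N = N_0\, e^{-\lambda t},
  \qquad
  \lambda = \frac{\ln 2}{t_{1/2}},
  \qquad
  t = \frac{t_{1/2}}{\ln 2}\,\ln\frac{N_0}{N}.
\]

On this reckoning a sample retaining half its original carbon-14 is about 5,730 years old, and one retaining a quarter about 11,460 years old, which is why the method comfortably spans the gap between the end of the ice age and the first written records.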

Before Civilisation had two core arguments. First, it revised the timing of the way the earth was populated. For example, from about 1960 on it was known that Australia had been occupied by man as early as 4000 BC, and maybe even as early as 17000 BC. Maize, it was established, was gathered systematically in Mexico by 5000 BC, and well before 3000 BC it was showing signs of domestication. The real significance of these dates was not just that they were earlier than anyone had thought hitherto, but that they killed off the vague theories then current that Meso-America had only developed civilisation after it had been imported, in some indefinable way, from Europe. The Americas had been cut off from the rest of the world since 12000–13000 BC, in effect the last ice age, and had developed all the hallmarks of civilisation – farming, building, metallurgy, religion – entirely separately.
46

This revision of chronology, and what it meant, was the second element in Renfrew’s book, and here he concentrated on the area he knew best, Europe and the classical world of the Middle East. In the traditional view, the civilisations of the Middle East – Sumer and Egypt, for example – were the mother civilisations, the first great collective achievements of mankind, giving rise to the Minoans on Crete, and the classical world of the Aegean: Athens, Mycenae, Troy. From there, civilisation had spread farther north, to the Balkans and then Germany and Britain, and west to Italy and then France, and the Iberian peninsula. But after the C14 revolution, there was suddenly a serious problem with this model.
47
On the new dating, the huge megalithic sites of the Atlantic seaboard, in Spain and Portugal, in Brittany and Britain, and in Denmark, were either contemporaneous with the civilisations of the Aegean or actually preceded them. This wasn’t just a question of an isolated date here and there but of many hundreds of revised datings, consistent with each other, and which in some cases put the Atlantic megaliths up to a thousand years earlier than Aegean cultures. The traditional model, for Egypt, the Middle East, and the Aegean, still held. But there was, as Renfrew put it, a sort of archaeological ‘fault line’ around the Aegean. Beyond that, a new model was needed.

The model he came up with started from a rejection of the old idea of ‘diffusion’ – that there had been an area of mother civilisations in the Middle East from which ideas of farming, metallurgy, and, say, the domestication of plants and animals had started, and then spread to all other areas as people migrated. It seemed clear to Renfrew that up and down the Atlantic coasts of Europe, there had developed a series of chiefdoms, a level of social organisation midway between hunter-gatherers and full-blown civilisation as represented in Egypt, Sumer, or Crete, which had kings, elaborate palaces, a highly stratified society. The sovereign areas of the chiefdoms were smaller (six on the Isle of Arran in Scotland, for example), and they were centred around large tombs and occasionally religious/astronomical sites, such as Stonehenge.
48
Associated with these chiefdoms were a rudimentary social stratification and early trade. Sufficient numbers were needed to build the impressive stone works, funerary religious monuments around which the clans cohered. The megaliths were always found associated with arable land, suggesting that chiefdoms were a natural stage in the evolution of society: when man settled with the first domesticated crops, chiefdoms and megaliths soon followed.
49

Renfrew’s analysis, now generally accepted, concentrated on sites in Britain, Spain, and the Balkans, which illustrated his argument. But it was his general thrust that counted: although early man had no doubt spread out to populate the globe from an initial point (maybe East Africa), civilisation, culture – call it what you will – had not developed in one place and then spread in the same way; civilisations had grown up in different times at different places of their own accord.
50
This had two important long-term intellectual consequences, quite apart from killing off any idea (which still lingered then) that American civilisations had been seeded by European ones (such as the ‘lost tribe’ of Israel). First, it demolished the idea that, culturally speaking, the history of man is one story; all cultures across the world were sui generis and did not owe their being to a mother culture, the ancestor of all. Combined with the findings of anthropologists, this made all cultures equally potent and equally original, and therefore the ‘classical’ world was no longer the ultimate source.

At a deeper level, as Renfrew specifically pointed out, the discoveries of the new archaeology showed the dangers of succumbing too easily to Darwinian thinking.
51
The old diffusionist theory was a form of evolution, but a form of evolution so general as to be almost meaningless. It suggested that civilisations developed in an unbroken, single sequence. The new C14 and tree-ring evidence showed that that simply wasn’t true. The new view wasn’t any less ‘evolutionary,’ but it was very different. It was, above all, a cautionary tale.

32. HEAVEN AND EARTH
 

The words historic moment have been heavily overused in this century. But if any moment outside war can be described as truly historic, it surely occurred at twenty seconds after 3:56 AM BST on Monday, 21 July 1969, when Neil Armstrong stepped off the ladder that led from the ‘Eagle,’ the landing module of the Apollo 11 spacecraft, and on to the surface of the Moon, making him the first person to arrive on a celestial body outside Earth. As he did so, he spoke the words that have since become famous: ‘That’s one small step for man – one giant leap for mankind.’
1

For the benefit of scientists back at Mission Control in Houston, he then went on in a more down-to-earth, scientifically informative way: ‘The surface is fine and powdery, I can … I can pick it up loosely with my toe. It does adhere in fine layers like powdered charcoal to the sole and sides of my boots. I can only go in a small fraction of an inch. Maybe an eighth of an inch, but I can see the footprints of my boots and the treads in the fine sandy particles…. There seems to be no difficulty in moving around, as we suspected…. We’re essentially on a level place here – very level place here.’
2
If the greatest intellectual achievement of the first half of the twentieth century was undeniable – the conception and construction of the atomic bomb – the achievements of the second half were more diverse, including the isolation and understanding of DNA, and the computer. But space travel and the moon landing certainly count among the greatest achievements of the century.

After the Russians had sprung their surprise in 1957, stealing a march on the United States with the launch of Sputnik 1, they had built on their lead, putting the first animal in space, the first man (Yuri Gagarin, in 1961), and the first woman (Valentina Tereshkova, in 1963). The United States responded with something close to panic. President Kennedy called a frenzied meeting at the White House four days after the Bay of Pigs disaster (when 1,500 Cuban exiles, trained in America by the U.S. military, invaded the island, only to be killed or captured). Railing that ‘something must be done,’ he had shouted at Lyndon Johnson, the vice president, ordering him to find out if ‘we have a chance of beating the Soviets by putting a laboratory in space, or by a trip round the Moon, or by a rocket to land on the Moon, or by a rocket to go to the Moon and back with a man?’
3
The Americans finally put John Glenn into orbit on 20 February 1962 (Alan Shepard had made a fifteen-minute nonorbital flight in May 1961). But then the Americans began to catch up, thanks to Kennedy’s commitment to the Apollo program with its aim to land a manned spacecraft on the Moon ‘before this decade is out.’
4
The Apollo project itself was begun in 1963 (though NASA had been created in 1958), and over the next ten years America spent up to $5 billion a year on space. That sum gives an idea of the size of the project, which involved, among other things, building a reliable spaceship bigger than a railway engine, designing and manufacturing a rocket heavier than a destroyer, and inventing several new materials.
5
The project benefited from the brains of 400,000 people from 150 universities and 20,000 firms. We already know from Korolev that rocket technology lay at the heart of the space program, and the biggest U.S. rocket, Saturn 5, weighed 2,700 tons, roughly the same as 350 London buses. Developed under Wernher von Braun, another German émigré, Saturn was 364 feet high, had 2 million working parts, 2.5 million solder joints, and 41 separate engines for guidance purposes, and carried in all 11,400,000 gallons of fuel – liquid nitrogen, oxygen, hydrogen, and helium, some of it stored at minus 221 degrees centigrade to keep it liquid.
6
The oxygen alone filled the equivalent of fifty-four railway container tanks.
7
The Moonship contained the cone-shaped command module, the only part to come back to Earth, which therefore needed to withstand the enormous temperatures of re-entry into the atmosphere (caused by friction at such high speeds).
8
One of the main engineering problems was to keep the cryogenic fuels cool enough. The tanks eventually designed were so airtight that, were ice cubes to have been installed inside, they would not have melted for nine years. The exit hatch of the module needed 150 new tools to be invented. Some bolts had to be locked in place by two men using a five-foot wrench.
9

No one really knew how conditions in space would affect the men.
10
Great care was taken therefore with psychological selection and training. The astronauts were taught to be tolerant and careful (always avoiding acute angles where they might snag their suits), and they were given massages every day. The crews that advanced were those that had worked together in harmony for more than a year. Interestingly enough, over the years both the Americans and the Russians came up with a near-identical profile of the ideal astronaut: they should not be too old, no older than their late thirties, or too tall, no taller than five-eleven or six feet; they should be qualified jet and test pilots with degrees in engineering.
11
Finally, there was the reconnaissance of the Moon itself. Quite apart from the prospect in time of colonising space and its minerals, there were sound scientific reasons for studying the Moon close up. Since it lacked an atmosphere, the Moon was in some senses in pristine condition, ‘a priceless antique,’ as one scientist called it, in much the same condition as it had been when the universe, or the solar system at least, had first evolved. Examination of the rocks would also help decide how the Moon formed – whether it was once part of Earth, or broke off with Earth from the sun after collision with an asteroid, or was formed by very hot gas cooling.
12
Both American and Soviet space probes got progressively closer to the Moon, sending back better and better photographs until objects as small as five feet wide could be distinguished. Five areas were originally chosen for landing, then narrowed to one, the Sea of Tranquillity, actually a flat plain free of craters.
13

The biggest disaster of the American program took place in 1967, when a spaceship caught fire on the launchpad at Cape Kennedy after liquid oxygen somehow ignited, killing all three men inside. The world never knew how many Russian astronauts perished because of the greater secrecy surrounding their program, but distress messages picked up by radio hams around the globe suggested that at least eight were in trouble between 1962 and 1967.
14
The greatest drama before the moon landing itself was the December 1968 flight of Apollo 8 around the Moon, which involved going behind the Moon to the dark side, which no one had ever seen, and meant that the crew would be out of radio contact with Mission Control for about half an hour. If the ‘burn’ of the engines was too strong, the spacecraft might veer off into deep space; if it was too weak, it might crash into the Moon, on the far side, never to be heard from again.
15
The pope sent a message of goodwill, as did a number of Russian space scientists, acknowledging implicitly at this point that the Americans were now decisively ahead.

At 9:59 AM on Christmas Eve, Apollo 8 swung behind the Moon. Mission Control in Houston, and the rest of the world, waited. Ten minutes of silence passed; twenty; thirty. At 10:39 AM Frank Borman’s voice could be heard reporting data from his instruments. Apollo 8 was exactly on schedule and, as Peter Fairley narrates the episode in his history of the Apollo Project, after a journey of a quarter of a million miles it had arrived on its trajectory within half a mile of the one planned.
16

The scene was set for Apollo 11. Edward ‘Buzz’ Aldrin Jr. joined Neil Armstrong on the surface of the Moon, where they placed a plaque and a flag, planted some seeds, and collected rock samples with specially designed tools that saved them from having to bend. Then it was back into the ‘Lunar Bug,’ rendezvous with Michael Collins in the command module, and the return journey, splashing down near Johnston Island in the Pacific, where they were met by the USS Hornet with President Richard Nixon on board. The men had returned safely to Earth, and the space age had begun.
17

The landing on the Moon was, however, in some ways a climax rather than a debut. Crewed flights to the Moon continued until 1972, but then stopped. As the 1970s wore on, space probes went deeper into the heavens – to Venus, Mars, Mercury, Jupiter, the Sun, and Saturn – with Pioneer 10, launched in 1972, becoming the first manmade object to leave the solar system, which it did in 1983. Actual landings were considered less necessary after the first flush of excitement, and both the Americans and Russians concentrated on longer flights in orbit, to enable scientists to carry out experiments in space: in 1973 astronauts aboard the United States’ Skylab spent eighty-four days in orbit. The first stage of the space age may be said to have matured around 1980. In that year, Intelsat 5 was launched, capable of relaying thousands of telephone calls and two television channels. And in the following year the Columbia, the first reusable space shuttle, was launched. In just ten years space travel had gone from being exotic to being almost mundane.

*

The space race naturally stimulated interest in the heavens in general, a happy coincidence, as the 1960s had in any case seen some very important advances in our understanding of the universe, even without the advantages conferred by satellite technology. In the first half of the century, apart from the development of the atomic bomb and relativity, the main achievement of physics was its unification with chemistry (as epitomised in the work of Linus Pauling). After the war, the discovery of yet more fundamental particles, especially quarks, brought about an equivalent unification, between physics and astronomy. The result of this consilience, as it would be called, was a much more complete explanation of how the heavens – the universe – began and evolved. It was, for those who do not find the reference blasphemous, an alternative Genesis.

Quarks, as we have seen, were originally proposed by Murray Gell-Mann and George Zweig, almost simultaneously, in 1964. It is important to grasp that quarks do not exist in isolation in nature (at least on Earth), but the significance of the quark (and of certain other particles isolated in the 1960s and 1970s, which we need not describe here) is that it helps explain conditions in the early moments of the universe, just after the Big Bang. The idea that the universe began at a finite moment in the past had been accepted by most physicists, and many others, since Hubble’s discovery of the red shift in 1929, but the 1960s saw renewed interest in the topic, partly as a result of Gell-Mann’s theories about the quark but also because of an accidental discovery made at the Bell Telephone Laboratories in New Jersey in 1965.

Since 1964, the Bell Labs had been in possession of a new kind of telescope. An antenna located on Crawford Hill at Holmdel communicated with the skies via the Echo satellite. This meant the telescope was able to ‘see’ into space without the distorting interference of the atmosphere, and that far more of the skies were accessible. As their first experiment, the scientists in charge of the telescope, Arno Penzias and Robert Wilson, decided to study the radio waves being emitted from our own galaxy. This was essentially baseline research, the idea being that once they knew what pattern of radio waves we were emitting, it would be easier to study similar waves coming from elsewhere. Except that it wasn’t that simple. Wherever they looked in the sky, Penzias and Wilson found a persistent source of interference – like static. At first they had thought there was something wrong with their instruments. A pair of pigeons were nesting in the antenna, with the predictable result that there were droppings everywhere. The birds were captured and sent to another part of the Bell complex. They came back. This time, according to Steven Weinberg’s account published later, they were dealt with ‘by more decisive means.’
18
With the antenna cleaned up, the ‘static’ was reduced, but only minimally, and it still appeared from all directions. Penzias discussed this mystery with another radio astronomer at MIT, Bernard Burke. Burke recalled that a colleague of his, Ken Turner of the Carnegie Institution, had mentioned a talk he had heard at Johns Hopkins University in Baltimore, given by a young theorist from Princeton, P. J. E. Peebles, which might bear on the ‘static’ mystery. Peebles’s speciality was the early universe. This was a relatively new discipline and still very speculative. As we saw in chapter 29, in the 1940s an émigré from Ukraine, George Gamow, had begun to think about applying the new particle physics to the conditions that must have existed at the time of the Big Bang. He started with ‘primordial hydrogen,’ which, he said, would have been partly converted into helium, though the amount produced would have depended on the temperature of the Big Bang. He also said that the hot radiation corresponding to the enormous fireball would have thinned out and cooled as the universe expanded. He went on to argue that this radiation ‘should still exist, in a highly “red-shifted” form, as radio waves.’
19
This idea of ‘relict radiation’ was taken up by others, some of whom calculated that such radiation should now have a temperature of 5 K (i.e., 5 degrees above absolute zero). Curiously, with physics and astronomy only just beginning to come together, no physicist appeared to be aware that even then radio astronomy was far enough ahead to answer that question. So the experiment was never done. And when radio astronomers at Princeton, under Robert Dicke, began examining the skies for radiation, they never looked for the coolest kinds, not being aware of their significance. It was a classic case of the right hand not knowing what the left was doing. When Peebles, a Canadian from Winnipeg, started his Ph.D. at Princeton in the late 1950s, he worked under Robert Dicke. Gamow’s theories had been forgotten but, more to the point, Dicke himself seems to have forgotten his own earlier work.
20
The result was that Peebles unknowingly repeated all the experiments and theorising of those who had gone before. He arrived at the same conclusion, that the universe should now be filled with ‘a sea of background radiation’ with a temperature of only a few K. Dicke, who either still failed to remember his earlier experiments or didn’t realise their significance, liked Peebles’s reasoning enough to suggest that they build a small radio telescope to look for the background radiation.
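Gamow’s argument, as recounted above, rests on a simple scaling that is worth making explicit (an illustrative aside, with modern round numbers rather than anything from Watson’s text): as the universe expands, the wavelength of the primordial radiation is stretched in proportion, so its effective temperature falls by the same factor.

% Illustrative sketch of the relict-radiation argument. a(t) is the scale
% factor of the expanding universe and z the redshift since the radiation
% was emitted; the symbols are conventional, not Watson's.
\[
  T \;\propto\; \frac{1}{a(t)},
  \qquad
  T_{\text{today}} \;=\; \frac{T_{\text{emission}}}{1+z}.
\]

Radiation released when the universe first became transparent, at a temperature of a few thousand degrees, and red-shifted since by a factor of roughly a thousand, should therefore survive today at only a few degrees above absolute zero, which is the figure Gamow’s successors calculated and, as it turned out, the temperature of the ‘static’ that Penzias and Wilson could not get rid of.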
