
Russians and Americans differed even more sharply over individualistic political rights versus collective socioeconomic freedoms. When Washington accused Moscow of violating personal political rights in its treatment of dissidents, the Kremlin gleefully retaliated by accusing Washington of violating the social and economic rights of the poor in general and blacks in particular. The Carter Administration sought to deflect such ripostes by emphasizing that human rights encompassed economic and social rights as well as political and civil liberties. “We recognize that people have economic as well as political rights,” Secretary of State Vance said in 1978.

Still the debate continued, and rose to new heights during the Reagan Administration as conservative ideologues found official rostrums from which to belabor Soviet repression, while Soviet propagandists found ample material for exploitation in stories in American journals and newspapers about the poor and the homeless.

At the dawn of the last decade of the second millennium A.D., as Westerners prepared to celebrate the bicentennials of the French Declaration of the Rights of Man and the American Bill of Rights, human rights as a code of international and internal behavior—especially as embodied in the UN declaration of 1948—were in practical and philosophical disarray. Rival states used the Universal Declaration to wage forensic wars with one another over the fundamental meaning of freedom. It had proved impossible for national leaders to agree on priorities and linkages among competing rights, most notably between economic-social and civil-political.

And yet the world Declaration of Rights still stood as a guide to right conduct and a symbol of global aspiration. In both domestic and international politics it was invoked, on occasion, with good effect. As cast into international instruments, human rights law, David Forsythe concluded, “is an important factor in the mobilization of concerned individuals and groups who desire more freedom, or more socio-economic justice, or both. This mobilization has occurred everywhere, even in totalitarian and authoritarian societies.” And the conflict over the meaning and application of international human rights invited the tribute of hypocrisy. “The clearest evidence of the stability of our values over time,” writes Michael Walzer, “is the unchanging character of the lies soldiers and statesmen tell. They lie in order to justify themselves, and so they describe for us the lineaments of justice. Wherever we find hypocrisy, we also find moral knowledge.” Thus the idea of freedom and justice and human rights binds the virtuous and the less virtuous together, in hypocrisy and in hope.

CHAPTER 13
The Culture of the Workshop

Out of the prodigious experimental workshops of America—the “enormous laboratories,” the solitary one-person think tanks, the suspense-ridden observatories, the bustling engineering departments—erupted a fecundity of ideas, discoveries, and inventions during the 1960s and 1970s. Measured by Nobel Prizes, United States scientists still dominated the life sciences and physics and, to a lesser degree, chemistry. In physiology or medicine, Americans during these two decades won Nobels for discoveries about hearing, cholesterol metabolism, hormonal treatment of prostate cancer, color perception, enzymes and the growth and functioning of cells, the relation between cancer-causing viruses and genes, the origin and spread of infectious diseases, the action of hormones, the effect of restriction enzymes on genes, the CAT-scan X-ray procedure, and cell immunology. In physiology or medicine, Americans, sometimes in partnership with British and continental scientists, won Nobels for seven years straight starting in 1974.

In physics, during these two decades Americans were awarded Nobel Prizes for work on the complex structure of protons and nucleons, the symmetry principle governing the interaction of nuclear particles, masers, quantum electrodynamics, the nuclear reactions involved in the production of energy in stars, superconductivity, the subatomic J/psi particle, solid-state electronics, background cosmic radiation supporting the “big bang” theory, the symmetry of subatomic particles. In chemistry, Americans won Nobel Prizes for work on the chemical reactions in photosynthesis, the synthesis of organic structures, the theory of chemical bonds holding atoms together in molecules, the reciprocal processes in interactions such as those of voltage and temperature in heat transfer, the molecular analyses of proteins and enzymes, macromolecules, the methods of diagramming the structure and function of DNA.

The Nobel committee does not grant prizes in astronomy or astrophysics, but scientists in the nation’s observatories hardly needed kudos from Stockholm for incentives. Knowledge of outer space expanded immensely as more and more powerful radio telescopes were built. In 1960, quasars (“quasi-stellar radio sources”) were detected at Palomar Observatory in southern California. Within a few years, astronomers were discovering that the fastest and most distant of these faint blue celestial bodies, quasar 3C-9, was speeding away from the Milky Way at almost the speed of light. In 1975, University of California observers found a new galaxy at least ten times larger than the Milky Way and around eight billion light-years away from Earth. “Black holes” were probed, including a possible one at the center of the Milky Way. During the 1960s and 1970s, the space program was bringing breathtaking information about the planets and other nearby precincts. But nothing stirred earthlings’ imaginations as much as the radio waves coming in from galaxies quadrillions of miles away.
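What “almost the speed of light” meant can be made concrete with a back-of-envelope calculation, assuming the redshift of about z = 2 reported for 3C-9 in the mid-1960s. Under the relativistic Doppler interpretation then in use, the recession velocity follows from

\[
\frac{v}{c} = \frac{(1+z)^{2} - 1}{(1+z)^{2} + 1},
\qquad
z \approx 2 \;\Longrightarrow\; \frac{v}{c} \approx \frac{9-1}{9+1} = 0.8,
\]

that is, roughly eighty percent of light speed.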

The dependence of astronomy on radio telescopes and space programs indicated the vital role of technology in scientific work. In atomic research, the 1930 cyclotron that accelerated particles to energies of a few million electron volts was followed thirty years later by larger atom smashers at Brookhaven National Laboratory in New York and by Stanford University’s two-mile, 20-billion-electron-volt linear accelerator. Oceanography required large vessels with highly specialized equipment. The development of the atomic and hydrogen bombs had taken huge facilities and vast sums of money. The dependence of science on technology was not new in America—only its magnitude. Even while Josiah Gibbs had been working solo with meager facilities at Yale, Thomas Edison had been establishing an elaborately equipped industrial research laboratory. While Edison had enjoyed calling it his “invention factory,” the German economist Werner Sombart glimpsed the new world ahead when he noted that, in the United States, Edison had made a “business” of invention. The increasing dependence of scientific research on big technology raised a number of political and intellectual problems, such as the controls implied in both corporate and governmental subsidies, the excessive influence of market forces on scientific research, and the discouragement of “free” scientific inquiry.

Volumes—indeed, libraries of volumes—were written about scientific discoveries and technological breakthroughs of Nobel-class stature. Was there a pattern in these developments? The growth and impact of semiconductor electronics illustrated, perhaps more dramatically than most inventions, the nature of the origin and application of scientific ideas in the twentieth century. The intellectual roots of the electronics industry went back to earlier scientific geniuses, in this case Faraday and Maxwell; the next steps were the product of fundamental, multidisciplinary academic research, which both contributed to and was greatly stimulated by the two world wars, especially the second; further development as usual turned on 90 percent perspiration, 10 percent inspiration, on the part of a large number of persons. Soon semiconductor research, like that producing other great innovations, moved far beyond the capacities of the kind of Yankee tinkerer that had pioneered American technology in earlier decades. There was no way, Ernest Braun and Stuart Macdonald wrote, in which semiconductor devices could be “tinkered with at home; no way by which a skilled craftsman could improve their performance. The mode of operation of these devices is so complex and intricate, the scale so small, the interactions so subtle, that all old-fashioned inventiveness proved of no avail.”

The semiconductor breakthroughs fell upon American industry in the form of computers capable of storing and processing fantastic quantities of data and of large-scale automation, on American consumers in glittering arrays of electronic calculators, digital watches, portable tape recorders, heart pacemakers, hearing aids. An estimated 190 million integrated circuits were installed in 1980-model American cars to monitor engine efficiency and produce the increasingly elaborate dashboard displays. Entering the inner world of miniaturization was as extraordinary as exploring the universe of space. Americans developed microchips the size of a postage stamp with half a million or so transistors and associated circuitry mounted on each. These chips liberated millions of Americans from old-time chores like “figurin’ ” with pencil and paper; they also made bugging devices more efficient and easily concealed. Americans were moving so fast through the electronics age that the enlargement and diminution of personal freedom could hardly be calculated.
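A rough arithmetic sketch gives that miniaturization a scale. Assuming, purely for illustration, a postage-stamp chip area of about five square centimeters (a figure not drawn from the text), half a million transistors works out to

\[
\frac{5 \times 10^{5}\ \text{transistors}}{5\ \text{cm}^{2}} = 10^{5}\ \text{transistors per cm}^{2},
\qquad
\sqrt{10^{-5}\ \text{cm}^{2}} \approx 32\ \mu\text{m},
\]

so each transistor and its wiring occupied a patch only a few hundredths of a millimeter on a side—far below anything a home tinkerer could see, let alone adjust.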

The Dicing Game of Science

Just as the rise of American capitalism had made heroes of captains of industry like Carnegie and Ford, and two world wars had created paladins like Pershing and Eisenhower, so the outpouring from the prodigious laboratories of the 1950s and 1960s produced heroes of science. Jonas E. Salk of the University of Pittsburgh Medical School, using three strains of inactivated poliovirus, developed a vaccine that was injected into almost two million schoolchildren, with gratifying results. The frightening rise of polio cases, from three or four thousand a year in the late 1930s to around 60,000 by 1952, was reversed, returning the disease to its levels of the 1930s. This benign curve was completed by Albert Sabin, whose widely used oral vaccine brought polio cases reported in the United States down to twenty-one in 1971. Linus Pauling, after receiving the Nobel Prize in chemistry for his research into the forces holding proteins and molecules together, became the first man to win a second unshared Nobel, the peace prize of 1962, for his leadership in the campaign against nuclear testing. James Watson achieved fame not only for his role in the discovery of the structure of DNA—suddenly “we knew that a new world had been opened”—but for writing about it in a book, The Double Helix, that revealed the amazing mixture of boldness and caution, planning and chance, cooperation and rank competitiveness, intensive analysis and experimental derring-do that defined the world of the top scientists.

Yet none of these scientific celebrities could compare—in scientific eminence, intellectual genius, influence on scientific thought, and impact on public attitudes toward scientists—with another American, Albert Einstein. Having fled from Nazism to the United States in 1932 and become an American citizen in 1940, the German-born scientist was proudly accepted by his adopted fellow countrymen as one of their own.

By the time of Einstein’s death in 1955, aspiring young scientists knew much about his life, a kind of Horatio Alger saga, European style—how he grew up in a middle-class Jewish family in heavily Catholic Munich, learned to speak so late that his parents feared he was subnormal, received mediocre schooling, moved to Milan with his family when his father’s business failed (for the second time), then struck out on his own for Zurich in order to attend the highly regarded Swiss Federal Polytechnic School. Frustrated young scientists drew solace from reading about young Einstein’s ups and downs: a cheeky scholar who was rebuffed by his teachers, an iconoclast who rebutted conventional scientific wisdom, he spent years in hand-to-mouth employment until he got a low-level job in the Swiss Patent Office. While there he conducted intensive study on his own time and often on the government’s.

Suddenly, from the mind of this twenty-six-year-old bureaucrat, who had neither classroom nor laboratory, neither students nor apparatus, came in 1905 a series of sensational papers. Two of these—“On the Motion of Small Particles Suspended in a Stationary Liquid According to the Molecular Kinetic Theory of Heat” and “On a Heuristic Viewpoint Concerning the Production and Transformation of Light”—exemplified not only Einstein’s scientific versatility but his genius in both methodology and conceptualization. The third paper, presenting the special theory of relativity—on the mass-energy equivalence, E = mc²—produced the most fundamental change in popular scientific thought about space and time in the three centuries since Newton.
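To give the mass-energy relation a concrete scale, consider an illustrative figure (the mass here is chosen for convenience, not taken from the text): one gram of matter converted entirely into energy yields

\[
E = mc^{2} = (10^{-3}\ \text{kg}) \times (3.0 \times 10^{8}\ \text{m/s})^{2} = 9 \times 10^{13}\ \text{J},
\]

on the order of the energy released by a twenty-kiloton fission bomb.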

The young scientist who had been denied decent teaching posts only a few years before now won a series of professorships at Zurich, at Prague, and at the newly established Kaiser Wilhelm Institute for Physics at the prestigious University of Berlin. Deeply absorbed in new concepts of relativity, Einstein in 1916 published his general theory, which challenged conventional Newtonian views of gravity. When a British solar eclipse expedition in 1919 confirmed, through photographs, the prediction that the general theory had made of the extent of starlight’s gravitational deflection at the edge of the sun, the forty-year-old scientist in Berlin “awoke to find himself famous.” The Times of London, in a leading article, opined that nearly all that had “been accepted as the axiomatic basis of physical thought” must now be swept away.
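The quantity the eclipse photographs tested can be stated compactly: general relativity predicts that a ray of starlight grazing the sun’s edge is bent through

\[
\theta = \frac{4 G M_{\odot}}{c^{2} R_{\odot}} \approx 1.75\ \text{arcseconds},
\]

twice the deflection obtained by treating light as falling Newtonian corpuscles, and it was this doubled value that the 1919 plates confirmed.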

Einstein’s rise to world fame coincided in 1918-19 with the collapse of the German monarchy in defeat, starvation, and revolution. The solitary scientist who had once shunned politics had joined an antiwar political party that was banned in 1916; now he signed a petition asking the heads of state about to meet in Versailles to “make a peace that does not conceal a future war,” and later worked for world intellectual cooperation under the League of Nations and enlisted in the Zionist cause. Though lampooned as a “guileless child” for supporting pacifism and internationalism, he was politically astute enough to sense the menace of Nazism and make his escape from Germany just before Hitler took power.

At the Institute for Advanced Study in Princeton, Einstein continued his search for a unified field theory, running into formidable mathematical and conceptual problems. But he did not shun political affairs in his adopted country. And in August 1939, as Europe was about to be engulfed in war once again, Einstein signed the famous letter, prepared by fellow refugee-scientist Leo Szilard, warning Roosevelt that research performed in “the course of the last four months” had made it appear possible in the near future “to set up nuclear chain reactions in a large mass of uranium, by which vast amounts of power and large quantities of new radium-like elements would be generated,” and that this could lead to the construction of bombs so powerful that a single one could destroy a port and the city around it.
