Absolute Zero and the Conquest of Cold

Days after the Japanese attacked Pearl Harbor, in December 1941, meatpackers in Chicago boned, quick-froze, and shipped to the Pacific a million pounds of beef, chicken, and pork. This demonstration intensified the military's determination to feed frozen foods to personnel in faraway theaters of combat. The sheer quantity of food required by the military created a greater need to avoid wasting crops or livestock. Moreover, because so much fresh food went to the military, there was higher civilian demand for frozen fruits, vegetables, meats, and fish. A last and unexpected spur to the frozen-foods industry was also due to the war: since many automobile showrooms stood empty because they lacked product, their owners sought other uses for their space, and a significant number of the showrooms became frozen-food distribution centers.

Enlarged military bases in areas that had previously been sparsely populated, such as the deserts of the Southwest and the near-tropical areas of the Southeast, started to attract civilian populations to serve them, increasing the demand for air conditioning. In the dry regions, the technique of "evaporative cooling" became fashionable, partly because it was cheap and easy to operate: a fan drew air through water-dampened pads and blew it through the home; as the dry air absorbed the moisture, the evaporation carried off heat and cooled the rooms. The availability of evaporative cooling made cities such as Phoenix more habitable, and their populations grew rapidly. Humid areas such as Florida required more expensive air-cooling techniques; despite the cost, new migrants to the Southeast gravitated to developments featuring homes with central air conditioning already installed, as life in the South began to seem unimaginable without artificially cooled air.
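
A rough arithmetic sketch, not part of the original account, shows why the trick worked so well in the desert and so poorly in Florida. The heat that cools the air is the latent heat the water absorbs as it evaporates; using round values of about 2,450 kJ/kg for that latent heat and about 1.0 kJ/(kg·K) for the specific heat of air,

\[
\Delta T \;\approx\; \frac{L_v \,\Delta w}{c_p} \;\approx\; \frac{2450 \times 0.004}{1.0} \;\approx\; 10\ \mathrm{K},
\]

so evaporating a mere 4 grams of water into each kilogram of hot, dry air lowers its temperature by roughly ten degrees. In humid air, which can take up almost no additional moisture, the same fan accomplishes very little, which is why the Southeast needed costlier refrigeration-based cooling.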

The explosive growth of America's suburbs in the postwar era brought with it spectacular increases in the use of frozen foods, refrigeration, air conditioning, and liquefied gases. A spacious, air-conditioned supermarket selling a wide variety of frozen foods helped attract people to the suburbs; the annual consumption of frozen foods leapt toward 50 pounds per person, with two hundred new products introduced each year. Although only one out of every eight homes in the United States had air conditioning in the 1960s, four out of ten in the Sunbelt region had it. Nearly every American home featured a refrigerator, and many newer homes, more than one. Greater use of the cold became inextricably associated with America's advancing standard of living—an index of material comfort that expresses the degree to which people control their environment. And as the American standard of living rose, refrigeration, air conditioning, frozen foods, and other products made with cold technology no longer were luxuries but were judged necessities of modern life.

Meanwhile, what had once taken Onnes years, cumbersome equipment, and considerable expense to produce—a few liters of liquefied helium—by the 1950s could be done routinely in almost any laboratory, after the invention by Sam Collins of MIT of a liquefier no bulkier than a home appliance. Shortly, the technique was extended to commercial manufacture of liquid helium. New or improved uses for many cold-liquefied gases emerged: the employment of liquid nitrogen to store blood and semen, the use of liquid hydrogen and liquid oxygen in the rockets of space-exploration programs, liquefied-gas coolants and scrubbers for nuclear reactors, liquid-helium traps for interstellar particles. Artificial insemination of dairy herds became more widespread as the ability to store semen advanced. Miniature Linde-Hampson systems with Joule-Thomson expansion nozzles made possible more portable liquid-cooled infrared sensors; these systems were utilized mostly by the military—for "smart" missiles, projectiles, and night-vision systems—but they were also useful in detecting fires and, in medicine, for detecting diseased tissue. Surgical technique was advanced by the introduction of cryosurgery; a probe carrying liquid nitrogen into the brain or the prostate could do what a metallic knife could not: first tentatively cool a section of tissue, allowing the surgeon the latitude to evaluate the probable results of surgery before deciding whether to permanently destroy the target area.
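
The principle behind those Joule-Thomson nozzles can be stated in a single standard thermodynamic relation, added here only as an illustration. When a gas is forced through a small orifice at constant enthalpy, its temperature changes at a rate given by the Joule-Thomson coefficient,

\[
\mu_{JT} \;=\; \left(\frac{\partial T}{\partial P}\right)_{\!H},
\]

which is positive for nitrogen and most other gases near room temperature, so the pressure drop across the nozzle cools the gas; fed back through a counterflow heat exchanger, as in the Linde-Hampson cycle, that modest cooling compounds until part of the stream liquefies.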

As space shots reached beyond Earth's atmosphere, boosted into orbit by combinations of hydrogen and oxygen fuel stored in ultracold liquid form in the immense rockets, scientists received the first experimental verification that the temperature of interstellar space was within a few degrees Kelvin of absolute zero. Shortly, liquid oxygen and liquid hydrogen provided the fuel to send men to the moon and return them to Earth, and cold-control apparatus permitted them to stay alive on the journey.

In the universe described by Newtonian physics, nothing smaller than a cannonball shot at a solid wall with great force could push through that wall; the force and mass of the cannonball overcame the energy of the atoms within the wall, which in all other cases was great enough to repulse anything attempting to penetrate it. In the universe described by quantum physics, subatomic particles can also sometimes pass such a barrier, by a process known as tunneling, in which the particles do not overcome the energy of the atoms in their way but instead create a path between the atoms of the wall. In the wake of the revelation of the B-C-S theory in 1957, two scientists half a world apart made discoveries about subatomic tunneling related to superconductivity. In Tokyo, a graduate student in physics employed by the Sony Corporation, Leo Esaki, described tunneling effects in semiconductors at low temperatures, and made what are now called Esaki diodes. In Schenectady, New York, the Norwegian-born graduate student Ivar Giaever was working for General Electric and taking a course at Rensselaer Polytechnic Institute when he realized that the tunneling of electrons might be used to measure the energy-band gap long ago identified by Fritz London as existing at the Fermi surface in a superconductor. On April 22, 1960, he made a metal sandwich, a layer of insulation between two thin plates of metal; when the outside layers were in the normal state, electrons in a current could tunnel through the insulation, but when one of the layers was in the superconducting state, the tunneling current at low voltages all but vanished, reappearing only when the voltage exceeded a threshold set by the superconductor's energy gap. The effect was measurable, but it had not yet been explained.
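
A standard textbook estimate, cited here only for illustration, makes the contrast quantitative. In the WKB approximation, a particle of mass m and energy E facing a barrier V(x) higher than E passes through with a probability of roughly

\[
T \;\approx\; \exp\!\left(-\frac{2}{\hbar}\int \sqrt{2m\,[V(x)-E]}\;dx\right),
\]

a number that shrinks exponentially with the particle's mass and the barrier's width. Electrons therefore tunnel readily through a few billionths of a meter of insulation, while a cannonball's chance of slipping through a wall is so fantastically small that it never happens.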

A third graduate student, Brian Josephson, at Cambridge University in 1962, drew on the insights of Esaki and Giaever and on Cooper's work with electron pairs—the linchpin of the B-C-S theory—to predict that Cooper pairs could tunnel through the metal sandwich, even when both outer layers were superconductors. This tunneling current, moreover, proved to be extremely sensitive to external voltages, currents, and magnetic fields: if the current coursing through the layers changed even slightly, or if a magnetic field disturbed the superconductivity, the resulting change in voltage could be measured. Josephson's teacher at Cambridge, Philip Anderson, took this idea back to Bell Labs, where he and associates were able to prove it experimentally. Several companies began production of "Josephson junctions" to record or measure minute changes in electrical voltage and magnetic fields. These junctions became the heart of new gadgets known as SQUIDs (superconducting quantum interference devices). SQUIDs are used in voltmeters for low-temperature experiments, including those on space satellites; in magnetometers sensitive enough to pick up the magnetic field of a passing submarine, a human brain, or a heart, or even of a single neuron; and in making speedy logic elements and memory cells in computers. An experimental Josephson-junction computer, with parts of its works immersed in ultracold liquids, was constructed in the 1980s; the machine ran some 100 times faster than conventional computers. In a measure of the importance that scientists attached to the ideas of Esaki, Giaever, and Josephson, the three were jointly awarded the Nobel Prize for 1973.
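
Josephson's prediction is conventionally summarized by two relations, quoted here as standard results rather than from this account: the supercurrent of Cooper pairs across the junction depends on the difference in quantum phase between the two superconductors, and that phase difference evolves in step with any voltage across the junction,

\[
I \;=\; I_c \sin\varphi, \qquad \frac{d\varphi}{dt} \;=\; \frac{2eV}{\hbar}.
\]

Because the phase, and with it the current, responds to magnetic flux in steps of the minuscule flux quantum h/2e, a superconducting loop containing two such junctions, the essence of a SQUID, can register magnetic fields many orders of magnitude weaker than Earth's.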

Josephson-junction innovations did not garner much public attention, which went instead to the yearly, breathtaking improvements in microchips and semiconductors. Nor did the public realize that as electronic technologies advanced, their need for cooling became greater. Supersonic flight speeds had brought to the fore the problem of cooling the electronic gadgetry necessary to operate such aircraft, both to reduce the size of the devices and to improve their reliability. Phalanxes of engineers began devoting themselves to a new specialty, the technology of removing heat from all sorts of electronic equipment. Back in 1947, when the transistor was invented, a chip of semiconductor carried a single component; by the 1970s, a chip could hold close to 100,000 components, including transistors, diodes, resistors, and capacitors, and there was a need to cool such chips while they were operating and also during their manufacture, often to cryogenic levels. According to a textbook, the "dramatic" increase in miniaturization mandated that "thermal considerations ... must be introduced at the earliest stages of design." The more complex the electronic gadget—the computer, the television set, the mobile telephone—the more likely it was to contain parts manufactured at temperatures hundreds of degrees below freezing. Cooling electronic machinery during manufacture and operation became the fifth large cold-based industry of the modern era.

The need for cooling renewed interest in thermoelectricity, the effect discovered by Peltier in 1834 and more fully explored by Kelvin in 1854, in which an electric current flowing across the junction between two different conductors absorbs heat there, generating cold. Twentieth-century research revealed that the thermoelectric powers of semiconductors are much greater than those of metals and that thermoelectric performance can be enhanced at low temperatures (80 to 160 K) by the application of a magnetic field. New semiconducting materials, made from compounds unknown until the last few decades, are now used to create very small refrigerating devices for computer components and other sensitive electronic circuitry, including miniature lasers.
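
In modern notation, offered here as a standard summary rather than a quotation, the heat a Peltier junction pumps is proportional to the current through it, and a material's usefulness is captured by a dimensionless figure of merit:

\[
\dot{Q} \;=\; \Pi I \;=\; S\,T\,I, \qquad ZT \;=\; \frac{S^{2}\sigma}{\kappa}\,T,
\]

where \Pi is the Peltier coefficient, S the Seebeck coefficient, \sigma the electrical conductivity, and \kappa the thermal conductivity. Semiconductor compounds reach ZT values near 1, far beyond anything a pure metal can manage, which is why they dominate these small solid-state refrigerators.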

The ultimate point in miniaturization and cooling may have been reached by the Baykov Institute of Metallurgy in Moscow and the Odessa Institute of Refrigeration, which in the 1990s created thermoelectric coolers on single crystals made from solid solutions of bismuth and antimony compounds. Single-crystal coolers are being used experimentally for infrared detectors, light-emitting diodes and lasers, and devices for night viewing, astronomical observations, ground- and space-based optical communication, missile guidance, and target illumination.

In January 1962 Lev Landau was involved in a car crash in Russia that broke nearly every bone in his body and put him into a coma for fifty-seven days; perhaps spurred by his brush with death, the Nobel committee awarded him the Nobel Prize in physics in December 1962 for his decades-old work on the theory of condensed matter and superfluidity. Soviet scientists were chagrined that Landau would be honored without the prize being jointly awarded to Pyotr Kapitsa, who had done the basic experimental work on superfluidity, but they hoped Kapitsa's turn would come. It took another sixteen years, until 1978, when Kapitsa shared the Nobel Prize with the two American physicists who had discovered the cosmic microwave background radiation. Landau never regained his health or returned to his laboratory work before his death in 1968. By then, interest in superfluidity had surged again, especially as magnetic cooling enabled investigators to lower temperatures into the "millikelvin" range of a few thousandths of a degree above absolute zero.

In the 1970s, demagnetization was alternated with other cooling techniques, in a process that deliberately mimicked the operation of the ideal engine Carnot had imagined, to bring down temperatures and to produce a continuing series of revelations. Physicists had long assumed that superfluidity might be a widespread phenomenon, not simply a property of one form of liquid helium at low temperatures. These guesses received some verification from the work done in 1971 by three physicists at Cornell University, Robert C. Richardson, David M. Lee, and graduate student Douglas Osheroff. They discovered that at a temperature of two-thousandths of a degree above absolute zero, the rare isotope helium-3, long thought capable of becoming a superfluid but until then elusive, could indeed be made into one. Their results were so unexpected that one prestigious journal initially rejected their article about the discovery. Superfluid helium-3, they found, was "anisotropic"—like a crystal, its properties appeared to take different values depending on which axis was being measured. The new superfluid could act like a quantum microscope, permitting the direct observation of the effects caused by interactions among atoms. Richardson, Lee, and Osheroff would win the 1996 Nobel Prize for their work on creating a superfluid from helium-3.

Part of the reason they won the prize was that shortly after their discovery, using the Cornell trio's data, astronomers came to believe that the transition of helium-3 into a superfluid was analogous to the formation of the vast structures in space called cosmic strings, in the microseconds after the "big bang," and that superfluidity as a state or attribute of matter might be present in rotating neutron stars, thousands of light-years distant from Earth. This realization was one instance of a grand coming-together of branches of science that had once seemed separate and unrelated. Connections were found that bolstered the links between the study of the behavior of matter at ultra-low temperatures, the study of subatomic particles, and the study of the origins of the universe. The ability to control the ultracold was the key to all three. Supersensitive devices based on cold technologies had become capable of measuring the entire electromagnetic spectrum, of registering images of the radiant heat of celestial objects in the infrared, millimeter-wave, and microwave range, as well as images of gravitational and magnetic emanations. These could be used to identify various relics from the early days of the universe, leftovers from the era of the big bang. Among the dozen types of emissions scientists tried to find in the sky were fractional electric charges such as quarks; one theory holds that the universe began as a sea of quarks, some of which could have survived the big bang. Other emissions include WIMPs (weakly interacting massive particles) and "stochastic," or random, gravitational radiation, which might give clues to what happened during an early "phase transition" of the universe. Helium-3 was also found in the residues from volcanic eruptions, sometimes encapsulated or trapped between carbon atoms, and identified as a remnant from the time of the formation of the earth.
