Authors: Natalie Angier
Egos and academic mastodons notwithstanding, scientists are deeply skeptical when they hear amazing new results, and with good reason: many of these results are bad, are more awful than offal, a product that at least has a shot at fertilizing something better down the line. "Most of the time, when you get an amazing, counterintuitive result," said Michael Wigler of Cold Spring Harbor Lab, "it means you screwed up the experiment."
People have the mistaken impression that the great revolutions in the history of science overturned prevailing wisdom. In fact, most of the great ideas subsumed their predecessors, gulped them whole and got bigger in the act. Albert Einstein did not prove that Isaac Newton was wrong. Instead, he showed that Newton's theories of motion and gravity were incomplete, and that new equations were needed to explain the behavior of objects under extreme circumstances, such as when tiny particles travel at or near the speed of light. Einstein made the pie wider and lighter and more exotically scalloped in space and time. But for the workaday trajectories of Earth spinning around the sun, or a baseball barreling toward a bat, or a brand-new earring sliding down a drain, Newton's laws of motion still apply.
"The rules of science are quite strict," said the Berkeley astronomer Alex Filippenko. "I get messages every day from people who have ideas that sound interesting but that are terribly incomplete. I tell them, Look, you have to formulate your proposal much more coherently, in a way that explains not only the one new thing you're concerned with, but that is consistent with everything else we know, too. Any new, revolutionary idea has to explain the existing body of knowledge at least as well as the ideas we already accept."
On very rare occasions, scientists present a revolutionary idea in such a compelling, comprehensive, and vine-ripened form that even the skeptics are sold. One example is the famously brief paper in the April 1953 issue of the journal Nature by James Watson and Francis Crick, describing the incomparably uncluttered structure of deoxyribonucleic acid, or DNA. For years, many of the world's great geneticists were convinced that proteins, rather than nucleic acids, carried genetic information in the cell. Their reasoning was simple. Proteins are complex. They are the most complex molecules known in the cell. Genetic information seems pretty complex. Who better to bear the burden of complexity than the complex? On beholding the elegance of the double helix, however, and the smartness with which the four subunits of the twisting ladder paired with one another, and the ease with which one strand of the molecule might serve as a template for creating an entirely new copy of DNA to bequeath to a daughter cell, geneticists realized how the entire story of life could be told in its taciturn code.
Another legendary wowzer occurred at a geoscience meeting in the 1960s, when researchers offered evidence for plate tectonics, the theory that explains the origins of the ragged peaks and plunging canyons, the sputtering fumaroles and shimmering lava flows, and all the other Ansel Adams centerfolds that surround us. Lucy Jones's thesis adviser was at the meeting and told her how extraordinary the presentation was. "The evidence was so overwhelming, so compelling," she said, "that nobody could argue with it." Even more surprising, she added, "nobody wanted to."
Such Rocky triumphs, though, are extremely atypical. More often, scientists carp and cavil, demand better controls, offer a contrarian interpretation of the results, or write snide comments in the margins of a peer's manuscript. More often, science progresses fitfully, and individual experimental results are as modest as a bee's cerebrum. This is not an indictment against science. The power of science lies precisely in its willingness to attack a big problem by dividing it into many small pieces, its embrace of the unfairly maligned practice known as reductionism. At the same time, the piecemeal approach demands that scientists be circumspect to an often tedious degree and that they resist, no matter how much they are pushed by their university's public relations department or by desperate journalists, making more of the data than the data make of themselves. It would be cheating to do otherwise. It would be cheating to declare that science works by isolating variables, one colored peg at a time; and then to decide, when you've got a handsome little result, that, whaddya know, you're a holist at heart, and that Whitman had a point about the universe being in every blade of grass. The best scientists don't overreach or grandstand, at least not until they've retired into the armchair comforts of emeritus professorship, a time of life sometimes referred to as philosopause.
For working scientists, by contrast, all chairs are folding chairs: here today, tossed in the closet tomorrow. Scientists are accustomed to uncertainty, and to admitting how little they know. In fact, not only are they accustomed to uncertainty; they thrive on it. This is another of the core messages they'd like people to absorb, right down to their stem cells if possible: that science is an inherently uncertain enterprise, and that the uncertainty is, paradoxically, another source of its power. "We're out there looking for new patterns, new laws, new fundamentals, new uncertainties," said Andy Ingersoll, an astronomer at Caltech. "And as we're looking, and discovering new things, we're debating about
what we see. We express our differences of opinion, sometimes strongly, until the public gets confused. Doesn't science know the answer to anything? Well, yes, eventually a consensus may be reached about a particular problem. But by then, we've already moved on to the next uncertainty, the next unknown. You don't linger." Ignorance is bliss, and always an excuse. "What motivates scientists is a lack of information rather than the presence of information," said Scott Strobel. Sometimes a consensus really is consensual, as it overwhelmingly is with Darwin's theory of evolution by natural selection (and more on this profoundly important organizing principle of biology, and the circus of manufactured tsuris that surrounds it, later), and as it firmly is in the case of global warming. For all the talk of "controversy," the great majority of climate scientists concur that average temperatures on Earth are climbing, and that some, if not all, of the rise is the result of human activity, notably the compulsive burning of combustible materials to power every aspect of contemporary life, including the need for more air-conditioning.
At other times, a scientific consensus amounts to little more than mass agnosticism. Take the question of whether chemical pollutants contribute to breast cancer. On the one hand, many industrial chemicals have been shown to cause breast tumors in lab animals; inherited factors fall short of explaining most human cases of the disease; and breast cancer rates vary significantly from nation to nation, all suggesting that environmental carcinogens somehow contribute to the malignancy. On the other hand, study after study seeking to link pesticides, power plants, or other specific environmental insults to human cancer has failed to reveal any convincing connection, leaving most scientists either skeptical or resolutely noncommittal about the contribution of chemical pollutants to breast cancer, much to activists' dismay.
"You don't want people to think that science is a joke, and that we don't know anything," said the Caltech astronomer Chuck Steidel, "but the truth is that the process of reaching a consensus is extremely messy and requires that a huge number of hurdles be overcome. Often, when results are presented to the general public, they're made out to be much more rock-solid than they are."
Science is uncertain because scientists really can't prove anything, irrefutably and beyond a neutrino of a doubt, and they don't even try. Instead, they try to rule out competing hypotheses, until the hypothesis they're entertaining is the likeliest explanation, within a very, very small margin of error: the tinier, the better. "Working scientists don't think of science as 'the truth,'" said Darcy Kelley. "They think of it as a way of approximating the truth." By accepting the proximate and provisional nature of what they're working on, scientists leave room for regular upgrades, which, unlike many upgrades to one's computer operating system, are nearly always an improvement on the previous model. For example, after scientists determined that DNA, rather than proteins, served as nature's preeminent guardian of genetic information, they began to see that DNA was not the sole guardian of the code of life, and almost certainly wasn't the original one. They gradually gained respect for RNA, the molecule they once dismissed as a mere bureaucrat paper-clipped between the imperial DNA that issues commands in the cell and the industrious proteins that do the cell's work without surcease. Scientists spied in RNA many talents that made it a likely ancestor of DNA, the primordial vessel of heredity and continuity back when life was new; only later did RNA cede its replicative and procreative role to the sturdier strands of DNA.
More recently, scientists have amassed evidence that some proteins, called prions, can act like DNA after all, replicating in the brains of mad cows and their unlucky human consumers. The discovery of prions and their infectious, photocopying potential earned a Nobel Prize for Stanley Prusiner in 1997.
None of these findings undermine the strength of the original Watson-Crick discovery. "Just because RNA and proteins can carry information in some circumstances doesn't detract from the centrality of DNA as the primary bearer of hereditary information," said David Baltimore. "As our concepts become more precise, more sophisticated, the absolutes become less absolute." In other words, by accepting that they can never know the truth but can only approximate it, scientists end up edging ever closer to the truth. The tonic surgery of chronic uncertainty.
For those outside the operating theater, however, all the quarreling, the hesitation, the emendations and annotations, can make science sound like a pair of summer sandals. Flip-flop, flip-flop! One minute they tell us to cut the fat, the next minute they're against the grains. Once they told us that the best thing to put on a burn was butter. Then they realized that in fact butter makes a burn spread; better use some ice instead. All women should take hormone replacement therapy from age fifty onward. All women should stop taking hormone therapy right now and never mention the subject again. Didn't scientists predict in the 1960s that a population bomb was about to explode, and that we'd all die of starvation or crowd rage? Now demographers in developed countries fret that women aren't breeding fast enough to restock the tax base and that nobody will be around to pay tomorrow's nursing home bills. Why should we believe anything scientists say? For that matter, why should we do anything that scientists suggest, like thinking about global climate change and the inevitable depletion of Earth's fossil fuels and adjusting our energy policies accordingly? That's what scientists say today. But if I hang on to my Hummer long enough, hey, maybe they'll decide that extravagant plumes of exhaust fumes are good for the environment after all!
This is one of science's bigger public relations problems. How do you convey the need for uncertainty in science, the crucial role it plays in nudging research forward and keeping standards high, without undermining its credibility? How can you avoid the temptations of dogmatism and certitude without risking irrelevance? "People need to understand that science is dynamic and that we do change our minds," said Dave Stevenson. "We have to. That's how science functions.
"Part of critical thinking," he added, "includes the understanding that science doesn't deal with absolutes. Nonetheless, we can make statements that are quite powerful and that have a high probability of being correct."
One trick to critical thinking is to contrast it with cynicism, which happens to be one of my most comfortable and least welcome mental states. Cynics dismiss all offerings, sight unseen, data unmulled. Another drug that cures breast tumors in mice? Go tell it to Minnie. The fossil of a new dinosaur species disinterred? I can hear Stephen Jay Gould grumbling from the great beyond: Dinosaurs are a cliché. Preemptive cynicism may be rooted in insecurity, defensiveness, a gloomy disposition, or simple laziness; whatever its cause, it is useless.
Deborah Nolan of the University of California, Berkeley, encounters it constantly in her introductory statistics course: the slapdash bashing, the no-it-all choir. She confronts cynicism calmly and strives to replace it with hard-nosed thought. Each semester she'll present her students with newspaper stories that describe an array of medical, scientific, or sociological studies: Should victims of gunshot wounds be resuscitated by the paramedics in the ambulance, through drugs delivered intravenously, or is it better to wait until they get to the hospital? Does a surgeon perform better while listening to music in the operating room, or not? Does the mental well-being of a mother have a greater impact on her interaction with an infant, or with a toddler? Nolan will ask the students for their impressions of the articles. Regardless of the subject matter, or whether the students are majoring in science, the liberal arts, or hotel management, their initial response is the same: a synchronized sneer. You can't believe what you read in the newspapers, they'll insist. Nolan asks them what, precisely, they don't believe about the stories. They examine the articles again, this time with more care. Well, it's just ... why should I believe it?
Nolan then shows them the original journal studies on which the newspaper stories were based, and she and the students begin, methodically, to pick the studies apart. They consider who the research subjects were, whether the participants were divided into two or multiple groups, the basis on which they were assigned to one group or another, and how the groups were compared. They discuss the strengths and limitations of the study, and why they think the researchers designed it as they did, and what the students might have done differently if they were running the study themselves. Enlightened now with this insider's intelligence, the students then reread the newspaper stories, to see if the reporters accurately conveyed the essence of the studies.