Galton saw this as natural selection with a twist, and felt it would provide “the more suitable races or strains of blood a better chance of prevailing speedily over the less suitable.” Galton posed the fundamental question in his 1869 book,
Hereditary Genius
: would it not be “quite practicable to produce a highly gifted race of men by judicious marriages during consecutive generations?” As the Yale historian Daniel J. Kevles points out in his definitive 1985 study
In the Name of Eugenics
, geneticists loved the idea and eagerly attempted to put it into action. Eugenics found particular favor in the United States.
In 1936, the American Neurological Association published a thick volume titled
Eugenical Sterilization
, a report issued by many of the leading doctors in the United States and funded by the Carnegie Foundation. There were chapters on who should be sterilized and who shouldn’t, and it was chock full of charts and scientific data—about who entered the New York City hospital system, for example, at what age and for what purpose. The board of the ANA noted in a preface that “the report was presented to and approved by the American Neurological Association at its annual meeting in 1935. It had such an enthusiastic reception that it was felt advisable to publish it in a more permanent form and make it available to the general public.”
Had their first recommendation appeared in a novel, no reader would have taken it seriously: “Our knowledge of human genetics has not the precision nor amplitude which would warrant the sterilization of people
who themselves are normal
[italics in the original] in order to prevent the appearance, in their descendants, of manic-depressive psychosis, dementia praecox, feeblemindedness, epilepsy, criminal conduct or any of the conditions which we have had under consideration. An exception may exist in the case of normal parents of one or more children suffering from certain familial diseases, such as Tay-Sachs’ amaurotic idiocy.” Of course, for people who were not considered normal, eugenics had already arrived. Between 1907 and 1928, nearly ten thousand Americans were sterilized on the general grounds that they were feebleminded. Some lawmakers even tried to make welfare and unemployment relief contingent upon sterilization.
Today, our knowledge of genetics has both the precision and the amplitude it lacked seventy years ago. The Nazis helped us bury thoughts of eugenics, at least for a while. The subject remains hard to contemplate—but eventually, in the world of genomics, impossible to ignore. Nobody likes to dwell on evil. Yet there has never been a worse time for myopia or forgetfulness. By forgetting the Vioxxes, Vytorins, the nuclear accidents, and constant flirtation with eugenics, and instead speaking only of science as a vehicle for miracles, we dismiss an important aspect of who we are. We need to remember both sides of any equation or we risk acting as if no mistakes are possible, no grievances just. This is an aspect of denialism shared broadly throughout society; we tend to consider only what matters to us now, and we create expectations for all kinds of technology that are simply impossible to meet. That always makes it easier for people, already skittish about their place in a complex world, to question whether vaccines work, or AIDS is caused by HIV, or why they ought to take prescribed pain medication instead of chondroitin or some other useless remedy recommended wholeheartedly by alternative healers throughout the nation.
IF YOU LIVED with intractable pain, would you risk a heart attack to stop it? What chances would be acceptable? One in ten? One in ten thousand? “These questions are impossible to answer completely,” Eric Topol told me when I asked him about it one day as we walked along the beach in California. “Merck sold Vioxx in an unacceptable and unethical way. But I would be perfectly happy if it was back on the market.”
Huh? Eric Topol endorsing Vioxx seemed to make as much sense as Alice Waters campaigning for Monsanto and genetically modified food. “I can’t stress strongly enough how deplorable this catastrophe has been,” he said. “But you have to judge risk properly and almost nobody does. For one thing, you rarely see a discussion of the effect of
not
having drugs available.” Risk always has a numerator and a denominator. People tend to look at only one of those numbers, though, and they are far more likely to remember the bad than the good. That’s why we can fear flying although it is hundreds of times safer than almost any other form of transportation. When a plane crashes we
see
it. Nobody comes on television to announce the tens of thousands of safe landings that occur throughout the world each day.
We make similar mistakes when judging our risks of illness. Disease risks are almost invariably presented as statistics, and what does it mean to have a lifetime heart attack risk 1.75 times greater than average? Or four times the risk of developing a certain kind of cancer? That depends: four times the risk of developing a cancer that affects 1 percent of the population isn’t terrible news. On the other hand, a heart attack risk 75 percent greater than average, in a nation where heart attacks are epidemic, presents a real problem. Few people, however, see graphic reality in numbers. We are simply not good at processing probabilistic information. Even something as straightforward as the relationship between cigarette smoking and cancer isn’t all that straightforward. When you tell a smoker he has a 25 percent chance of dying from cancer, the natural response is to wonder, “From
this
cigarette? And how likely is that really?” It is genuinely hard to know, so all too often we let emotion take over, both as individuals and as a culture.
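The arithmetic behind that confusion is simple enough to spell out. A minimal sketch, in Python, of the distinction the paragraph draws between relative and absolute risk (the 20 percent baseline for heart disease is an assumed figure for illustration, not from the text):

```python
def absolute_risk(baseline, relative_risk):
    """Absolute lifetime risk, given a baseline rate and a relative-risk multiplier."""
    return baseline * relative_risk

# "Four times the risk" of a cancer that affects 1 percent of the population:
rare = absolute_risk(0.01, 4)       # still only a 4% chance

# A heart attack risk "75 percent greater than average," where the baseline
# lifetime risk is already high -- say 20% (an assumed figure):
common = absolute_risk(0.20, 1.75)  # a 35% chance

print(f"rare cancer: {rare:.0%}, common heart disease: {common:.0%}")
```

The same multiplier means very different things depending on the denominator it is applied to, which is exactly why the headline number alone misleads.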
The week in 2003 that SARS swept through Hong Kong, the territory’s vast new airport was deserted, and so were the city’s usually impassable streets. Terrified merchants sold face masks and hand sanitizer to anyone foolish enough to go out in public. SARS was a serious disease, the first easily transmitted virus to emerge in the new millennium. Still, it killed fewer than a thousand people, according to World Health Organization statistics. Nevertheless, “it has been calculated that the SARS panic cost more than $37 billion globally,” Lars Svendsen wrote in
A Philosophy of Fear
. “For such a sum one probably could have eradicated tuberculosis, which costs several million people’s lives every year.”
Harm isn’t simply a philosophical concept; it can be quantified. When Merck, or any other company, withholds information that would have explained why a drug might “fail,” people have a right to their anger. Nonetheless, the bigger problem has little to do with any particular product or industry, but with the way we look at risk. America takes the Hollywood approach, going to extremes to avoid the rare but dramatic risk—the chance that minute residues of pesticide applied to our food will kill us, or that we will die in a plane crash. (There is no bigger scam than those insurance machines near airport gates, which urge passengers to buy a policy just in case the worst happens. A traveler is more likely to win the lottery than die in an airplane. According to Federal Aviation Administration statistics, scheduled American flights spent nearly nineteen million hours in the air in 2008. There wasn’t one fatality.)
On the other hand, we constantly expose ourselves to the likely risks of daily life, riding bicycles (and even motorcycles) without helmets, for example. We think nothing of exceeding the speed limit, and rarely worry about the safety features of the cars we drive. The dramatic rarities, like plane crashes, don’t kill us. The banalities of everyday life do.
We certainly know how to count the number of people who died while taking a particular medication, but we also ought to measure the deaths and injuries caused when certain drugs are
not
brought to market; that figure would almost always dwarf the harm caused by the drugs we actually use. That’s even true with Vioxx. Aspirin, ibuprofen, and similar medications, when used regularly for chronic pain, cause gastrointestinal bleeding that contributes to the death of more than fifteen thousand people in the United States each year. Another hundred thousand are hospitalized. The injuries—including heart attacks and strokes—caused by Vioxx do not compare in volume. In one study of twenty-six hundred patients, Vioxx, when taken regularly for longer than eighteen months, caused fifteen heart attacks or strokes per every one thousand patients. The comparable figure for those who received a placebo was seven and a half per thousand. There was no increased cardiovascular risk reported for people who took Vioxx for less than eighteen months. In other words, Vioxx increased the risk of having a stroke or heart attack by less than 1 percent. Those are odds that many people might well have been happy to take.
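The study figures just cited make the point concretely. A short sketch, using only the numbers quoted above (fifteen events per thousand on Vioxx versus seven and a half per thousand on placebo):

```python
# Event rates from the study quoted in the text, for regular use
# beyond eighteen months:
vioxx_rate = 15 / 1000     # heart attacks or strokes per patient
placebo_rate = 7.5 / 1000

absolute_increase = vioxx_rate - placebo_rate  # the extra risk a patient takes on
relative_increase = vioxx_rate / placebo_rate  # the "doubled risk" a headline reports

print(f"absolute increase: {absolute_increase:.2%}")   # 0.75%
print(f"relative increase: {relative_increase:.1f}x")  # 2.0x
```

The drug doubled the relative risk, yet the absolute increase was under one percent: both statements are true, and which one a patient hears largely determines how frightening the drug sounds.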
“All Merck had to do was acknowledge the risk, and they fought that to the end,” Topol said. “After fifteen months of haggling with the FDA they put a tiny label on the package that you would need a microscope to find. If they had done it properly and prominently, Vioxx would still be on the market. But doctors and patients would know that if they had heart issues they shouldn’t take it.”
Most human beings don’t walk out the door trying to hurt other people. So if you are not deliberately trying to do harm, what are the rules for using medicine supposed to be? What level of risk would be unacceptable? A better question might be, Is any risk acceptable? Unfortunately, we have permitted the development of unrealistic standards that are almost impossible to attain. The pharmaceutical industry, in part through its own greed (but only in part), has placed itself in a position where the public expects it never to cause harm. Yet, drugs are chemicals we put into our body, and that entails risks. No matter how well they work, however, if one person in five thousand is injured, he could sue and have no trouble finding dozens of lawyers eager to represent him. People never measure the risk of keeping the drug off the market, though, and that is the problem. If you applied FDA phase I or II or III criteria—all required for drug approval—to driving an automobile in nearly any American city, nobody would be allowed to enter one. When we compare the risk of taking Vioxx to the risk of getting behind the wheel of a car, it’s not at all clear which is more dangerous.
2
Vaccines and the Great Denial
Marie McCormick is a studious and reserved woman with the type of entirely unthreatening demeanor that comes in handy in her job as a professor of pediatrics at the Harvard School of Public Health. She has devoted most of the past four decades to preparing physicians to nurture mothers and their children, and, since her days as a student at Johns Hopkins, has focused much of her research on high-risk newborns and infant mortality. Like many prominent academic physicians, her renown had largely been restricted to her field. Until 2001.
That was the year she was asked to lead a National Academy of Sciences commission on vaccine safety. The Immunization Safety Review Committee was established by the Institute of Medicine to issue impartial, authoritative, and scientifically rigorous reports on the safety of vaccinations. Its goal, while vital, seemed simple enough: bring clarity to an issue where too often confusion reigned. McCormick took on the assignment readily, although she was surprised at having been selected. It was not as if she considered vaccine safety unimportant—the issue had preoccupied her for decades. Nonetheless, vaccines were not McCormick’s area of expertise and she couldn’t help thinking that there must be someone better suited to the job. “My research has always been on the very premature,” she explained. “So I was a bit naive about why they might want me to run that committee.” She soon made a discovery that surprised her: “I realized that all of us on the committee were selected
because
we had no prior contact with vaccines, vaccine research, or vaccine policy. We all had very strong public health backgrounds, but we were just not clear about the nature or intensity of the controversy.”
The controversy that the panel set out to address was whether the benefits of receiving childhood vaccines outweighed the risks. In particular, the committee was asked to investigate the suggested link between the measles, mumps, and rubella inoculation routinely administered between the ages of one and two and the development of autism, which often becomes apparent at about the same time. The incidence of autism has risen dramatically during the past three decades, from less than one child in twenty-five hundred in 1970 to nearly one in every 150 today. That amounts to fifty new diagnoses of autism or a related disorder every day—almost always in children who seem to be developing normally, until suddenly their fundamental cognitive and communication skills begin to slip away.
Parents, understandably desperate to find a cause and often wholly unfamiliar with many diseases that vaccines prevent, began to wonder—publicly and vocally—why their children even needed them. There could be no better proof of just how effective those vaccines have been. With the sole exceptions of improved sanitation and clean drinking water, no public health measure has enhanced the lives of a greater number of people than the widespread adoption of vaccinations, not even the use of antibiotics. Cholera and yellow fever, both ruthless killers, are hardly known now in the developed world. Until vaccines were discovered to stop them, diphtheria and polio rolled viciously through America every year, killing thousands of children, paralyzing many more, and leaving behind ruined families and a legacy of terror. Both are gone. So is mumps, which in the 1960s infected a million children every year (typically causing them to look briefly like chipmunks, but occasionally infiltrating the linings of the brain and spinal cord, causing seizures, meningitis, and death).
Even measles, an illness that most young parents have never encountered, infected nearly four million Americans annually until 1963, when a vaccine was introduced. Typically, hundreds died, and thousands would become disabled for life by a condition called measles encephalitis. (In parts of the developing world, where vaccines are often unavailable, measles remains an unbridled killer: in 2007, about two hundred thousand children died from the disease—more than twenty every hour.) In the United States, fifty-two million measles infections were prevented in the two decades after the vaccine was released. Without the vaccine, seventeen thousand people would have been left mentally retarded, and five thousand would have died. The economic impact has also been dramatic: each dollar spent on the MMR vaccine saves nearly twenty in direct medical costs. That’s just money; in human terms, the value of avoiding the disease altogether cannot be calculated. By 1979, vaccination had even banished smallpox, the world’s most lethal virus, which over centuries had wiped out billions of people, reshaping the demographic composition of the globe more profoundly than any war or revolution.
Those vaccines, and others, have prevented unimaginable misery. But the misery is only unimaginable to Americans today because they no longer need to know such diseases exist. That permits people to focus on risks they do confront, like those associated with vaccination itself. Those risks are minute, and side effects are almost always minor—swelling, for instance; a fever or rash. Still, no medical treatment is certain to work every time. And serious adverse reactions do occur. If you hunt around the Internet for an hour (or ten) you might think that nobody pays attention to vaccine safety in America today. The Public Health Service has actually never been more vigilant. For example, in 1999 the Centers for Disease Control called for an end to the use of the oral polio vaccine, developed by Albert Sabin, which, because it contained weakened but live virus, triggered the disease in about ten people out of the millions who took it each year. (A newer injectable and inactivated version eliminates even this tiny threat.) Despite legitimate concerns about safety, every vaccine sold in the United States is scrutinized by at least one panel of outside advisers to the Food and Drug Administration before it can be licensed; many don’t even make it that far. As a result, vaccination for virtually every highly contagious disease is never as dangerous as contracting the infections those vaccines prevent.
Prevention is invisible, though, and people fear what they cannot see. Nobody celebrates when they avoid an illness they never expected to get. Humans don’t think that way. Choosing to vaccinate an infant requires faith—in pharmaceutical companies, in public health officials, in doctors, and, above all, in science. These days, that kind of faith is hard to come by. So despite their success, there has been no more volatile subject in American medicine for the past decade than the safety of vaccines. There is a phrase used commonly in medicine: “true, true, and unrelated.” It is meant to remind physicians not to confuse coincidence with cause. That kind of skepticism, while a fundamental tenet of scientific research, is less easily understood by laymen.
For most people, an anecdote drawn from their own lives will always carry more meaning than any statistic they might find buried in a government report. “Neither my husband nor anyone in his family has ever been vaccinated . . . and there isn’t a single person in his family who has ever had anything worse than a cold,” one woman wrote on the heavily read blog Mom Logic. “Myself and my family, on the other hand, were all vaccinated against every possible thing you could imagine. . . . Somehow we all got the flu every single year. Somehow everyone in my family is chronically ill. And amazingly, when the people in my family reach 50 they are all old and deteriorated. In my husband’s family they are all vibrant into their late 90’s. My children will not be vaccinated.”
This particular epidemic of doubt began in Britain, when the
Lancet
published a 1998 study led by Dr. Andrew Wakefield in which he connected the symptoms of autism directly to the MMR vaccine. The study was severely flawed, has been thoroughly discredited, and eventually ten of its thirteen authors retracted their contributions. Yet the panic that swept through Britain was breathtaking: vaccination rates fell from 92 percent to 73 percent and in parts of London to nearly 50 percent. Prime Minister Tony Blair refused repeatedly to respond to questions about whether his youngest child, Leo, born the year after Wakefield’s study, received the standard MMR vaccination. Blair said at the time that medical treatment was a personal matter and that inquiries about his children were unfair and intrusive. No virus respects privacy, however, so public health is
never
solely personal, as the impact on Britain has shown. England and Wales had more cases of measles in 2006 and 2007 than in the previous ten years combined. In 2008, the caseload grew again—this time by nearly 50 percent. The numbers in the United States have risen steadily as well, and the World Health Organization has concluded that Europe, which had been on track to eliminate measles by 2010, is no longer likely to succeed. Vaccination rates just aren’t high enough.
Fear is more infectious than any virus, and it has permitted politics, not science, to turn one of the signature achievements of modern medicine into fodder for talk show debates and marches on Washington. Celebrities like Jenny McCarthy, who oppose the need for a standard vaccination schedule, denounce celebrities like Amanda Peet who are willing to say publicly that the benefits of vaccines greatly outweigh the risks. Peet represents Every Child by Two, a nonprofit organization that supports universal vaccination. Not long after she began speaking for the group, Peet and McCarthy began to clash. At one point, McCarthy reminded Peet that she was right because “there is an angry mob on my side.” When three physicians, appearing on
Larry King Live
, disagreed with McCarthy, she simply shouted “Bullshit!” in response. When that didn’t shut them up, she shouted louder. Data, no matter how solid or frequently replicated, seems beside the point.
What does it say about the relative roles that denialism and reason play in a society when a man like Blair, one of the democratic world’s best-known and most enlightened leaders, refused at first to speak in favor of the MMR vaccine, or when a complete lack of expertise can be considered a
requirement
for participation in America’s most prominent vaccine advisory commission? “Politically, there is simply no other way to do it,” Anthony S. Fauci explained. “Experts are often considered tainted. It is an extremely frustrating fact of modern scientific life.” Fauci has for many years run the National Institute of Allergy and Infectious Diseases, where at the beginning of the AIDS epidemic he emerged as one of the public health establishment’s most eloquent and reliably honest voices. He shook his head in resignation when asked about the need for such a qualification, but noted that it has become difficult to place specialists on committees where politics and science might clash. “You bring people with histories to the table and they are going to get pummeled,” he said. “It would simply be war.”
War is exactly what the vaccine commission got. During McCormick’s tenure, the National Academy of Sciences published several reports of its findings. In a 2001 study,
Measles-Mumps-Rubella Vaccine and Autism
, the committee concluded that there was no known data connecting MMR immunizations with the spectrum of conditions that are usually defined as autism. The wording left room for doubt, however, and the report resolved nothing. Three years later, with vaccination rates falling in the United States and anxiety among parents increasing rapidly, and after many calls from physicians for clearer and more compelling guidance, the committee revisited the issue more directly.
Even at the height of the age of AIDS, when members of the activist group ACT UP stormed St. Patrick’s Cathedral, surrounded the White House, shut down the New York Stock Exchange, and handcuffed themselves to the Golden Gate Bridge, all to protest the prohibitive cost of drug treatments and the seemingly endless time it took to test them, rancor between researchers and the advocacy community was rare. The contempt AIDS activists felt for federal officials—particularly for the Food and Drug Administration and its cumbersome regulations—was palpable. Even the most strident among them, however, seemed to regard physicians as allies, not enemies.
Those days have ended, as the Institute of Medicine vaccine committee came to learn. For years, the culprits most frequently cited as the cause of autism had been the measles, mumps, and rubella vaccine, as well as those that contained the preservative thimerosal. Thimerosal was first added to vaccines in the 1930s in order to make them safer. (Before that, vaccines were far more likely to cause bacterial infections.) While descriptions of autistic behavior have existed for centuries, the disease was only named in 1943—and its definition continues to evolve. Neurodevelopmental illnesses like autism have symptoms similar to those of mercury poisoning, and there is mercury in thimerosal. What’s more, American children often receive a series of vaccinations when they are about eighteen months old. That is a critical threshold in human development, when a child often begins to form simple sentences and graduates from chewing or pawing toys to more engaging and interactive forms of play. Some children don’t make that transition—and because they receive so many shots at the same time, many parents feared, naturally enough, that the inoculations must have been the cause.
Anguished parents, who had watched helplessly and in horror as their children descended into the disease’s unending darkness, could hardly be faulted for making that connection and demanding an accounting. The Immunization Safety Review Committee was supposed to provide it, although its members represented an establishment trusted by few of those who cared most passionately about the issue. AIDS activism had its impact here too, because it changed American medicine for good: twenty-first-century patients no longer act as if their doctors are deities. People demand to know about the treatments they will receive, and patient groups often possess more knowledge than the government officials entrusted to make decisions about their lives. They have every right to insist on a role in treating the diseases that affect them.
The rise of such skepticism toward the scientific establishment (as well as the growing sense of anxiety about environmental threats to our physical health) has led millions to question the authority they once granted, by default, not only to their doctors, but also to organizations like the National Academy of Sciences. Faced with the medical world that introduced, approved, and relentlessly promoted Vioxx, a patient can hardly be blamed for wondering, “What do these people know that they are not telling me?” Uncertainty has always been a basic ingredient of scientific progress—at least until reason is eclipsed by fear. Unlike other commodities, the more accessible knowledge becomes, the more it increases in value. Many autism activists, however, sensed that federal health officials and researchers who work with them were guilty of avarice and conspiracy, or at least of laziness—guilty until proven innocent (and innocence is hard to prove). To use Fauci’s formulation, when experts are tainted, where can you place your trust?