The New York Times, a stalwart supporter of American expansion abroad and compulsory vaccination of the urban masses at home, applauded Wheaton's speech. “The anti-imperialist, with his tender regard for the inclinations and preferences of all races except his own, will doubtless object that it is no favor to save the lives of people by forcing them to follow customs and endure Governments distasteful to them,” the Times noted. “[B]ut with the world as small as it is nowadays, this argument is decidedly weak. . . . The unsanitary have become public enemies, and modern war, with its enormous evils, does spread habits of clean living among ‘natives’ and the ‘unprogressives’ whom it leaves alive.” As American officials, commentators, and scholars praised the new levels of sanitation, hygiene, and health that the American efforts had brought to the peoples of Cuba, Puerto Rico, and the Philippines, from the old Spanish ports to the rural interiors, a new rhetoric of justification for military action crystallized. U.S. military medicine had preserved the health of the soldiers, protected American commercial interests, and saved the lives of countless natives.
107
Health administration would remain an integral part of U.S. colonial rule in the Philippine archipelago, and also a principal means of justifying that rule. Americanized Manila stood as a model of the healthful city. In the 1904 fiscal year, the board of health had vaccinated 213,000 people in Manila and an additional 1,007,204 people in the provinces, well over one eighth of the entire population of the archipelago. American-made vaccine, packed for shipment in special boxes of ice, was reaching the people of the interior on horse-drawn carromatas, in water-borne bancas, and on the backs of Igorot runners. Local officials placed orders for vaccine over the telegraph wires the Americans had installed. Marine-Hospital Service surgeons vaccinated thousands of sailors each year in the harbors and pressed shipping firms to employ only persons holding the Service's blue vaccination cards.
108
By 1906, the Philippine Commission was boasting of the real possibility of eradication: “The day should not be far distant when smallpox will disappear from the Philippines.” The following year, Dr. Victor Heiser, the U.S. director of health, stated the argument in its baldest form. “During the year there has been unquestionably less smallpox in the Philippines than has been the case for a great many years previous. . . . In fact, if any justification were needed for American occupation of these islands, these figures alone would be sufficient, if nothing further had been accomplished for the benefit of the Filipinos.” Between the arrival of the U.S. troops in the summer of 1898 and 1915, some 18 million vaccinations were performed in the Philippines under American rule. The Filipinos, according to U.S. officials, had come to accept vaccination as an effective and necessary measure, suggesting, if true, a dramatic transformation of medical beliefs in a very short time.
109
With the end of the war, the question of force became the greatest political liability of U.S. colonial health policy. Significantly, in 1904 the Philippine Commission ordered that public vaccinators would henceforth be “prohibited from using force in accomplishing vaccinations.” Individuals who refused to submit to vaccination would instead be tried in the courts. These ongoing efforts did not succeed in completely wiping out smallpox on the islands. The tropical climate continued to render much of the American-produced vaccine useless. But the efforts did dramatically reduce the incidence of smallpox there and laid the groundwork for the Philippines to become, in 1931, the first Asian country in which the disease was eradicated.
110
At a time of pervasive opposition to compulsory vaccination at home and abroad, U.S. health officials presented the vaccination campaigns in Puerto Rico and the Philippines as evidence of the efficiency of compulsion. Azel Ames touted the Puerto Rico campaign as “A Lesson for the World.” Surgeon General Walter Wyman of the U.S. Public Health and Marine-Hospital Service declared, “No greater proof as to the efficacy of vaccination exists than in the Philippine Islands.” For Dr. John E. Snodgrass, assistant to the director of health in Manila, the truth of that proposition could be seen in the scarless faces of the rising generation of Filipinos. “The only argument necessary to explode the theories of the anti-vaccinationists,” he proclaimed before the Panama-Pacific International Exposition in 1915, “is to compare the visages of the children of today with those of their parents.”
111
FIVE
THE STABLE AND THE LABORATORY
Far from the battlefields of the nation's first overseas colonial wars, American health officials on the U.S. mainland encountered rising resistance after 1900 to their own widening war on smallpox. The contentious politics of smallpox control centered on the growing divide between public health authorities and the public itself regarding the risks of vaccination.
Turn-of-the-century Americans lived in a world filled with risk. Each year one out of every fifty workers was killed on the job or disabled for at least four weeks due to a work accident. Railroad and streetcar accidents annually killed and maimed tens of thousands of people. Children worked in mines, stole rides on the back of moving cars, and played stickball in alleys carpeted with horse manure. Apart from a few things recognized by the courts as “imminently dangerous,” such as arsenic or nitroglycerin, product liability did not exist. The average American breadwinner carried just enough insurance to cover his own burial.
1
During the first two decades of the twentieth century, a spate of new progressive social policies would create an enlarged role for the American government in managing the ordinary risks of modern urban-industrial life. The resulting “socialization” of risk, though narrow by the standards of Britain and Germany, was a dramatic departure for American institutions that prized individual freedom and responsibility. European-style social insurance gained traction in the first American workmen's compensation laws, enacted in forty-two states between 1911 and 1920. Mothers' pension programs (launched in forty states during the same decade) provided aid to families that lost the wages of the “normal” (male) breadwinner due to his sudden death or disability. In tort law, too, the courts had women and children first in mind as they imposed tougher standards of liability upon railroad corporations. U.S. social politics still had a long way to go before a recognizably modern national welfare state insured its citizens against the financial insecurities of old age, or an American court seriously entertained the argument that an exploding Coke bottle entitled the injured party to compensation from the manufacturer. But the foundation was laid, in the social and political ferment of the Progressive Era, for a government that would one day promise its citizens “freedom from fear.”
2
Arriving just as the American people and their policy makers began to seriously debate these issues, the turn-of-the-century smallpox epidemics raised broad public concerns about the quality and safety of the nation's commercial vaccine supply. The ensuing controversy caused ordinary Americans, private physicians, and public officials to revise old expectations about risk and responsibility and the role of government in managing both.
3
By the fall of 1901, the wave of American epidemics had carried smallpox to every state and territory in the union. The new mild type smallpox was the culprit in the majority of places, but deadly variola major struck several major American cities, particularly in the Northeast. Compulsory vaccination was the order of the day, enforced at the nation's borders, in cities and towns, at workplaces, and, above all, in the public schools. The public policy was a boon to the vaccine industry, driving up demand for smallpox vaccine. American vaccine makers of the day ranged in size from rising national pharmaceutical firms such as Detroit's Parke, Davis & Company and Philadelphia's H. K. Mulford Company (a U.S. forerunner of today's Merck) to the dozens of small “vaccine farms” that sprouted up around the country. To meet the unprecedented demand for vaccine-coated ivory points or capillary tubes of liquid lymph, the makers flooded the market with products, some inert, some “too fresh,” and some seriously tainted. Complaints of vaccine-induced sore arms and feverish bodies filled the newspapers and medical journals. Every family seemed to have its own horror story.
Popular distrust of vaccine surged in the final months of the year, as newspapers across the country reported that batches of tetanus-contaminated diphtheria antitoxin and smallpox vaccine had caused the deaths of thirteen children in St. Louis, four in Cleveland, nine in Camden, and isolated fatalities in Philadelphia, Atlantic City, Bristol (Pennsylvania), and other communities. In all but St. Louis, where antitoxin was the culprit, the reports implicated vaccine. Even The New York Times, a relentless champion of compulsory vaccination, expressed horror at the news from Camden, the epicenter of the national vaccine scare. “Vaccination has been far more fatal here than smallpox,” the paper told its readers. “Parents are naturally averse to endangering their children to obey the law, claiming that the chances of smallpox seem to be less than those of tetanus.”
4
Pain, sickness, and the occasional death after vaccination were nothing new. But the clustering, close sequence, and staggering toll of these events were unprecedented in America. Newspaper stories of children dying in terrible agony, their jaws locked and bodies convulsing as helpless parents and physicians bore witness, turned domestic tragedies into galvanizing public events. Allegations of catastrophic vaccine failure triggered extraordinary levels of conflict between angry citizens and defensive officials. In one typical incident, which occurred as the ninth Camden child entered her death throes, the health officials of Plymouth, Pennsylvania, discovered that many parents, ordered to get their children vaccinated for school, were secretly wiping the vaccine from their sons' and daughters' arms.
5
Jolted from their professional complacency, physicians and public health officials were forced to reconsider the existing distribution of coercion and risk in American public health law. In one sense, compulsory vaccination orders, whether they applied only to schoolchildren or to the public at large, already socialized risk. The orders imposed a legal duty upon individuals (and also parents) to assume the risks of vaccination in order to protect the entire community from the presumably much greater danger of smallpox. Spreading the risk of vaccination across the community made its social benefit (immunity of the herd) seem a great bargain. As any good progressive knew, the inescapable interdependence of modern social life required just such sacrifices for the public welfare and the health of the state. Still, the state did almost nothing to ensure vaccine quality. The bacteriological revolution spawned a proliferating array of “biologics” (vaccines, antitoxins, and sera of endless variety) that were manufactured in unregulated establishments and distributed, by the companies' druggist representatives and traveling detail men, in unregulated markets. The risks of these products lay where they fell: on the person left unprotected by an inert vaccine or poisoned by a tainted one.
6
The situation illustrates the larger dualism of American law at the turn of the century. Ordinary Americans, particularly working-class people, were caught between the increasingly strong state presence in their everyday social lives and the relatively weak state regulation of the economy. And the government insulated itself from liability. In a leading decision, handed down just three years before the Camden crisis, the Georgia Supreme Court took up the question of whether a municipal government could be sued for injuries caused by bad vaccine used by its public vaccinators. The answer was an unblinking No. Citing “a principle as old as English law, that ‘the King can do no wrong,’” the court refused to allow a resident of Rome, who had submitted to vaccination “under protest,” to sue the government for using “vaccine matter which was bad, poisonous and injurious, and from which blood poisoning resulted.” To allow such a case to proceed, the court warned, “would be to paralyze the arm of the municipal government, and either render it incapable of acting for the public weal, or would render such action so dangerous that the possible evil consequences to it, resulting from the multiplicity of suits, might be as great as the smallpox itself.” The arm of the state was protected; the arm of the citizen was not.
7
Supporters of compulsory vaccination defended the policy in a quasi-scientific rhetoric of risk assessment. From the expert point of view, lay concerns about vaccine safety were steeped in ignorance and fear, which should have evaporated in the face of hard statistical evidence. Officials assured the public that vaccines were safer than ever: “the preparation of glycerinized vaccine lymph has now been brought to such perfection that there should be no fear of untoward results in its use,” Surgeon General Walter Wyman said three years before Camden. Even if untoward results did arise, the social benefits of vaccination outweighed the costs. As the Cleveland Medical Journal put it, “Better [by] far two score and ten sore arms than a city devastated by a plague that it is within our power to avert.”
8
The vaccine crisis of 1901–2 revealed that cost-benefit analysis was not the only way Americans thought about risk. When the Times observed that Camden parents reasonably concluded that vaccination had become more dangerous than smallpox, turning the public health argument on its head, the paper made a rare concession to vaccination critics. As the Times said, the incidents were “furnishing the anti-vaccinationists with the only good argument they have ever had.” But most worried parents would not have called themselves “anti-vaccinationists.” And much more was involved in the rising popular resistance to vaccination in 1901 than a cool-headed consideration of quantifiable facts.
9