Widespread famine followed the end of the war, claiming at least 50,000 lives. Wildlife conservation groups throughout the world protested as starving Ugandans slaughtered and consumed elephants, hippos, elands, giraffes, monkeys, and other animals by the thousands.
Between 1975 and 1980, Uganda, its entire health infrastructure devastated, experienced epidemics of malaria, leprosy, tuberculosis, cholera, visceral leishmaniasis (kala-azar), and virtually every vector-borne ailment known to the continent.[33]
A French team found evidence of more exotic diseases as well when they took blood surveys of villagers in western Uganda. Ebola, Marburg, Lassa, West Nile fever, Crimean-Congo hemorrhagic fever, and Chikungunya were among the viruses found in the blood of the region's populace.[34]
Between 1971 and 1977, Uganda had its worst measles epidemic in over forty years, with high death rates among children seen all over the country. So great was the country's chaos that no agency kept count of the death toll. Gonorrhea soared during the Amin years, particularly among soldiers. Because the country was bereft of antibiotics, most cases went untreated. Routine vaccination for such diseases as whooping cough and tetanus came to a halt, and the incidence of these diseases rose dramatically.
Starving, sick refugees poured by the tens of thousands across borders to Zaire and Sudan, taking their diseases with them.
Makerere University, which had been the primary medical training center for East Africa's doctors, was looted right down to its electrical sockets and bathroom tiles. By the end of the 1970s, the nation of Uganda would be completely out of toilet paper, antibiotics, aspirin, sterilizers, cotton wool, bed linens, soap, clean water, light bulbs, suturing equipment, and surgical gowns.[35]
Rumors of strange disease outbreaks were rampant, but there was nobody left to investigate these claims.
Such tragic events, with the resultant epidemics and health crises, were mirrored all over the world. From Pol Pot's reign of terror in Cambodia to the Cold War-manipulated battlefields of Central America, the world's poorest countries spent extraordinary amounts of money on domestic military operations and warfare. And the microbes exploited the war-ravaged ecologies, surging into periodic epidemics.
The World Health Organization, with a staff of only 1,300 people and a budget smaller than that spent on street cleaning every year by the city of New York, tried to combat such seemingly intractable public health problems with donated vaccines, technical assistance, and policy statements.[36]
On September 12, 1978, WHO convened a meeting of ministers of health from over 130 nations in Alma-Ata,[37] in the U.S.S.R. The conference issued what would be hailed years later as a pivotal document in the international public health movement: the Declaration of Alma-Ata. Inspired in part by U.S. Surgeon General Julius Richmond's Health Goals 1990, which in 1975 systematically outlined the status of Americans' health and set goals for improvement, the Alma-Ata Declaration called for “the attainment by all peoples of the world by the year 2000 of a level of health that will permit them to lead a socially and economically productive life.”
The ten-point Alma-Ata Declaration defined health as “a state of complete physical, mental, and social well-being, not merely the absence of disease or infirmity,” and declared it “a fundamental human right.” It decried health care inequities, linked human health to economic development, and called upon the governments of the world to develop financially and geographically accessible primary health care facilities for all their people.
Declaring health a human right forced issues of disease control onto the newly powerful agenda of global civil liberties. In 1976 the UN General Assembly voted to enter into force the International Covenant on Civil and Political Rights.[38] It was the strongest vilification of tyranny, discrimination, violations of basic freedoms, and injustice ever passed by the UN. Also that year the UN passed the International Covenant on Economic, Social, and Cultural Rights,[39] which specifically recognized “the right of everyone to the enjoyment of the highest attainable standard of physical and mental health.”
John Evans of the World Bank elucidated three key demarcations in health problems that he felt were tied to each nation's economic development and status: an infectious disease stage, a mixed stage, and a chronic disease stage. In the poorest, least developed nations of the world, the majority of the population suffered illness and death due to communicable and vector-borne diseases. With improvements in economic development, Evans said, came a painful period of mixing, in which the poorer members of society succumbed to infectious diseases while the wealthier urban residents lived longer, disease-free lives that were eventually cut short by chronic ailments such as cancer and heart disease.
In the most developed nations, Evans argued, infectious diseases ceased being life-threatening, some disappeared entirely, and the population generally lived into its seventh decade, succumbing to cancer or heart disease. The bottom line, from Evans's perspective, was that infectious diseases would no longer pose a significant threat to postindustrial societies.
“We must never cease being vigilant,” Richmond said, “but it is altogether proper to shift resources towards prevention of chronic diseases. With political will, tremendous strides can be made.”
Though the World Bank perspective informed most long-term planning, there were voices within the academic public health community that loudly questioned the three-stage assumptions. While not disputing that curative medicine had made genuine strides, particularly since the 1940s, and agreeing that control of disease was linked to societal wealth, they rejected the idea of a direct correlation between stages of national development and individual disease. In their view, the ecology of disease was far more complex, and waves of microbial pestilence could easily occur in countries with enormous gross national products. Conversely, well-managed poor countries could control pestilence in their populations.
The debate centered on a two-part question: when and why did most infectious diseases disappear from Western Europe, and what relevance did that set of events have for improving health in the poorest nations in the last quarter of the twentieth century?
University of Chicago historian William H. McNeill spent the early 1970s studying the impact epidemics had on human history since the beginning of recorded time, and then reversed his query to ask which human activities had prompted the emergence of the microbes. In 1976, his book Plagues and Peoples[40] created a sensation in academic circles because it argued with the force of centuries of historical evidence that human beings had always had a dramatic reciprocal relationship with microbes. In a sense, McNeill challenged fellow humans to view themselves as smart animals swimming in a microbial sea—an ecology they could not see, but one that most assuredly influenced the course of human events.
Like Evans, McNeill saw stages over time in human relations with the microbes, but he linked them not so much to economic development as to the nature at any given moment of the ecology of a society. He argued that waterborne parasitic diseases dominated the human ecology when people invented irrigation farming. Global trade routes facilitated the spread of bacterial diseases, such as plague. The creation of cities led to an enormous increase in human-to-human contact, allowing for the spread of sexually transmitted diseases and respiratory viruses.
Over the long course of history, McNeill said, pathogenic microbes sought stability in their relationships with hosts. It was not to their advantage to wipe out millions of nonimmune human beings in a single decade, as happened to Amerindians following the arrival of Columbus and Cortez. With the Europeans came microbes to which the residents of the Americas had no natural immunity, and McNeill estimated, “Overall, the disaster to Amerindian populations assumed a scale that is hard for us to imagine. Ratios of 20:1 or even 25:1 between pre-Columbian populations and the bottoming-out point in Amerindian population curves seem more or less correct.”[41]
This was not an ideal state for the microbes, he argued, because such massive death left few hosts to parasitize. After centuries of doing battle with one another, humans and most parasites had settled into a coexistence that, if not comfortable for humanity, he argued, was rarely a cause of mass destruction. Still, he sternly warned, “no enduring and stable pattern has emerged that will insure the world against locally if not globally destructive macroparasitic excesses.”
Other historians of disease had tried to link the emergence of epidemics to the social and ecological conditions of human beings,[42] but none had presented as lucid an argument as McNeill's, and his work prompted widespread reappraisal of both historic events and contemporary public health policy.
Nobel laureate Sir Macfarlane Burnet was moved from his perspective as an immunologist to issue similar warnings about humanity's overconfidence. True, he said, vaccines and antibiotics had rendered most infectious diseases of the Northern Hemisphere controllable. But, he cautioned, “it is almost an axiom that action for short-term human benefit will sooner or later bring long-term ecological or social problems which demand unacceptable effort and expense for their solution. Nature has always seemed to be working for a climax state, a provisionally stable ecosystem, reached by natural forces, and when we attempt to remold any such ecosystem, we must remember that Nature is working against us.”[43]
The policy implications were clear, Burnet said. Start by looking at the ecological setting of disease transmission. If the ecology could be manipulated without creating some untoward secondary environmental impact, the microbe could be controlled, even eradicated.
René Dubos, who served in the 1970s as a sort of elderly patron saint of disease ecology because of his vast contributions to research on antibiotics and tuberculosis during the pre-World War II period, also favored an ecological perspective of disease emergence, but laid most of the blame for epidemics on Homo sapiens rather than on the microbes. In Dubos's view, most contagious disease grew out of conditions of social despair inflicted by one class of human beings upon another. Dubos believed tuberculosis, in particular, arose from the social conditions of the poor during Europe's Industrial Revolution: urban crowding, undernutrition, long work hours, child labor, and lack of fresh air and sunshine.
“Tuberculosis was, in effect, the social disease of the nineteenth century, perhaps the first penalty that capitalistic society had to pay for the ruthless exploitation of labor,” Dubos argued.[44]
For Dubos, unbridled modernization could be the enemy of the poor, bringing development and freedom from disease to the elites of societies, but consigning their impoverished citizens—particularly those living in urban squalor—to lives of microbial torture.
“The greatest strides in health improvement have been achieved in the field of disease that responded to social and economic reforms after industrialization,” he wrote.[45]
He strongly felt that infectious diseases remained a major threat to humanity, even in the wealthy nations, and warned physicians not to be fooled into complacency by what he termed “the mirage of health.”
At the University of Birmingham in England, Thomas McKeown led a team of researchers who reached the conclusion that rapid urbanization, coupled with malnutrition, was the key factor responsible for the great epidemics of England and Wales from medieval times to the beginning of the twentieth century. Conversely, McKeown credited improvements in access to nutritious food for England's lower classes with at least half the reduction in premature mortality in the country between 1901 and 1971, and insisted that the bulk of all improvements in survival preceded the advent of modern curative medicine.[46] McKeown based his assertions on a meticulous scanning of English and Welsh government medical records maintained over the period, which indicated that premature mortality rates decreased radically before the age of antibiotics.
 
Joe McCormick had heard it all, argued one position or another over beers with CDC colleagues, and recognized grains of truth scattered through each position, from the World Bank to the angry socialist dependency theorists. But all the hand-wringing and theorizing wasn't going to provide the resources needed to get rid of Lassa.
For nearly three years he had been tramping around West African villages testing residents and rats for Lassa virus infection. By 1979 McCormick had reached the conclusion that Lassa was an entrenched endemic disease, causing thousands of cases of illness of varying degrees of severity each year. The only way to rid Sierra Leone of human Lassa cases would be to eliminate contact between the rats and humans—an option he considered doable if millions of dollars were spent improving the country's rural housing and hospitals.
The alternative was mass education about rat avoidance and ribavirin therapy for those who suffered Lassa fever. That prospect was also orders of magnitude too expensive for the impoverished state.
