The Coming Plague
Laurie Garrett
A previously unrecognized disease did, however, sweep over Europe.[12]
Leprosy seemed to follow the rise of European cities during the medieval period, reaching a peak sometime around 1200. Nobody was certain, then or now, exactly how the fussy, slow organism was passed from one person to another. It obviously required close contact, but may have originally been more easily transmitted among the then totally nonimmune Homo sapiens.
Once in a person's body, however, M. leprae attacked the nerves and skin cells of cooler, peripheral parts of the body, causing them to go numb, weaken, and often be destroyed as a result of unfelt injuries. The disfigurement that resulted from loss of fingers, toes, ears, noses, and other external body parts marked “lepers” as targets for stigmatization and fear.
By 1980 most of the world's five billion humans had antibodies to M. leprae, proving they'd been exposed without apparent harm.
But in medieval Europe leprosy took a high toll and seemed to spread rapidly through the congested cities. Some biologists in the 1980s theorized that factors unique to medieval urban life helped promote the mycobacterium's spread, including lifelong avoidance of bathing, the wearing of wool rather than cotton clothing, and the practice of sharing bedding to stay warm.
Whatever the case, European leprosy died out with the Black Death of 1346. Nobody was certain why this was so, but it was generally suspected that the Black Death decreased the human density of urban areas, thus reducing human-to-human contact. It may also be that plague survivors' immune systems were less susceptible to a broad range of bacteria, including both Yersinia and M. leprae. Or, conversely, those who were vulnerable to leprosy may also have been less able to respond to a range of other bacterial assaults.
In leprosy's place, exploiting the post-plague urban chaos, came tuberculosis. Unlike the leprosy bacterium, M. tuberculosis was truly ancient, and clear evidence of its affliction of Homo sapiens dated back to at least 5000 B.C.[13]
The disease was described by all ancient literate cultures, except those of the Americas, and archaeological evidence of bone damage predated the written descriptions of “consumption,” “phthisis,” or “tuberculosis,” as it was variously labeled. But the true impact of the disease wasn't felt until after the Black Death when, according to theories popular in the 1980s, the tuberculosis organism exploited human ecological niches vacated by M. leprae. An urban environment was not required for its transmission, but it was clearly advantageous.
The rise of European tuberculosis was not sudden. Like leprosy, the organism was fastidious and slow-growing, producing overt and highly contagious illness only after months, or years, of infection. While the fast-growing plague bacteria could kill a human in a matter of hours, few Homo sapiens were felled by M. tuberculosis without prior years of debilitating illness.
On the other hand, the bacteria could be spread by airborne transmission, assuring that humans sharing close quarters with an afflicted individual would be exposed. By the 1980s scientists knew that infection did not guarantee illness or death: about one out of ten infected individuals eventually developed the disease, and without twentieth-century treatments about half of those would die.[14]
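Taken together, the two probabilities just cited imply a rough untreated infection-fatality risk; this back-of-the-envelope arithmetic is a reading of the numbers above, not a calculation from the text:

\Pr(\text{death} \mid \text{infection}) \approx \Pr(\text{disease} \mid \text{infection}) \times \Pr(\text{death} \mid \text{disease}) \approx 0.10 \times 0.5 = 0.05,

that is, roughly one death for every twenty infections.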
But conditions in European cities of the fifteenth to seventeenth centuries were ideal for transmission of M. tuberculosis, especially during the winter, when the practice was to shut all windows and huddle around a heat source. The microscopic droplets exhaled by a tuberculosis victim would drift continuously about the home.
The household might take steps to avoid exposure to visible droplets of coughed or sneezed tubercular material, but these were actually harmless. To take hold in the human body the bacteria had to be carried inside droplets small enough to pass through the barriers of the upper respiratory tract. Such tiny droplets could remain suspended in the air, drifting on currents, for days, containing live, infectious tuberculosis germs.[15]
There was only one thing seventeenth-century Europeans could have done to decrease their exposure to household tuberculosis: open the windows. One good flushing of the air could have purged 63 percent of the suspended infectious particles exhaled each day by an ailing resident, and the sun's ultraviolet light would kill those organisms it reached.[16]
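The 63 percent figure is what the standard well-mixed ventilation model predicts for one complete air change. A sketch of that model follows; the symbols V (room volume), Q (ventilation rate), and C (suspended-particle concentration) are introduced here for illustration and do not appear in the text. Once the source stops emitting, a well-mixed room flushes particles exponentially:

\frac{dC}{dt} = -\frac{Q}{V}\,C \quad\Longrightarrow\quad C(t) = C_0\, e^{-(Q/V)t}.

One complete air change corresponds to Qt/V = 1, which leaves a fraction e^{-1} \approx 0.37 of the particles suspended and purges the remaining 1 - e^{-1} \approx 0.63, about 63 percent.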
Medieval Europeans had no such option during the winter, however, particularly in northern latitudes. For poorer city dwellers especially, opening the windows was inconceivable, as fuel of any kind was scarce and extremely expensive. Europe's wood had been used to build her cities.
The rates of tuberculosis slowly but steadily increased. The hardest-hit cities were also the largest, London particularly. By the time London was devastated by the Great Plague and subsequent Great Fire, one out of every five of its citizens had active tuberculosis. And this time the plague had no purging effect on a mycobacterial epidemic: the rates of TB continued to climb long after the 1665 plague passed.
As European explorers and colonialists made their way to the Americas, they carried the deadly mycobacterium with them, adding to the burden of tuberculosis, which had already plagued the Amerindian population for centuries.[17]
By the time the United States was torn asunder by the Civil War, tuberculosis was firmly entrenched in its northern cities, particularly Boston and New York City.
Between 1830 and the eve of the Civil War, Americans' life expectancy fell and death rates rose to the levels that existed in London. In 1830, when Boston had 52,000 citizens, its crude annual death rate was 21 per 1,000, half that of London at the time. By 1850, Boston's crude death rate nearly equaled London's, hitting 38 per 1,000. Tuberculosis wasn't the only responsible factor, but it was a major contributor. Cases of consumption, as the disease was called, increased every year in Massachusetts, rising 40 percent between 1834 and 1853.[18]
The old families of New York City, Philadelphia, and Boston groaned in disbelief as their cities' populations swelled, filth abounded, and disease ran rampant. Immigration, the Industrial Revolution, crowded slums, no public water supply, moral decay, no sewage systems—these were but a few of the factors that the civic leaders blamed for their crises.
The Western world's urban crises peaked between 1830 and 1896, when Europe and North America suffered four devastating pandemics of cholera that spread primarily via the cities' fetid water and sewage “systems.” Though physicians of the day didn't understand why, quarantines didn't work for cholera, so the rich generally fled the cities at the first hint of the dreaded dysenteric disease, leaving the common folk to fend for themselves. It would be decades before scientists could prove that cholera was caused by a bacterium that entered human bodies via contaminated food and water, and got into the water through the fecal waste of infected people.
The death toll from the nineteenth century's waves of cholera was astonishing: 10 percent of the population of St. Louis in three months during the 1849 epidemic;[19] some 3,500 New York City residents in 1832; 8,605 Hamburg, Germany, residents in three summer months in 1892; 15,000 residents and hajj pilgrims in the city of Mecca, and some 53,000 Londoners, in 1847. The Mecca tragedy was repeated during the hajj of 1865, when 30,000 pilgrims to the city perished.
Though they had no idea what caused cholera, New York City authorities were appalled by the 1832 epidemic and blamed it on municipal filth. Reform followed. The Croton Aqueduct brought in clean drinking water for the first time, muddy streets were cobblestoned, and the squalid slums were slowly upgraded. As a result, subsequent waves of cholera took a minor toll.
Such was not the case in most other cities, however, where the connection between urban filth and disease remained a matter of vociferous debate among civic leaders. The fact that, without exception, cholera and other epidemic diseases, including tuberculosis, took their greatest toll among the most impoverished residents of the world's metropolises seemed only to reinforce the belief, held by those in power from Moscow to Madrid, that lower-class “immorality” was the root of disease.
During London's devastating 1854 cholera epidemic, physician John Snow demonstrated that cholera was transmitted via water by removing the handle of the Broad Street pump, the main water source for an impoverished and cholera-ridden neighborhood. The local outbreak came to a halt.
Authorities were unconvinced, however, so during the same epidemic Snow mapped cholera cases and traced their water supplies. He showed that those neighborhoods with little cholera were receiving water drawn from the upper Thames, while cholera-plagued areas drew their water from the lower Thames, which included human waste from upstream.
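In modern terms, Snow's second study compared attack rates across exposure groups. The sketch below, in Python, illustrates that logic; the helper function is hypothetical, and the house and death counts approximate the table Snow published in 1855, included for illustration since they do not appear in the text above.

# Sketch of John Snow's 1854 comparison: cholera death rates per
# 10,000 houses, grouped by water supplier. Counts approximate
# Snow's published 1855 table; they are illustrative only.

def deaths_per_10k_houses(deaths: int, houses: int) -> float:
    """Cholera deaths per 10,000 houses served by one supplier."""
    return 10_000 * deaths / houses

suppliers = {
    # supplier name: (houses served, cholera deaths)
    "Southwark & Vauxhall (lower Thames intake)": (40_046, 1_263),
    "Lambeth (upper Thames intake)": (26_107, 98),
    "Rest of London": (256_423, 1_422),
}

for name, (houses, deaths) in suppliers.items():
    rate = deaths_per_10k_houses(deaths, houses)
    print(f"{name}: {rate:.0f} deaths per 10,000 houses")

# Roughly 315 vs. 37 per 10,000 houses: an eightfold gap between the
# two companies, implicating the water supply itself.

Because the two companies served intermingled customers on the same streets, the comparison approximated a natural experiment, controlling for neighborhood and poverty.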
Snow failed to convince authorities directly of the need to clean up water supplies, but the epidemics of cholera and other devastating diseases spurred improvements in basic urban hygiene all over the industrializing world. Citizens' sanitary action groups cropped up in many cities, garbage and waste-disposal practices improved dramatically, outhouses were replaced by in-house toilet systems, and “cleanliness” became “next to godliness.”
Many urban diseases, including tuberculosis, declined in the cities of the Northern Hemisphere at about the same time as these social reform campaigns emerged. In addition to these changes in physical ecology, urban residents' lives were improved through such political and Christian reform efforts as elimination of child labor, establishment of public school systems, shortening of adult workweeks, creation of public health and hospital systems, and a great deal of boosterism about “sanitation.”
At the peak of the Industrial Revolution, before such reforms were widely instituted, life in the cities had become so unhealthy that the birth rates were lower than the death rates. For a city like London this meant that the child and adult workforce of nearly one million people could be maintained only by recruiting fresh workers from the countryside. But by 1900 the birth rates had soared, the death rates had plummeted, and life expectancies had improved markedly. Nearly all contagious diseases, including tuberculosis, steadily declined, reaching remarkably low levels well before curative therapies or vaccines were developed. In England and Wales, for example, the tuberculosis death rate dropped from a high of 3,000 per million people in 1848–54 to 1,268 per million in 1901, and to 684 per million in 1941, just before antibiotic treatment became available.[20]
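A quick calculation with the England and Wales rates just quoted shows how much of the decline preceded antibiotics. The sketch below, in Python, treats 1851 as the midpoint of the 1848–54 peak period; that midpoint, like the variable names, is an assumption made here for the arithmetic.

# Share of the England and Wales TB decline that preceded antibiotics.
# Rates are deaths per million, as quoted above; using 1851 as the
# midpoint of the 1848-54 peak period is an assumption of this sketch.
peak_rate = 3000   # deaths per million, 1848-54 peak
rate_1901 = 1268   # deaths per million, 1901
rate_1941 = 684    # deaths per million, 1941, just before antibiotics

decline_by_1901 = 1 - rate_1901 / peak_rate
decline_by_1941 = 1 - rate_1941 / peak_rate
print(f"Decline by 1901: {decline_by_1901:.0%}")   # ~58%
print(f"Decline by 1941: {decline_by_1941:.0%}")   # ~77%

# Implied average annual rate of decline over the 90 years 1851-1941:
annual_decline = 1 - (rate_1941 / peak_rate) ** (1 / 90)
print(f"Average annual decline: {annual_decline:.1%}")  # ~1.6%/year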
A similar pattern could be seen for infectious diseases, particularly tuberculosis, in the United States. In 1900, TB killed about 200 of every 100,000 Americans, most of them residents of the nation's largest cities. By 1940, prior to the introduction of antibiotic therapy, tuberculosis had fallen from being the number two cause of death to number seven, claiming barely 60 lives in every 100,000.[21]
By 1970, tuberculosis was no longer viewed as the scourge of the industrialized world's cities.[22] The World Health Organization then estimated that about 3 million people were dying annually of the disease, that some 10 to 12 million people had active TB, and that with antibiotic therapy the mortality rate had dropped to about 3.3 deaths in every 100,000 TB cases. Most new infections were then occurring not in the industrial cities of the Northern Hemisphere but in the villages and cities of the developing world. The microbe's geographic ecology had changed, but the disease remained concentrated in urban areas.[23]
The enormous decline of tuberculosis in the Northern Hemisphere was viewed as a great victory, even though at the time TB raged across Africa, Asia, and South America.
Why this apparent victory over a microbe had occurred, and what specific factors could be credited with trouncing tuberculosis, was a matter of furious debate from the 1960s through the 1990s. Resolution of the debate could have been useful in two ways: in helping public health authorities anticipate problems in their cities that might promote the emergence or reemergence of infectious diseases in the future, and in guiding urban development in the Third World by identifying which expenditures, drawn from ever-shrinking national reserves, might have the biggest impact on the public's health.
But the waters were muddy. British researcher Thomas McKeown argued that nutrition was the key: improved diets meant working people could better withstand disease.[24]
René Dubos was equally certain that it was the elimination of the horrendous working conditions of the men, women, and children of the Industrial Revolution, coupled with improved housing, that merited credit for the decline in TB.[25]
Medical historian and physician Barbara Bates, of the University of Kansas, skillfully asserted that the bold TB control programs of the early twentieth century, sparked by German scientist Robert Koch's 1882 discovery of the M. tuberculosis bacterium and enforced through mandatory quarantines in medical sanitariums, had little or no impact on the decline of the disease.[26]
