Though some pediatricians and policy makers found the 1989 numbers
worrisome, nobody forecast an epidemic. Measles epidemics were considered Third World problems by 1989.
But an epidemic did occur. The incidence of measles in the United States leapt by 50 percent between 1989 and 1990. More than 27,000 U.S. children, half of them under four years of age, contracted measles during 1990; 100 died of the disease.
Hardest hit was New York City, with 2,479 reported measles cases.
CDC investigators were baffled by the severity of illnesses in the 1990–91 epidemic.
“These kids are much sicker, and death rates are definitely higher,” the CDC's Bill Atkinson said. “We don't know whether it's because the strain of measles out there is more virulent, or the kids are more susceptible.”
Many of the ailing children, particularly in New York City, had never been vaccinated. They hadn't even received their primary shots, much less boosters.
“Now the majority of cases are in unvaccinated children,” Dr. Georges Peter, chair of the American Academy of Pediatrics, said. “Measles is the most contagious of all the vaccine-preventable diseases. The nature of the problem has clearly changed: it is undoubtedly a failure to vaccinate. And what this really is, is an indication of a collapse in the public health system, of lack of access to health care.”
What was going on? Were parents deliberately keeping their children away from doctors? Were Americans suddenly phobic about immunizations?
The answers, it turned out, could be found in the demographics of the population of children with measles. The vast majority lived in large cities (New York, Chicago, Houston, Los Angeles) and were nine times more likely to be African-American or Hispanic.
As the epidemic persisted in 1991, worsening in New York City's African-American and Hispanic populations, it was evident that the microbe had successfully emerged in populations of poor urban people with little or no access to health care. This underlying social weakness also facilitated surges in whooping cough and rubella cases during 1990–93.[126]
In 1978 the U.S. Surgeon General had declared that measles would be eradicated from the country by 1982, and an ambitious immunization campaign was mounted. By 1988, however, conditions of poverty, health care collapse, and public health disarray had grown so acute that the United States had a poorer track record on all childhood vaccination efforts than did war-torn El Salvador and many other Third World countries.[127]
In some inner-city areas, notably in New York City, only half of all school-age children had been vaccinated. For much of the urban poor in America the only point of access to the health care system was the public hospital emergency room. Families spent anxious, tedious hours queued up in urban ERs because they felt that they had no choice: there were no clinics or private physicians practicing in the ghettos, few alternative sources of basic care. But few poor families were willing to put up with a daylong line in the ER simply to get their children immunized, particularly if it meant loss of a day's pay.[128]
Further study of the measles crisis revealed that some deaths and many cases (indeed, most at the key hospitals) went unreported. The city of New York uncovered up to 50 percent underreporting in the region's largest inner-city hospitals during the 1991 epidemic. It was possible that up to 5,000 cases of the disease occurred in New York City, though only half that number were officially reported.[129]
In 1993, World Health Organization adviser Dr. Barry Bloom, of the Albert Einstein College of Medicine in the Bronx, announced that the United States had fallen behind Albania, Mexico, and China in childhood vaccination rates.[130]
At the World Summit on Children, convened by the United Nations in September 1990, the Bush administration was in the dubious position of pledging sweeping concern for the health and survival of the world's children while hoping no one would publicly note that the health status of America's impoverished kids rivaled that of children in much of Africa and South Asia.
“This society is so wealthy, obviously this country is better off than the Third World. But this country should be ashamed of the child mortality rates and health,” declared Jim Weill of the Children's Defense Fund at the Summit. “The U.S. ranks 19th in the world on infant mortality, 29th in low birthweight babies, 22nd on child mortality for children under five, and, perhaps most amazing, 49th in the world on child immunization, for our non-white children. We kill our children.
“Let's face it, when it comes to America's children we live in the Third World.”
Not only had America's cities sunk to Third World levels of childhood vaccination and access to health care, but the nation's surveillance and public health systems had reached states of inaccuracy and chaos that rivaled those in some of the world's poorest countries.[131]
Weill's words had barely been uttered when officials at the CDC acknowledged that America's public health system was also doing a worse job of handling tuberculosis than did many African nations.
Multiply drug-resistant TB had arrived. Microbes had emerged that were so broadly resistant to antibiotics that, in practical terms, they were invulnerable.
Tuberculosis didn't reemerge overnight in the United States. On the contrary, the new mutant microbes made numerous tentative incursions into the Homo sapiens population over a period of years. It wasn't a surprise attack.
It almost seemed as if human beings were deliberately ignoring the plentiful warning signs.
Though tuberculosis had never disappeared, its incidence had declined steadily in the United States since the 1880s, and hit record lows following the introduction of antibiotic treatment. The robust Mycobacterium tuberculosis was impossible to eradicate, as half the world's population at any given time was infected with the bacteria. For most people M. tuberculosis infection was a benign event: the microbe was kept in check by the immune system and the individual never, throughout his or her life, fell ill.
On average, infected people had a 10 percent chance of developing active disease sometime during their lives, and a 1 percent chance of coming down with a lethal TB illness. Thus, statistics would indicate that about 2 billion human beings were infected with the microbe in 1988; 200 million would suffer tuberculosis during their lives, and 20 million would die of the disease.
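As a rough sanity check (a sketch, using the 2 billion infection estimate quoted above), the implied lifetime burden is simply the infected population multiplied by each risk:

\[
2\times10^{9}\times 0.10 = 2\times10^{8}\ \text{(about 200 million lifetime cases)},\qquad
2\times10^{9}\times 0.01 = 2\times10^{7}\ \text{(about 20 million lifetime deaths)}.
\]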
But those neatly averaged numbers belied the true nature of the risks of TB and the disease's extremely unequal distribution worldwide.
From the earliest days of Western tuberculosis research, scientists and physicians had recognized that the microbe moved hand in hand with poverty. Though there were famous cases of TB among more affluent individuals, most of the world's tuberculosis victims had always been the poorest citizens.
The nature of the association between TB and poverty was hotly debated throughout the nineteenth and twentieth centuries,[132] but the salient points were clear. The M. tuberculosis bacterium was, like its close cousin M. leprae, which caused Hansen's disease, an extremely slow-growing microbe that under most circumstances spent its life either under attack from the human immune system or lying low, causing no disease. Its best hopes of vigorously reproducing, developing a large microbial colony within a human being, and causing disease lay with either a diminished host immune capacity or continuous reinfection of the human being.
Diminished immune systems were plentiful wherever Homo sapiens lived in squalor and poverty. Malnutrition played an important role, though chronic infections with other microbes, such as tropical parasites, influenza, and amoebas, were also factors. Any ailment that taxed the immune system could create opportunities for M. tuberculosis.
M. tuberculosis exploited vulnerabilities. It was an opportunist. For decades it might silently lurk inside a Homo sapiens, awaiting a moment when defenses were down, and then, when the victim's immune system was preoccupied with malaria or cancer, famine or pneumonia, it would strike.
It was also possible for people living in densely crowded situations to be continuously reexposed to the M. tuberculosis exhaled by others, which greatly increased their risk of developing an active case of the disease. That was why TB had historically been so strongly linked with urbanization and, in particular, slum housing and institutionalization.
Certainly it could have been predicted that the arrival of a new disease that produced severe immune deficiency and struck particularly hard in communities of poverty would spawn a reemergence of tuberculosis. If such communities had already been witnessing a slow, steady rise in TB cases well before the new wave of immunodeficiency arrived, a resurgence of tuberculosis seemed a virtual certainty unless mitigating public health actions were taken.
In 1947, when antibiotic therapy for TB was still considered a novel treatment and disease prevention technique, 134,946 cases of tuberculosis were reported in the United States. By 1985 the use of streptomycin, rifampin, isoniazid, and other antibiotics, coupled with an aggressive public health effort to identify and treat TB cases, had brought the U.S. caseload down to 22,201. Fewer than 30,000 Americans had actually contracted tuberculosis each year since 1977, and the majority were elderly individuals of European descent who had carried the M. tuberculosis microbes in their bodies for decades, only falling ill as their aging immune systems failed to keep the bacteria in check.
Well before the actual numbers of TB cases began to swell, the demographics of the disease shifted. Between 1961 and 1969 more than 80 percent of all active TB cases in the United States were among people over sixty-two years of age, most of them readily treated without hospitalization through basic long-term antibiotic therapy. During that time the U.S. federal government spent $69,287,996 on TB control.[133]
Between 1975 and 1984, however, the numbers of active TB cases reported among elderly Americans and Caucasians of all ages declined sharply. White male cases dropped 41 percent, white female cases 39 percent. In contrast, though TB was declining across the board, its downturn among non-whites was far slower: only 25 percent for males and 26 percent for females. And the age distribution of cases had shifted: by 1984 only 29 percent were over sixty-two years of age. In the non-white population less than one out of every five active TB cases that year involved a person over sixty-two and fully 20 percent were between the ages of twenty-five and thirty-four.[134]
As early as the mid-1970s, Lee Reichman, then head of tuberculosis control for New York City, was seeing a marked increase in active cases among injecting drug users and vagrants living in Harlem, most of them young men. Reichman's attempts to sound alarms about the new trend were muffled by a medical establishment that had already written TB off as a historical artifact.[135]
There were other clear warning signs. Between 1980 and 1986 five different surveys documented a relationship between the rise of homelessness in America and surges of TB in young adult populations. The spread of tuberculosis within emergency homeless shelters was demonstrated, and it was even clear to the CDC by 1984 that new mutant strains of drug-resistant TB were spreading among the urban indigent.[136]
A striking 1980 study of young adult men living in subsidized single-room occupancy housing for the otherwise homeless in New York City found that 98 of 101 came up positive in skin tests for TB infection, and 13 (about 13 percent) had active disease as measured by laboratory analysis of their sputum. The 13 were carrying contagious pulmonary disease, meaning they could exhale the microbes onto others.[137]
By 1986 nearly half of all active TB cases reported in the United States were among non-whites, most of them African-American. There could be no doubt that dramatic changes were underway by the mid-1980s. Tuberculosis had clearly shifted to younger, predominantly African-American and urban populations. Geographically, it had shifted from areas such as Virginia to New York City, Miami, and scattered urban sites. The CDC itself noted the shift in 1986, which coincided with the first upward trend in TB cases reported in the United States since 1953. The agency also believed that “HIV infection may be largely responsible for the increase in tuberculosis in New York City and Florida.”[138]
From the beginning of the AIDS epidemic, researchers in both the United States and Haiti had noted that HIV-positive Haitians had a high rate of tuberculosis. Indeed, published reports stated as early as 1982 that Haitians suffering from AIDS in Port-au-Prince were more likely to die of tuberculosis than of any other opportunistic infection. But American officials took little notice of this observation. Like their counterparts throughout the Western world, U.S. physicians tended to view the TB risk for people with HIV as a Third World problem.