The essential point, my friends emphasize, is that while we have become wonderfully efficient at controlling the symptoms and progression of the diseases they commonly encounter in their clinical practice (heart disease, AIDS, stroke, depression), we have made much less progress in understanding the underlying *causes* of these diseases and, therefore, in being able to prevent or to cure them.
“The great problem with neurology, for example,” Phil says, “is that we don't know how to help the brain heal itself. All the other organs and body parts can heal themselves, and we can, to varying degrees, aid these processes better than we used to. But not the brain. Our knowledge of the brain is way behind our knowledge of, say, the heart and the circulatory system.”
My readings in the history of disease, and of medicine, confirm what my friends tell me. David Weatherall, professor of medicine at Oxford and a specialist in human genetics, puts matters this way: “While genuine progress has been made in our understanding of how to manage heart disease, stroke, rheumatism, the major psychiatric disorders, cancer, and so on, we have only reached the stage at which we can control our patients' symptoms or temporarily patch them up.”
Reiterating what Lewis Thomas wrote about halfway technologies more than a quarter century ago, Weatherall continues: “Our lack of success over the last fifty years, in getting to grips with the basic causes of these diseases, combined with our increasing understanding of the pathological consequences of diseased organs, has bred modern high-technology medicine.
Much of it is very sophisticated and effective at prolonging life, but it neither prevents nor cures many diseases.”
Like my friends, Weatherall is acutely aware of the many gains we *have* made. “Our ability to patch up patients and to prolong their lives seems to be almost limitless,” he states.
Still, like my friends, he also notes a curious anomaly: that the more we can do to enhance and prolong life for those suffering from diseases whose underlying causes we do not understand, the more this seems to lead to both a “dramatic increase in the cost of medical care” and “a dehumanizing effect on its practitioners and the hospitals in which they work.”
This dehumanizing effect on doctors and the institutions in which they work (the devaluation of those practices that enable a doctor to know and understand patients in their uniqueness, and, when treating their patients, to have the wherewithal and time to be thoughtful in their judgments) is not only lamentable for reasons we usually term personal or humanistic but, as my friends contend, profoundly inimical to the practice and efficacy of medicine itself.
“Insurance companies don't blink much when it comes to lab tests or procedures for my patients that cost hundreds of dollars,” Jerry explains. “But they won't pay, or will pay only minimally, for me to sit with a patient for a half-hour or an hour and take a thorough history, or to sit with a patient and talk about the patient's problems, or the patient's *progress*. And if I don't know my patient well, and my patient doesn't trust that I know him and care about him, then I can't be the kind of doctor I want to be and should be.”
Although taking a thorough history and listening attentively to a patient, like caring for people with chronic and disabling diseases, or providing public health measures that help prevent disease, may not seem as heroic or glamorous (as *sexy*) as a bypass, a transplant, an artificial heart, a reengineered gene, a new “miracle” drug, or some other biotechnological innovation, it is prevention and rehabilitation that, these past hundred years, have made and continue to make the greatest difference in terms of both morbidity (how sick we are) and mortality (how long we live).
Before the advent of the Salk vaccine in 1955, the word *polio* would spread in whispers through our world each summer: news would reach us that a child we knew (a cousin or distant relative, the son or daughter of a friend, a boy or girl with whom we went to school or to summer camp) had contracted the disease, and would most likely be crippled for life.
In my memory, the polio scare (as in my parents' phrasing: “There's a polio scare; let's just hope it doesn't lead to an epidemic!”) came regularly each year after school was out in June, and when it did, beaches and swimming pools were closed, I was warned to stay out of crowds, to wash my hands frequently, to be careful about the water I drank, and never to swallow water from a lake or
stream. This happened both in Brooklyn and in upstate New York, where my mother, brother, and I either went to sleepaway summer camps (my mother, in exchange for tuition for me and Robert, worked as camp nurse) or rented rooms in a large old farmhouse where there was a communal kitchen, and where my mother's brother and four sisters, with their children, also rented rooms.
I recall, too, an earlier time (I was perhaps nine or ten years old) when, in my elementary school, P.S. 246, we lined up in the auditorium, the boys in white shirts and red ties, the girls in white blouses and dark skirts, to be given vaccinations. Class by class, we filed down to the front of the auditorium, and one by one we rolled up our sleeves, stepped forward, and received our shots. This occurred a few years after World War II, and we were told that by receiving these injections without complaint or tears we were heroes too: young Americans mobilizing against treacherous enemies (disease, disability, and epidemic) in order to keep our nation healthy, strong, and free.
I remember watching my friend Ronald Granberg (a tall, broad-shouldered, red-headed boy chosen to lead the Color Guard and carry the American flag down the center aisle at the start of assembly each Friday morning) get his shot, take a drink from the water fountain to the right of the stage, and faint straightaway into the spigot, chipping a sizeable triangle from his right front tooth. Twenty years later, our teacher, Mrs. Demetri (who lived around the corner from me, and gave me oil painting lessons in her apartment at night), told me she met Ronald in the street one day when he was a grown man, and that they recognized each other immediately. “Open your mouth, Ronald,” Mrs. Demetri told me she commanded him first thing, “and show me your tooth!”
My friends and I grew up and came of age in a time when, as David Weatherall writes, “it appeared that medical science was capable of almost anything”; in a time when the diseases that throughout our parents' and grandparents' lifetimes had been the chief instruments of infant and childhood death, and of crippling lifelong disabilities, were disappearing.
In these pre-AIDS years, Jerry explains, citing the success, among
other things, of the worldwide program to eliminate smallpox, the medical community seemed to believe that infectious disease was, by and large, a thing of the past.
“When I was doing my internship, I was one of the few young doctors choosing to go into infectious disease,” he says. “I did it because I wanted to work in areas of our country and the Third Worldâpoor areasâwhere there was still work to do that might make a real difference, and where these diseases were still taking an enormous toll. For the most part, however, I was anomalous in my choice of specialty. Back then, infectious disease was certainly not considered a promising specialty for medical students and young doctors, either clinically or in terms of research.”
Optimism about the “conquest” of disease (not only the infectious diseases, but *all* diseases) was widespread. The surgeon general of the United States, William H. Stewart, was frequently quoted as having declared, in 1967, that “it was time to close the book on infectious disease,” and the sentiment was, Jerry confirms, widely accepted as a truism (even though the surgeon general, it turns out, never said it!) and has continued, despite the AIDS pandemic and the emergence, and reemergence, of other infectious diseases, to prevail.
In a more recent example, we have Dr. William B. Schwartz, in *Life Without Disease: The Pursuit of Medical Utopia* (1998), asserting that if “developments in research maintain their current pace, it seems likely that a combination of improved attention to dietary and environmental factors along with advances in gene therapy and protein-targeted drugs *will have virtually eliminated most major classes of disease*” (italics added).
More: a molecular understanding of the process of aging, he predicts, may lead to ways of controlling the process so that “by 2050, aging may in fact prove to be simply another disease to be treated.”
“The virtual disappearance overnight of scourges like smallpox, diphtheria, poliomyelitis, and other infectious killers, at least from the more advanced countries,” Weatherall writes about the post-World War II period, “led to the expectation that spectacular progress of this kind would continue.”
“But this did not happen,” he explains. “The diseases that took their place (heart attacks, strokes, cancer, rheumatism, and psychiatric disorders) turned out to be much more intractable.”
The more we were able to eliminate many of the infectious diseases that led to premature death, that is, the more chronic and degenerative diseases such as cancer and heart disease replaced them as our leading causes of sickness and death. In the 1880 federal census, for example, neither cancer nor heart disease (our major killers a hundred years later) was listed among the ten leading causes of death.
Throughout the nineteenth century, gastrointestinal diseases, especially among infants and children (manifested largely as diarrheal diseases), were the leading causes of death. By the end of the nineteenth century, in large part because of public health and public works projects (clean water, sewage, sanitation), deaths from gastrointestinal diseases had declined, and tuberculosis and respiratory disorders (influenza, pneumonia) emerged as the major causes of death.
In 1900, neoplasms (cancer) accounted for less than 4 percent of all deaths and ranked sixth as a cause of mortality, while diseases of the heart accounted for slightly more than 6 percent and ranked fourth.
Eleven years later, in 1911, the year of my mother's birth (she was one of eight children, two of whom died in infancy), when respiratory diseases and tuberculosis were still the primary causes of death, heart disease and cancer accounted for nearly 17 percent of total mortality.
From 1911 through 1935, mortality from tuberculosis declined steadily, and influenza and pneumonia became, and remained, the two leading causes of death, taking their highest toll among people forty-five years and older, while the figure for heart disease and cancer, combined, rose to 30.4 percent.
By 1998, however, cancer and heart disease had replaced pneumonia and influenza as our leading causes of death, diseases of the heart accounting for 31 percent and malignant neoplasms for 23.2 percent. Of the fifteen leading causes of death, only pneumonia and influenza (3.6 percent, combined) now fell directly into the infectious group, and they took their greatest toll largely from individuals afflicted with a variety of other health problems, many of them
deriving from what epidemiologists call “insult accumulation”: the long-term effects of organ damage caused by the childhood illnesses these individuals had survived.
But we should note that diagnostic categories and criteria are, then as now (especially with respect to heart disease), ever changing. “We didn't even know what a heart attack *was* until some time in the early years of the twentieth century,” Rich says. “It hadn't really been invented yet, not until James Herrick discovered and wrote about it, and it took a while for the medical community to believe him.”
Until 1912, when Herrick published a five-and-a-half-page paper in the *Journal of the American Medical Association*, “Clinical Features of Sudden Obstruction of the Coronary Arteries,” the conventional wisdom was that heart attacks were undiagnosable, fatal events that could only be identified on autopsy. Although Herrick did not claim he was discovering anything new, his conclusions represented a paradigm shift: a radically new way of thinking about old problems that called conventional beliefs into question.
By comparing the symptoms of living patients with those of patients who, after death, were found to have had blocked arteries, Herrick demonstrated that coronary artery disease was recognizable in *living* patients. At the same time, he offered evidence suggesting that a totally blocked major coronary artery, as in my case, need not cause death, or even a heart attack. He concluded that heart attacks were most likely caused by blood clots in the coronary arteries, and that some heart attacks were survivable.
“Unsurprisingly,” Stephen Klaidman writes, “no one believed him. The old paradigm was not ready to topple. Herrick said that when he delivered the paper, ‘It fell like a dud.'”
Six years later, in 1918, Herrick provided additional evidence to support his theory, including comparative animal and electrocardiograph tracings that identified the existence of blocked coronary arteries, and this time, Klaidman writes, “the livelier minds in the medical profession finally began to take notice.”
Although Herrick's theory remained the conventional wisdom from 1920 to 1960, at which time it began to be questioned, it was not until 1980 that another American physician, Marcus DeWood,
using a technique unavailable to Herrick (selective coronary angiography), proved that it was, in fact, blood clots within the coronary arteries, and not the slow accretion of atherosclerotic plaque, that caused most heart attacks. Thus was Herrick's theory, nearly seventy years after he first proposed it, fully confirmed.
Thus, too, Rich contends, do we see how slowly and indirectly it is that we often arrive, in medicine, at the knowledge that allows physicians to be useful to their patients.