Instead of ordering an x-ray or MRI and starting her on Celebrex or Vioxx, we decided on a more practical approach: swimming as her only exercise until her knee started to improve; taking a low dose of an over-the-counter anti-inflammatory drug that would provide maximum or very near maximum pain relief and be much less likely to upset her stomach than a full dose; and taking glucosamine and chondroitin sulfate daily, which would not start to help for a month or two but had a good chance of relieving the pain and making her knee somewhat more resilient to the trauma of exercise. I advised her to call me in one week if her knee pain and swelling were not improving. At that point she might benefit from my withdrawing the fluid and injecting some steroid into her knee joint to quiet down this acute episode more quickly (though this still would not allow her to resume her walking where she left off).
I told Mrs. Martin that I thought the fundamental problem with her knee was that she was asking it to do too much. She was proud of and committed to her walking, but as soon as the knee problem was framed as a consequence of her commitment to her health and sense of well-being, she was willing to search for other ways to achieve the same goal. Swimming, though not her favorite activity, would allow her to keep up her exercise even while her knee was acutely inflamed. Once the swelling and pain resolved, she could begin a routine of cross-training. Activities such as bicycling and using an elliptical trainer machine would allow her to get just as much exercise without the repetitive trauma to her knees caused by walking.
The temptation to order an x-ray or MRI and prescribe the latest arthritis medicine for a patient like Mrs. Martin is great. She would believe that she was getting the best care, and I (or any doctor) would believe that I had done my job well. End of story, next patient. But the truth is that pictures of Mrs. Martin’s knee were very unlikely to help it get better any sooner, and no amount of Celebrex or Vioxx was going to allow Mrs. Martin to resume walking enough to control her anxiety. There was plenty of time for other diagnostic tests if her symptoms did not respond to these simple measures.
In the second half of the nineteenth century, medical science took a giant leap forward. Microbiology, the study of infectious microorganisms, or germs, began shortly after Louis Pasteur accepted a position as chair of the department of chemistry at the University of Lille, in the north of France. The local industry relied upon the precise harnessing of fermentation in the production of beer and wine, and the making of alcohol from beet juice. Pasteur’s work on the industrial problems associated with fermentation led to the discovery that fermentation was caused by live organisms. Pasteur also discovered the difference between yeast, which appeared round when viewed through a microscope and turned out to be essential for fermentation, and bacteria, which appeared rod-shaped under the microscope and turned out to be responsible for “souring” the beer. In 1865 he turned his attention to the epidemic that was devastating the silkworm industry in France, and discovered that a bacterial infection was responsible for the silkworms’ failure to spin silk cocoons or reproduce.
In 1877 Pasteur showed that a germ, anthrax, was the cause of a disease that was killing cattle. He went on to develop an anthrax vaccine, made from a weakened strain of the disease that caused only a mild illness, but then protected vaccinated sheep and cattle from getting the full-blown disease. A few years later, a 9-year-old boy, Joseph Meister, was brought to Pasteur, not yet sick but doomed to die within four to eight weeks after having been bitten many times by a rabid dog. At the time, Pasteur was working on a rabies vaccine made from infected spinal cord tissue taken from rabid rabbits. The specific infectious agent had not yet been identified because it was a virus—too small to see through a microscope and not amenable to being grown by the same techniques that Pasteur had used successfully to grow bacteria. Nonetheless, Pasteur was able to produce weakened “germs” by air-drying the infected tissue for two weeks, then incorporating this dried tissue into an injectable vaccination. In theory, injection of the weakened germs would evoke enough of an immune response to prevent the real rabies infection, which is universally fatal, from taking hold. Immunization with inactivated “germs” had never before been tried on a human, but this boy was sure to die without treatment. So, despite Pasteur’s self-described “acute and harrowing anxiety,” he gave the boy a series of 12 injections of the experimental vaccination. Joseph Meister never became ill.
Over the following 15 months, Pasteur went on to treat 2,490 people who had been bitten by rabid animals. Only one person died. The dramatic success of the rabies vaccine led to the creation of the Pasteur Institute, still one of the world’s great institutions of medical research. Joseph Meister went on to become a gatekeeper there. (Tragically, in 1940, 55 years after receiving his lifesaving treatment, Joseph Meister took his own life rather than accede to an order issued by invading German soldiers to open Pasteur’s burial crypt.)
While Pasteur was making so much progress in France, Robert Koch, a German physician, was putting the finishing touches on the germ theory of disease. In 1882, he reported that the cause of tuberculosis, the tubercle bacillus, could be identified by looking at infected tissue under the microscope. And further, he found that the organisms that cause tuberculosis could be grown in culture, produce disease when injected into laboratory animals, be extracted from these infected animals, and be grown again in culture. These four steps, known as “Koch’s postulates,” became accepted as proof that a specific organism was the cause of a specific disease. Using these postulates, Koch went on to identify the microorganisms that cause cholera, typhoid, and diphtheria. Another German physician, Paul Ehrlich, who had worked as Koch’s assistant, identified the organisms that cause malaria and sleeping sickness. It was Ehrlich who coined the term “magic bullet” in the search for drugs that would block microorganisms from causing disease but not injure healthy tissue. In 1909 he discovered a treatment for syphilis, Salvarsan (an arsenic-based compound), which was soon used worldwide and represented a giant step forward for the German pharmaceutical industry.
Meanwhile in the United States, Johns Hopkins University started a medical school in 1893 that would set a new standard of medical education, according to Paul Starr’s extensive historical account, The Social Transformation of American Medicine. This was the first medical school with a four-year curriculum, and students were required to have completed four years of college before enrolling. Starr explains that the divide soon widened between universities such as Johns Hopkins and Harvard, which were incorporating the scientific basis of medicine into their medical school curricula, and schools with far lower standards that were basically run for the financial gain of the faculty. A study done by the American Medical Association in 1906 showed that too many poorly trained doctors were being turned out by substandard medical schools (and providing a bit too much competition for established doctors, as well). The AMA was unable to take any action against the inferior schools, however, because its professional code of ethics prevented doctors from criticizing others in the profession publicly.
To avoid this problem, the Carnegie Foundation funded a study of medical education in the United States, which was conducted by Abraham Flexner, a graduate of Johns Hopkins with a degree in education, not medicine. The doors of all 131 American medical schools were opened widely for inspection by Flexner and the AMA representative who accompanied him. The schools assumed that Flexner brought with him the possibility of financial support from the Carnegie Foundation. When the Flexner Report was completed in 1910, most of the schools found that his mission had been quite different. Flexner’s report concluded that the low quality of medical education in the majority of medical schools was depriving society of the great progress that was being made in medical science. The report recommended that all but 31 of the medical schools in the United States be closed because they were providing substandard medical education. In the end, about half the schools survived, cutting the number of graduating doctors by more than half.
The Flexner Report marked the beginning of the modern era of scientific medical education. One of the most important changes was that medical schools were taken over by full-time scientists, medical researchers, and academic specialists, diminishing what Starr calls American medicine’s “practical” orientation. Subsequent to the report, the Rockefeller and Carnegie Foundations became the primary sources of funding for university-based research. The majority of Rockefeller’s General Education Board grants went to seven of the top medical schools, ensuring that an orientation toward research was a key element in the prestige of a medical school, and setting the stage for what evolved into the growing synergy between universities and the pharmaceutical and other medical industries.
Support for the reforms suggested by Flexner was not unanimous, however. Sir William Osler, recognized as the greatest clinician of his time, had grave reservations. (Osler had been the first professor of medicine at Johns Hopkins, but left in 1905 to accept a professorship in Oxford, England.) After learning of the content of the Flexner Report, Osler wrote a letter to the president of Johns Hopkins expressing his concerns:
I am opposed to the plan as likely to spell ruin to the type of school I have always felt the [Johns Hopkins] hospital should be. . . . The ideals would change, and I fear lest the broad open spirit which has characterized the school should narrow, as a teacher and a student chased each other down the fascinating road of research, forgetful of those wider interests to which a great hospital must minister.
According to Starr, Flexner himself eventually became disappointed by the inflexible scientific orientation of medical education that followed his report, which, he came to realize, stifled students’ creativity. Flexner’s good intentions had laid the groundwork for the specialty- and research-dominated system of medical education that still stands—the more prestigious the school, the greater the emphasis on biomedical research and the less the emphasis on pragmatic medical care.
Almost 100 years later, the scientific principles outlined in Flexner’s report still dominate the training of American doctors. Before they’ve even entered medical school, medical students’ understanding of the tasks of medicine is already well established. They have studied and excelled in biology, biochemistry, and physics; and they understand that the forefront of medical science lies in the discoveries that are based in these disciplines. This is the stuff of real medicine. The tools of healing that they want to learn about are blood tests, electrocardiograms, x-rays, MRIs, drugs, surgeries, and scopes of all kinds.
Doctors in training work in the teams that are frequently seen making rounds in teaching hospitals. These teams are made up of students and doctors at all levels of training, the most senior member being the attending physician, usually a faculty member of the medical school. Each team represents a microcosm of medicine’s hierarchy of knowledge, experience, and authority. Over the course of seven or more years of medical training, doctors progress from team novice to one of the senior members, with increasing responsibility for the medical care provided.
One or more team members are on call each night, usually one first-year resident (formerly called an intern) and a third- or fourth-year medical student, if there is one on the team. They admit new patients and are available to respond to the medical needs of the patients being cared for by the other members of the team who are not on call. This can involve anything from getting called for a routine order for a laxative, to evaluating a fever or chest pain, to becoming involved in a true life-and-death emergency with all the tension and drama depicted so well on the TV show ER.
Often the students and residents on call get little or no sleep, as their hours are filled with “working up” patients: recording medical histories, doing exams, getting blood work, looking at electrocardiograms and x-rays, doing urgent procedures, talking to family members and more senior doctors, and whatever else needs to be done. As a first-year resident, I remember feeling fortunate if I had enough time to brush my teeth before morning rounds began and was truly grateful if I had enough time for a shower.
During morning rounds, the students and residents who were on call the night before present to the rest of the team the new admissions and any unexpected developments in the patients they were covering. These formal public presentations are among the defining moments of medical training. Exhausted from being up all night and having done their best to take good care of their patients, trainees describe their patients’ medical problems, the tests that were ordered, the results that are available, and the decisions that were made about therapies, consults, and procedures. All of this has to be backed up by the best scientific evidence available, often including references to, or actual copies of, the latest articles from the medical journals.
Presenters are vulnerable to criticism of their medical care by more senior team members, vulnerable to getting grilled on their medical knowledge (the medical students call this “getting pimped”), and at risk of public humiliation. Rarely does this happen, but just knowing that it can is both highly motivating and highly intimidating. This system is elegantly designed to allow young physicians to be given progressively more responsibility while ensuring that proper standards of medical care are maintained. I know that I learned the most about being a doctor during the wee hours of the night when, tired and working alone, I had to decide for myself whether I was really sure about the medical care that I was about to render or whether I should wake up my senior resident to double-check my plan (and—truth be told—learn from my mistakes). The reaction to my night’s work on morning rounds was the public measure of my competence as a doctor, which at that stage of training, life, and fatigue, was also the measure of my self-esteem.