Every Patient Tells a Story

Lisa Sanders

Twining sent off blood samples to look for the origin of the anemia and for evidence of a recent exposure to mercury and arsenic. Other causes of
this neuropathy, she thought, were much less likely, and she could test for them later, if necessary.

The results of the anemia workup came back first. David had no evidence of sickle cell disease or any other congenital blood disorders. He had normal levels of iron and folate. But his level of vitamin B12 was dangerously low: a tenth of the normal level. The doctor was sure this was the cause of David’s weakness, numbness, constipation, and anemia. It could even account for his chest pain and shortness of breath.

The cause of David’s anemia was proved by yet another blood test. He had an autoimmune disease that goes by one of those great nineteenth-century names: pernicious anemia. In this disease, the body’s own immune system mistakenly destroys the protein responsible for absorbing the vitamin from digested food and getting it into the blood. The immune system makes antibodies against this protein, just as if it were an invading virus or bacterium. David was started immediately on vitamin B12 injections—he would have to take B12 supplements for the rest of his life. The results were dramatic and almost immediate.

“Every day I can feel myself growing stronger,” David told me when I called him not long after his diagnosis. One week after his first injection he was able to go back to work. “I can finally run again. I can pick up my daughter again. I can tell I’m going to get it all back.”

When Thinking Goes Awry

David’s story is an example of a diagnostic error. Researchers define diagnostic error as a diagnosis that is wrong, missed, or delayed. And although Powell didn’t suffer any permanent harm and has been restored to full health, it took four visits to the emergency room to get there.

David was lucky. Many studies show that diagnostic errors often exact a tragic toll. Diagnostic errors are the second leading cause of malpractice suits against hospitals. And a recent study of autopsy findings identified diagnostic discrepancies—a difference between the diagnosis given in life and that discovered after death—in fully 20 percent of cases. The authors of that study estimate that in almost half of these cases knowledge of the correct diagnosis would have changed the treatment plan. That comes to roughly 10 percent of all cases; extrapolated to the millions of people in the United States alone who receive medical care every year, it means a vast toll of avoidable suffering and death.

And patients are worried. In one survey, over a third of patients questioned after visiting an emergency room had concerns about medical errors, and by far their greatest concern was the possibility that they had been misdiagnosed. They are right to worry. A recent review of the data reported that primary care physicians—those in family practice and internal medicine—had a diagnostic error rate ranging from 2 to 10 percent: up to one in ten patients seen was incorrectly diagnosed.

Of course, that number looks only at single visits, and anyone who has been to the doctor for a complicated problem knows that the diagnosis is often figured out over the course of several visits. Emergency room doctors have a somewhat higher rate of diagnostic errors, specialists a somewhat lower rate. This doesn’t mean that specialists are better doctors or emergency room physicians are worse. The uncertainty surrounding a diagnosis, and thus the likelihood of error, is greatest when a patient first presents with a problem—in an emergency room or a primary care office. By the time patients get to specialists, much of the uncertainty about their diagnosis has been resolved.

There are many ways of getting a diagnosis wrong. In earlier chapters I looked at how each element of medical data gathering can break down and lead to diagnostic mistakes—taking an inadequate history, performing an ineffective exam, or not examining the patient at all. A misreading or misinterpretation of a test can also derail the diagnostic process. But perhaps the most common type of diagnostic error—and the one I will focus on in this chapter—is the one that takes place in the doctor’s head: the cognitive error, what I call sick thinking. (Anyone interested in learning more about this important issue should check out Jerome Groopman’s outstanding book on the topic, How Doctors Think.)

So how often is an error due to sick thinking? Mark Graber, a physician
and researcher at the VA Hospital on New York’s Long Island, wanted to answer that question. He collected one hundred cases of medical error from five hospitals over the course of five years. For each case, records were reviewed and, when possible, the doctors involved were interviewed within one month of the discovery of the error. These were serious errors. In 90 percent of the cases patients were harmed by the error; thirty-three patients died.

Graber divided the missed or delayed diagnoses into three categories. (The three categories overlapped somewhat; not surprisingly, most diagnostic errors were due to multiple factors.) “No-fault errors” are mistakes that happen because of factors beyond the control of the doctor making the diagnosis. When a disease presents in an unusual and uncharacteristic fashion—as when an elderly person with appendicitis has a fever but no abdominal pain—or when a patient provides incorrect information—as a patient with Munchausen syndrome might do—a diagnosis can be unavoidably missed or delayed. This was by far the smallest category of diagnostic error, present in only seven of the one hundred cases.

Graber found that our complex and often poorly coordinated medical system also contributes to diagnostic error. If a test result was not reported in a timely manner or if there were equipment failures or problems, he assigned the resulting diagnostic mistakes to the category of “system-related errors.” For example, a urinary tract infection might be missed because a urine sample was left too long before being cultured. Or a pneumonia might be missed because an overburdened radiology department hadn’t read a critical X-ray correctly. These were relatively common; more than two thirds of the errors Graber studied involved some component of system failure.

The issues that Graber was most interested in were what he called “cognitive errors,” by which he meant all errors due to the doctor herself. In his study, Graber attributed more than a quarter of the mistakes made, twenty-eight out of one hundred, to cognitive errors alone. Half of all the errors made were due to a combination of bad systems and sick thinking.

Graber broke his category of cognitive error down further. Which aspect of cognition was at fault? Was it lack of physician knowledge? Not most of the time. Faulty knowledge was the key factor in only a few of the missed diagnoses, each of which involved a rare condition. Faulty data gathering—an inadequate history, missed findings on the physical exam, or misinterpreted test results—was a more common problem, playing a role in 14 percent of the diagnostic errors. Faulty synthesis—difficulty putting the collected data and knowledge together—played a role, by comparison, in well over half of the incorrect or delayed diagnoses.

In David Powell’s case, both the system and the doctors played a role. Early on in his illness David went to two different emergency rooms. Getting records from one ER to another can be a time-consuming affair. Often emergency physicians don’t even try to get them because the chances of obtaining them in time to help the patient are so small. So because David went to a different emergency room, his second visit was a virtual rerun of the first. And although the patient told the doctor who saw him at his second ER visit that he’d already been “ruled out” for a heart attack (a myocardial infarction, or MI), without the records to confirm it, the ER doctor repeated the studies rather than risk missing this important diagnosis.

Because the records were not available, David’s diagnosis was delayed. Graber would define this as a system-related error. Certainly, in an ideal world, a patient’s records should be readily available.

But the emergency room doctors were guilty of thinking errors as well. Each found that the patient was not having a heart attack, but none, save the last, carried that train of thought to its next logical destination. None of them asked that most fundamental question in diagnosis: what else could this be? And because they didn’t, the diagnosis was missed.

They might have missed the diagnosis even had they asked such a question. The differential diagnosis for chest pain is long, and while this is a well-described symptom of pernicious anemia, the disease itself is relatively unusual. But they didn’t even try.

In medicine it seems that almost nothing that comes after the words “chest pain” is even heard. And if you are an adult male with chest pain, the odds are almost overwhelming that you are going to end up with a ticket on what I have heard called “the MI express.” Far too often those words trigger the cascade of EKGs, blood tests, and even exercise stress tests in search of a heart attack—despite the presence of other signs, symptoms, or workup results that might suggest a different diagnosis.

Each of these doctors exhibited “premature closure”—one of the most common types of cognitive diagnostic error. Premature closure is when a doctor latches on to a diagnosis and “closes off” thinking about possible alternative diagnoses before gathering all the data that would justify going down a particular diagnostic path. In David’s case, the doctors’ thinking was skewed by two factors: the fact that cardiac problems are so common in the ER, and the potentially dire consequences of a heart attack (which lends urgency and pressure to the task of diagnosis). The doctors heard David describe the classic symptom of a myocardial infarction—squeezing or pressurelike pain in the chest associated with shortness of breath—and began ordering tests and exams aimed at clarifying the suspected cardiac situation. In premature closure, “thinking stops when a diagnosis is made.” The symptoms of weakness and numbness were noted in the chart at each of the visits but weren’t considered on their own, even though they are not part of the typical chest pain presentation. When the “MI express” pulls out of the station, far too often everything that doesn’t fit—like David’s complaint about his loss of strength—is left behind.

Pat Croskerry is an emergency room physician who has written extensively about diagnostic thinking. The brain, says Croskerry, uses two basic strategies in working to figure things out. One is what Croskerry calls an intuitive approach. This “nonanalytic” approach works by pattern recognition. He describes it as a “process of matching [a] new situation to one of many exemplars in your memory which are retrievable rapidly and effortlessly. As a consequence, it may require no more mental effort for a clinician to recognize that the current patient is having a heart attack than it is for a child to recognize that a four-legged beast is a dog.”

This is the instant recognition of the true expert described by Malcolm Gladwell in his book Blink—fast, associative, inductive. It represents “the power of thin slicing … making sense of situations based on the thinnest slice of experience.” Intuition leads to a diagnostic mode that is dominated by heuristics—mental shortcuts, maxims, and rules of thumb. This is the mode used by the doctors during David Powell’s first few visits to the emergency room with his chest pain and strange weakness.

Croskerry contrasts this almost instantaneous intuitive diagnostic thinking with a slower, more deductive approach to diagnostic thought. As described by Croskerry, this analytical approach is linear. It is a process that follows rules and uses logic to think a problem out. It’s the Sherlock Holmes model of diagnostic thought.

Croskerry believes that the best diagnostic thought incorporates both modes, with the intuitive mode allowing experienced physicians to recognize the pattern of an illness—the illness script—and the analytic mode addressing the essential question in diagnosis—what else could this be?—and providing the tools and structures that lead to other possible answers.

For Christine Twining, the doctor who finally diagnosed David Powell with pernicious anemia, there was no Blink-like moment of pattern recognition and epiphany when she first heard him describe his symptoms. One thing seemed clear: he wasn’t having a heart attack. She felt the patient’s fear and frustration. “He was afraid I was going to send him home with reassurances that it wasn’t his heart and without figuring out what it was. But I couldn’t send him home; I didn’t have a clue what he had.”

Because there was no instantaneous sense of recognition triggered by Powell’s odd combination of chest pain, weakness, and anemia, Twining was forced to approach the problem systematically, considering the possible diagnoses for each of his very different symptoms—a slower, more rational process that ultimately brought her the answer.

Both types of thinking are essential in medicine. Which to use will depend on the perceived degree of uncertainty surrounding a set of circumstances. The more certainty there is in any given set, and the more closely it aligns with some recognized or remembered disease state, the more likely you are to use the intuitive response. The cognitive continuum of decision making, says Croskerry, runs from informal/intuitive at one end to calculated/analytical at the other, and the nature of the tasks runs from quite simple to complex. “The trick lies in matching the appropriate cognitive activity to the particular task.”

Much of the research that has been done on cognitive errors focuses on the misinterpretation of medical information. In David’s case, the doctors who missed his diagnosis of pernicious anemia focused on only a couple of his symptoms, ignoring the history of numbness and weakness, the abnormalities in his physical exam, even the anemia, in their concern not to miss a heart attack. But errors can also arise from interpretations of data we’re not even aware we are making, thanks to assumptions and biases that we bring with us from our lives outside the hospital.
