Diagnostic errors and their role in patient safety

American Medical News published an informative essay by Kevin B. O’Reilly on December 13, 2010, about errors in diagnosis and why doctors make them.

According to Gordon Schiff, MD, associate director of the Center for Patient Safety Research and Practice at Brigham and Women’s Hospital, “The problem of diagnostic errors has gotten short shrift in the broader patient safety movement.” The article focused on “thinking mistakes” as opposed to “system errors,” and was both refreshingly honest and depressingly true.

None of us is without error. We all make mistakes. Sometimes we can blame it on some fault of the “system,” but most often we have only ourselves to blame. So if we back up a step and ask “What happened that I made that error for which I must now accept blame?” we begin to learn something about ourselves as physicians – and maybe even as attorneys, too.

But I’ll get to that in a moment.

Another recent article in the New England Journal of Medicine by Dr. David C. Ring has garnered a lot of press. In it Dr. Ring recounts the time when he performed the wrong operation (carpal tunnel surgery) on a patient instead of the intended trigger finger release. While the scenario leading up to the error was evaluated in detail – communication errors, personnel changes in the OR, last patient of the day, etc. – all aspects analyzed seem to be superficial excuses. The article fails to mention the overriding fact that the surgery schedule that day was simply too busy. The department was trying to operate – literally – at more than capacity. There was no margin. There was no time to regroup, to thoughtfully consider next steps, to assure that everyone was on the same page and all was in order.

Margin is crucial. That’s why emergency departments are such a hectic, potentially high-risk area in which to work. The ED doesn’t have a “surge protector.” Staff can’t be scheduled for the maximum anticipated volume, only for the average. Even then there is down time, and the better staffed the department, the more down time there is. Staffing to the average means that some days there’s simply no margin, and it’s on those days that the opportunities for diagnostic error need to be monitored most closely.

Back to the American Medical News article.

Error occurs. About 5% of autopsies find clinically significant conditions that were missed and could have affected the patient’s survival, according to O’Reilly. Also, 40% of malpractice suits are for “failure to diagnose.” These are rarely “system errors,” like misfiling a pathology report showing that a tumor was malignant, but more often “thinking errors.”

There are several reasons why we make mistakes in our thought processes, when we had the knowledge and ability to think correctly. As listed in a 2003 article in Academic Medicine, these “thinking errors” include:

  • Anchoring bias – locking on to a diagnosis too early and failing to adjust to new information.
  • Availability bias – thinking that a similar recent presentation is happening in the present situation.
  • Confirmation bias – looking for evidence to support a preconceived opinion, rather than looking for information to prove oneself wrong.
  • Diagnosis momentum – accepting a previous diagnosis without sufficient skepticism.
  • Overconfidence bias – over-reliance on one’s own ability, intuition, and judgment.
  • Premature closure – similar to confirmation bias, but more like jumping to a conclusion.
  • Search-satisfying bias – the “eureka” moment that stops all further thought.

The most fascinating and most common of these is “anchoring bias.” According to Dr. Schiff, “We jump to conclusions. We always assume we’re thinking about things in the right context, and we may not be. We don’t do a broader search for other possibilities.”

As thinking errors move to the forefront of patient safety, many medical schools are beginning to teach “metacognition,” or “thinking about thinking.” The busier the OR or the ER gets, the more important this becomes. It’s second nature to work up a chest pain patient for an MI when the waiting room is full, but more important than ever to keep a broader perspective and consider a couple of other killers, such as pulmonary embolism and dissecting aneurysm.

Some experts say that information technology will help us overcome our biases, broaden our perspective and avoid diagnostic errors. Perhaps. But health IT has its own biases. Remember GIGO – garbage in, garbage out. A simple example is an over-reliance on “template charting,” whether electronic or in paper form. Let’s say the patient tells the triage nurse “I’ve been vomiting and my chest hurts.” If one chooses too early the template for “Vomiting,” “Gastroenteritis,” or “Abdominal Pain,” one could easily lead oneself and others astray, causing them to overlook the fact that what the patient really meant to say at triage was “I started having this heavy chest pain and have been vomiting ever since.” If the template is too focused, the patient may well be discharged with an undiagnosed MI – or worse.

“Thinking problems” can be at least partially avoided by simply being aware that they exist. And “metacognition” practiced by both physicians and attorneys can lead both to make fewer “diagnostic errors.”

Charles A. Pilcher is an emergency physician who has helped both plaintiff and defense attorneys with malpractice litigation for over 25 years. He can be reached at his self-titled site, Charles A. Pilcher, MD.


  • Marc Gorayeb, MD

    Thoughtful preliminary analysis. However, the subject is far more complex than you suggest. Cognitive biases should not be equated with “thinking errors.” The heuristics we employ to efficiently and successfully navigate to a diagnostic conclusion involve applying one or more cognitive biases, usually a pattern of cognitive biases. Cognitive biases exist because they lead to a correct conclusion most of the time. Eliminate them and you eliminate a core element of our professional abilities. What distinguishes experienced physicians from inexperienced physicians or non-physicians is the ability to select the cases that deserve a more thorough and complex evaluation than our heuristics would suggest. The only way I see information technology assisting us in this regard is through some form of artificial intelligence that is not yet on the technological horizon.

  • Anonymous Patient

    How often is there sufficient long-term follow up with patients to be sure that your cognitive biases lead to the correct diagnosis most of the time?

  • http://www.myheartsisters.org Carolyn Thomas

    Thanks Dr. Charles – especially for the “thinking errors” info from Academic Medicine.

    To Dr. Marc:
    “What distinguishes experienced physicians from inexperienced physicians or non-physicians is the ability to select the cases that deserve a more thorough and complex evaluation…”

    I’m a heart attack survivor who was misdiagnosed with GERD and sent home from the E.R. despite presenting with textbook symptoms like crushing chest pain, nausea, sweating and pain radiating down my left arm.

    GOOGLE could have done a better job at “a more thorough and complex evaluation” of my MI than the cognitive bias of my “experienced” middle-aged E.R. doc did. His rationale for GERD: “You’re in the right demographic!”

  • http://twitter.com/DrPlumEU David Lewis

    IMHO the problem described in operating theatres is that the surgeon needs to know her/his patients by sight AND do their own consent and clinical examination rather than depend on so many intermediaries. Sounds impossible? Yes, it is not practical while there is so much pressure to push so many units of work through operating theatres.

    Errors of thinking about problems are another matter – the ‘clinical problem solving’ series in the New England Journal of Medicine helps clinicians think better about a person’s signs and symptoms. However, pressure to stick with ‘guidelines’ rather than think properly about individual cases leads to errors like sending people home with a diagnosis of GERD when in fact they are suffering an MI.
