Many hospitals approach safety improvement the wrong way

The time has come to drive a stake through the heart of an oft-repeated assertion.

How often have you heard something like the following when those of us in healthcare who want to stimulate quality and safety improvements draw analogies to the airline industry?

“Well, in an airplane, the pilot has an extra incentive to be safe, because he will go down with the ship. In contrast, when a doctor hurts a patient, he gets to go home safe and sound.”

At a recent talk to medical residents, medical students, and nurses in training, Terry Fairbanks (Director of the National Center for Human Factors Engineering in Healthcare) put the opposing case forward.  He noted, “No pilot follows safety rules and procedures because he thinks he is otherwise going to crash.”

Likewise, I would note, no doctor fails to follow safety rules and procedures because s/he does not care about the well-being of a patient.

What is the difference, then?  Terry summarizes, “There is a pervasive safety culture and set of rules that guides airplane pilots, based on a human factors approach.”

He added, “The relative degree of accountability (compared to other industries) is not the underlying cause of medical errors.”

Being in the human factors business, Terry is a whiz at the physical conditions and cognitive errors that bring about harm, and also at the interventions that can help reduce them.  He notes that most errors are skill-based errors, or errors that occur when you are in automatic mode, doing tasks that you have done over and over — indeed tasks at which you are expert.

He explains, “When you are in skills-based mode, you don’t think about the task you are about to do. Signs don’t work! Education and labeling don’t work when you are in skills-based mode. Most medical errors are in the things we do every day.”

Accordingly, vigilance and training are not the answer to skill-based errors. Neither is punishment: "While discipline and punishment have a role when there is reckless behavior, applying discipline to skill-based errors will drive reporting underground and will kill improvement."

Many hospitals approach safety improvement in the wrong way because "most safety-based programs are based on work as imagined, not work as actually done. We need to design our improvement based on real work, not on the way managers believe work is done."

Interestingly, Terry asserts, "If we just focus on adverse events, we will not make significant progress on creating a safer environment."

Also, he warns: “Don’t base safety solutions on information systems. Humans add resilience: Computers do not adapt.”

The airlines have noticed this and have adopted solutions that are attuned to cognitive errors.  Recall this summary from Patrick Smith:

We’ve engineered away what used to be the most common causes of catastrophic crashes. First, there’s better crew training. You no longer have that strict hierarchical culture in the cockpit, where the captain was king and everyone blindly followed his orders. It’s team oriented nowadays. We draw resources in from the cabin crew, people on the ground, our dispatchers, our meteorologists, so everyone’s working together to ensure safety.

The modernization of the cockpit in terms of materials and technology has eliminated some of the causes for accidents we saw in the ’70s into the ’80s. And the collaborative efforts between airlines, pilot groups and regulators like the Federal Aviation Administration and the International Civil Aviation Organization, a global oversight entity, have gone a long way to improving safety on a global level.

Here’s more about the Commercial Aviation Safety Team, through which virtually anyone who sets foot in an airplane, touches it, or monitors its travel is expected and empowered to submit a report about potential safety hazards.

In summary, it is not the personal risks faced by doctors compared to pilots that kill and harm patients.  It is the fact that the kinds of solutions needed in health care are just at the gestational stage. Facile comments that doctors don’t care as much as pilots are just plain wrong and divert attention from the steps that can and should be taken to learn from the airline industry.

Paul Levy is the former president and CEO, Beth Israel Deaconess Medical Center and blogs at Not Running a Hospital. He is the author of Goal Play!: Leadership Lessons from the Soccer Field and How a Blog Held Off the Most Powerful Union in America.



  • pmanner

    I’ve never understood this analogy.

    When pilots fly, they’re using equipment which has been checked repeatedly. The smallest computer fault or glitch is enough to postpone or cancel a flight.

    In contrast, physicians are working on patients, who, if they were planes, would be permanently grounded.

    If this analogy were accurate, pilots would be told to fly regardless of weather, and be expected to achieve the same result at the same cost regardless of whether the plane was a 747 or a Sopwith Camel.

    • wahyman

      Pilots also don’t work long hours and double shifts, they have a co-pilot, and they are being continuously watched from the ground.

The plane can also be viewed as the environment rather than the patient. From this perspective, pilots are extensively trained on that specific set of equipment, and if the technical environment is not up to par, they don't fly. And as noted, they use checklists routinely, whereas checklists often seem to be regarded as demeaning in healthcare.

      The pilot at risk analogy is not meant to explain everything, as most analogies are not. Nor is actively thinking about who is at risk the same as it being an underlying motivator.

      If you don’t like the analogy fine–now go do something about medical error.

      • pmanner

        “…go do something about medical error.”

        Easy, there, Sparky.

First of all, checklists have been used for years in surgery. Not sure why you believe that they're "found to be demeaning," but they are absolutely routine.

The problem is not that checklists aren't used. It's that they're minimally effective. The SCIP guidelines (Surgical Care Improvement Project), which focused on process improvement rather than outcomes, have been ineffective despite governmental support, financial penalties for non-compliance, and consequent widespread implementation. In some cases, they make outcomes worse.

        Second, using the metaphor/analogy/whatever of a linear process, like aviation, is inherently not applicable to a fundamentally chaotic process like physiology. Planes obey the laws of physics. Patients often appear not to.

        Third, we do not have a firm idea of what a medical error is. Is it an error of commission? Is it an error of omission? Is it simply an unexpected or unexplained bad outcome? VTE after joint replacement has been deemed a “never event,” in spite of the fact that it is a well-recognized risk, that the risk of prophylaxis approaches that of the event, and that it occurs despite prophylaxis.

Last, we have no good idea of how many adverse events occur. Yes, we have the Institute of Medicine, "To Err Is Human," and so forth; their estimates are based on two retrospective chart reviews. One reviewed 1984 records in New York; the other, 1992 records in Colorado and Utah.

        The point is that ensuring patient safety is difficult and complex. Stamping your foot and saying “do something” is unlikely to do much.

        • wahyman

Checklists are ineffective in part because people don't take them seriously. Just another nonsensical rule to rush through.

          A “well-recognized risk” does not mean that the effect is not preventable. Accidents are a well-recognized risk of drunk driving.

          And I didn’t stamp my foot, anyway. The point obviously was that criticizing analogies doesn’t get us anywhere.

        • EmilyAnon

If you take all those safety precautions off the table, and strike out any definition of "adverse events," what's left to give the patient some peace of mind that they won't be harmed by someone's negligence?

          • PoliticallyIncorrectMD

            So, should we continue to use the tools which we know make no difference just to create an illusion of safety?

          • EmilyAnon

            I have no idea. I’m not a doctor. That’s why I asked the question. So what safeguards do you suggest that will make a difference?
