I was listening to the news on my way to work recently, and heard a story about the review conducted after the well-publicized security breach at the White House. Like many people, I was shocked when the story of the fence-jumper first broke. How was it possible that some guy with a knife managed to get over the fence, cross the lawn, enter the White House and get deep into the building before he was stopped?
The answer, according to NPR’s reporting on the Department of Homeland Security investigation, is that a whole sequence of events made it possible:
It turns out that the top part of the fence that he climbed over was broken, and it didn’t have that kind of ornamental spike that might have slowed him down. Gonzalez then set off alarms when he got over the fence, and an officer assigned to the alarm board announced over the Secret Service radio there was a jumper. But they didn’t know that the radio couldn’t override other normal radio traffic. Other officers said they didn’t see Gonzalez because of a construction project along the fence line.
And in perhaps one of the most striking breaches, a K-9 officer was in his Secret Service van on the White House driveway. But he was talking on his personal cell phone when this happened. He didn’t have his radio earpiece in his ear. His backup radio was in his locker. Officers did pursue Gonzalez, but they didn’t fire because they didn’t think he was armed. He did have a knife. He went through some bushes that officers thought were impenetrable, but he was able to get through them and to the front door. And then an alarm that would’ve alerted an officer inside the front door was muted, and she was overpowered by Gonzalez when he burst through the door. So just a string of miscues.
The explanation rang true. Of course it was no one thing that went wrong; it was a series of events, no one of which in isolation was sufficient to cause a problem but which, strung together, led to a catastrophic system failure. The explanation also sounded familiar. It is a perfect example of the “Swiss cheese” conceptual model of patient safety.
First articulated by James Reason, the Swiss cheese model holds that serious adverse events that occur in the context of complex systems are generally the consequence of multiple failures, not the fault of a single individual. In the case of a serious patient harm event (e.g., operating on the wrong body part), thoughtful analysis inevitably finds that many things had to go wrong for such an error to occur.
Indeed, just as the Secret Service has multiple layers of barriers around the White House to prevent an intruder from reaching the president, patient safety experts speak of layers of defense within medical systems that are designed to ensure that small errors caused by human frailty don’t allow harm to reach the patient.
The Swiss cheese description derives from the visual shorthand of imagining a series of slices of Swiss cheese, each of which represents a system defense. In the case of the White House, the perimeter fence, the guard dog and the building alarm are each like separate slices of cheese. The holes represent imperfections or failures of each slice. For the intruder to get through them all, the holes in the cheese have to line up in a particular way. If the holes don’t line up — the fence fails, but the dogs respond — then the system works.
For a wrong-side surgery to occur, it may take a similar string of failures: maybe the surgical drape covered the surgeon’s pre-op marking, and the patient had bilateral disease, and the surgeon was working in an unfamiliar OR, and so on.
Addressing patient (and presidential) safety is almost never about finding the single person who failed at his or her task, or about an easy fix. It is about understanding how complex systems work and creating a culture of safety to continuously improve them. I hope the Secret Service takes that approach, instead of just fixing the fence and firing the guy who was on his cell phone.
Ira Nash is a cardiologist who blogs at Auscultation.