Unlike most other industries, safety in healthcare faces a unique set of circumstances: 1) patients die from errors one at a time, and 2) we rarely find out exactly why. The “sensationalism” of a front-page disaster is completely absent. And, owing to a myriad of factors, the underlying hazards usually remain a secret.
As we ride the crest of the current wave of interest in “patient safety,” is this refreshing new focus blinding us to the reality that safety has always been a difficult business? Even in industries with a refined safety infrastructure, managing the risk of harm remains an ongoing effort with many obstacles. Because of safety’s inherent invisibility, the push for it in any industry would decay to zero without a continuous input of energy. What is the best way to illuminate and fix the problems of safety in healthcare? Enlist safety experts to design and manage safety into the healthcare system.
By its very nature, safety remains an enigma to most people. Safety is invisible. Not only is the result of good safety management remarkably inconsequential (i.e., nothing happens!), but the components of unsafe operations – risk – are imperceptible to the untrained eye as well. For these reasons, risks develop and propagate within industrial processes, including healthcare, until these latent conditions cause an accident: a patient death. By the time these latent failures contribute to a catastrophic event, they have been in place for so long that they are considered “normal” to all but the most incisive, well-trained investigator. Therefore, the business of “safety” in all industries, including healthcare, should be left to trained and committed experts.
Safety in commercial aviation provides a good illustration of the difficulties healthcare will face in making true and sustained safety breakthroughs. A substantial effort over some 60 years has brought us to the present day, when fatalities are quite rare. Indeed, today marks more than 3 years since 50 people died in the crash of Continental Connection Flight 3407 near Buffalo, NY. That crash, on February 12, 2009, was the last fatal airline accident in the U.S. Since that day, our commercial air carrier system has operated more than 30 million flights and carried over 2 billion passengers without a single fatality. And while that remarkable statistic suggests a crowning achievement worthy of a day of rest for the safety team, nothing could be further from the truth. Today, the safety machinery in aviation is humming as loudly as it ever has.
To understand the history of commercial aviation is to know the necessary steps healthcare must take for patient safety. Yet, absent from healthcare are two of the most essential components responsible for the progress of aviation safety: 1) the sensationalism of a spectacular crash, and 2) the subsequent exhaustive investigation. In the absence of sensationalism, and with healthcare-style secrecy, aviation would not have reached this milestone in a paltry 60 years. And healthcare stands to endure the status quo for at least 60 more years unless these industry-specific handicaps are overcome.
And while we might imagine that a high “body count” drove the unencumbered aviation community to make instant and striking modifications to its practices, that is clearly not the case. Even with the galvanizing effect of a series of major air disasters, the subsequent changes took decades of work spearheaded by safety experts. To illustrate, consider a single element of the entire realm of aviation safety: Crew Resource Management (“CRM”). Although CRM is a cornerstone of safety management, it embraces only a fraction of the entire scope of system-wide safety. CRM is the study of, and training in, cognitive and interpersonal skills: the non-technical behaviors of workers in a high-risk workplace. It addresses the human components of error: interaction, communication, interpretation, perception, fatigue, behavior, and other non-technical aspects of job performance.
On the night of December 29, 1972, 101 people died when a giant, brand-new Lockheed L-1011 crashed into the Florida Everglades. The crash of Eastern Flight 401 was attributed to “pilot error.” Indeed, a burnt-out landing gear indicator bulb, and the resulting preoccupation of all three pilots with it, ultimately led to disaster. Despite the high death toll and obvious human behavioral factors, the need for CRM would go unrecognized for many more years.
Then, on March 27, 1977 in Tenerife, Canary Islands, two fully loaded Boeing 747s collided on the runway in heavy fog. The captain of the KLM 747 believed that he had received clearance to take off and that the fog-shrouded runway ahead of him was empty. In fact, he had not yet been given takeoff clearance: a Pan Am 747 was occupying the middle of the runway. Although substantially more cognizant of the critical conflict, the two junior pilots of the KLM jet could not break the captain of his powerful mindset. Despite the questions and protests of the other two pilots, the KLM captain set takeoff thrust and began the accident sequence that would kill 583 people in the next 30 seconds: to this day, still the largest death toll in aviation history.
The spectacular Tenerife crash illuminated many problems, including the dangers of unclear communications and steep authority gradients in the high-risk aircraft cockpit. Yet the consequent changes did not happen quickly enough. On December 28, 1978, United Airlines Flight 173, a Douglas DC-8, ran out of fuel near the Portland, OR airport. There were 10 fatalities. During the course of the investigation, the original question, “How can a jetliner run out of fuel?”, became a different one: “How can two junior pilots tell a captain that their jet is running out of fuel?” Indeed, among the causes of the disaster, the NTSB cited the following contributing factor: “The failure of the other two flight crewmembers…to successfully communicate their concern to the captain.” They knew the plane was running out of fuel!
And while the Portland crash inspired United Airlines to develop its own CRM training for its crews, most other airlines moved very slowly on the emerging issue of crew behavior. Then, on January 13, 1982, Air Florida Flight 90 crashed on takeoff into the 14th Street Bridge at Washington National Airport. The subsequent investigation revealed that although the first officer had serious concerns about the performance of the aircraft engines in the heavy snow, his words were ignored by the captain as the plane accelerated down the runway. The Air Florida crash further established failures of team communication and coordination, rather than technical skill, as the fundamental cause of most air disasters.
The need for structured communication and team coordination – CRM – slowly became clear over the course of a decade of air disasters with similar underlying themes. Yet, in retrospect, progress was slow despite the sensationalism of front-page news coverage and an utter lack of secrecy: flight data and cockpit voice recorders told the precise story. Further, during this period of increased awareness of human error, the aviation industry relied on a focused core of safety experts who kept their eye on the safety ball at all times. To the aviation industry leaders of the 1970s, the “miracle” of our past 3 years of fatality-free flying was an unattainable fantasy. Or was it?
The CRM example shows that even with front-page news coverage and detailed factual investigations, the pace of change for safety will be slow. The healthcare industry is indeed constrained by the invisibility of its small-scale catastrophes and the secrecy surrounding them. The barriers posed by the virtual insignificance and concealment of individual error-related deaths will take time to dismantle. But waiting for that clarity and openness is not the answer. What we can do today is enlist the help of true safety experts: those whose careers are dedicated to the science of safety. To those people, the similarities between aviation and healthcare are palpable: we don’t have to re-invent the wheel.
In healthcare, we may never benefit from the sense of urgency which inspired the successful changes within the aviation industry. So we are faced with a choice: do we argue that safety in healthcare is doing just fine within our constrained system of quiet harm and secrecy? Or do we presume that errors in healthcare are, at their root, caused by the very same human weaknesses that have resulted in well-documented catastrophes in aviation and every other high-risk industry? The choice is ours: innovate, or capitulate.
Michael Appel is an anesthesiologist, former airline pilot, and safety consultant. He is a speaker and frequent contributor to The Salus Network.