Just a few weeks ago, the President’s Council of Advisors on Science and Technology (PCAST) issued a report, “A Transformational Effort on Patient Safety.” This was both a “surprise” and “not a surprise” to me because I’m a nurse, a subject matter expert in patient safety, and a shop teacher’s daughter.
PCAST’s call for action was not a surprise because in 2023, care intended to help patients harms them at an alarming rate. One in four Medicare beneficiaries will experience an adverse event while hospitalized. Upwards of 250,000 people in the U.S. die each year because of medical errors. Good people, smart people—epidemiologists, clinicians, data wizards—quibble about numbers and methodologies, parse denominators, and argue about the precise incidence of serious harm and loss of life. But key takeaways emerge from any analysis: the way health care is organized and delivered poses a serious threat to our collective well-being.
Today’s findings are not materially different from those that drove two seminal reports, “To Err is Human” in 1999 and “Crossing the Quality Chasm” in 2001, published by what was then called the Institute of Medicine (IOM). It’s not lost on me—the shop teacher’s daughter part of me—that profound deficits in our collective ability to deliver reliably safe care have persisted for so long that the names of esteemed institutions charged with driving improvement have, themselves, changed.
“Shop” or industrial arts, for readers not familiar with this course of study, used to be taught in public schools in the U.S. It’s the class where students learned to cut wood on a bandsaw and pour molten metal into molds and take engines apart. Shop teachers expected their students to protect things of value—things like fingers and FAS-grade hardwoods—by following procedural rules. Shop was full of expectations like “measure twice, cut once.”
But my dad expected far more. He was a man who listened carefully to stories of “things gone wrong,” a man who shook his head at things set in motion by the sometimes-sketchy choices of fallible humans. “Fail to plan, plan to fail,” he’d say after pulling a car out of the ditch for a neighbor who had delayed putting on snow tires. Not to the neighbor, of course, but at the dinner table. I grew up anticipating my dad’s questions: Did you fall short because you didn’t have a sound plan? Or did you fall short because you didn’t execute a sound plan well?
Health care stakeholders must similarly consider these fundamental questions about the progress that's been made and the gaps that remain nearly twenty-five years after the IOM reports. Our nation's early efforts to improve the safety of care were midwifed by health care policy experts, physicians, caring clinicians, and academics tired of seeing the good intentions of dedicated people go wrong.
In the early days of patient safety, seasoned clinicians like me learned our cultural norms were problematic. We held erroneous beliefs about human fallibility, believing that dedicated, educated, positively motivated people would not err. These faulty notions gave rise to care delivery systems that did not reasonably anticipate and correct predictable human errors before they reached patients and caused harm. And our systems of workplace justice fueled these faulty assumptions by punishing people for the wrong things, namely, for being human and failing at rates that were wholly predictable given the design of the systems they relied upon.
Many good things have come from early work in patient safety. Resident physicians aren't quite as tired as they once were. Barcode scanning makes medication administration safer in some settings of care. Evidence-based bundles had virtually eradicated a number of health care-associated infections. Many of these bundles fell apart, though, during a global pandemic, when clinicians battled COVID on two fronts: fiercely trying to prevent harm to their patients while trying to prevent harm to themselves. And we still struggle with fair accountability in health care workplaces.
This brings me to aspects of “A Transformational Effort on Patient Safety” that surprised, even delighted, me. First, the report was released by the President’s Council of Advisors on Science and Technology, a 28-member group of multi-industry thought leaders charged with recommending policy that makes the best use of science, technology, and marketplace innovation to optimize the well-being of the populace. This is not a plan informed by a “Physician, heal thyself” ethos. The expertise, influence, and vision of this group are evidenced by important changes to the project’s scope: to advance the safety and well-being of not only patients but the health care workforce as well.
Second, the shortlist of recommendations drafted by PCAST’s Working Group on Patient Safety—one that tapped the expertise of renowned clinicians, social scientists, innovators, and patients—is promising. Among specific line items, the report calls for Just Culture to advance patient and clinician safety in health care systems. Rooted in the disciplines of jurisprudence, systems engineering, human factors, and behavioral economics, Just Culture is a values-supportive model of workplace justice. It’s something that’s been missing from many first-generation improvement plans.
The tenets of Just Culture didn’t come from my dad’s shop class, but he would like them. He’d like the justness of a model that appreciates the importance, the necessity of procedural rules yet acknowledges that upholding a greater good or a competing value might cause someone to break them. And he’d like a model of workplace justice that can take on substantive topics—like whether girls get to take shop and the adequacy of the budget to support desired outcomes.
Just Culture is about shared accountability. It holds individuals accountable for the quality of their choices within a framework that considers our human fallibility and what’s known about human performance in complex systems. It holds organizations accountable for the design of systems and appropriate surveillance of mission-critical elements. It’s reassuring to see workplace justice included in PCAST’s urgent call to action. We—patients and providers—need a better plan.
Barbara L. Olson is a nurse and chief clinical officer, The Just Culture Company. In this role, she supports health care clients in planning and sustaining Just Culture as a system of workplace justice. She can be reached on X @safetynurse.