In the mid-1800s, Ignaz Semmelweis, then a physician at the Vienna General Hospital, proposed something outrageous: Doctors and medical students working in the maternity ward should wash their hands before delivering babies because doing so could reduce maternal mortality. For proof, he pointed to rigorous data he had collected showing that mothers attended by midwives, who traditionally washed their hands before deliveries, died at a rate five-fold lower than those attended by doctors.
He was met with incredulity and scorn from his medical colleagues, who ostracized him and denounced his findings. With no one believing his data, he obsessively and desperately tried to convince anyone who would listen and eventually spiraled into depression. He was committed to an asylum, where he died an undignified death from sepsis, the very illness he had tried to prevent.
Imagine that: Sterility, now considered foundational to proper medical care, with hand sanitizers mounted every few feet in hospital hallways, took decades, and the findings of Louis Pasteur and Joseph Lister, to finally win over the medical community.
More than a century and a half later, medicine is still stubborn. It often takes decades for a finding to be translated into policy and then into a change in practice. In fact, it wasn't until the early nineties that clinical evidence was systematically used to inform clinical decisions, an approach we now call "evidence-based medicine" and the gold standard in patient care.
Medicine is risk-averse, and rightfully so. Its history is littered with examples of findings that were too brashly accepted and led to widespread harm and death: the morning sickness drug thalidomide, for example, left tens of thousands of babies with deformed limbs. So naturally, the best way to do no harm is sometimes to take no action at all. It's why we still rely so heavily on the same antibiotics for infections, insulin for diabetes, and X-rays to look inside our bodies, discoveries that were all made around a century ago. They're still effective, for now. But current antibiotics are quickly losing their potency, many diabetics still have to poke themselves daily with a needle, and patients are still exposed to harmful radiation.
I’ve spent more than half a year straddling the divide between medicine and science by working in a translational cancer lab, and it only recently occurred to me that the likelihood of our findings reaching the clinic is remote, and not because our science is lacking. In fact, our data are promising and our results convincing. Our work may someday offer patients a less invasive way to screen for various cancers. But if something as basic as hand washing took decades to be accepted, how long, if ever, might ours take?
Science and medicine are seemingly at constant odds, and I feel lucky to have been on both sides during this year, joining in on the push and pull between scientists and clinicians who argue over where the boundary lies between cruel inaction and brash risk-taking. The scientist in me says that a large swath of medicine urgently needs updating, but the clinician in me reminds me that medical progress must remain slow and methodical for the sake of our patients’ safety.
Yet in spite of the push and pull, we find ourselves with lifespans at an all-time high, diseases being eradicated, and maternal deaths on the decline. So maybe it doesn’t matter whether it’s physicians or scientists who will win this debate, but that these two sides continue to push forward despite resistance.
Steven Zhang is a medical student who blogs at Scope, where this article originally appeared.