Why rural hospitals are fighting a losing battle

Rural hospitals are fighting for their lives. Over the past five years, more than 40 rural facilities have closed their doors due to lack of funding. And because the majority of their funds come from Medicare and Medicaid — two government programs facing potential cutbacks in 2015 — many rural hospitals may be fighting a losing battle.

Understandably, small-town residents fear hospital closures or downsizing may leave them vulnerable when serious illness strikes. But the reality is patients in rural communities often don’t receive optimal care from their local hospitals. In fact, critical access hospitals in rural areas experienced increased death rates from 2002 to 2010 while mortality rates fell in other hospitals.

It’s time to face facts: Complex medical care provided in small, community hospitals isn’t necessarily the best thing for people in low-population communities. While there’s no perfect solution, one viable alternative is emerging some 7,000 miles away near the battlefields of Afghanistan. The U.S. military recently changed the way it cares for wounded soldiers during wartime. It achieved major reductions in mortality as a result. And it could offer rural areas a road map for improving care.

But first, let’s examine how our nation’s rural facilities arrived at this critical point.

Rethinking rural hospitals

For 70 years, the strategy for financing and building hospitals in rural and underserved areas was shaped by the landmark Hill-Burton Act. Passed by Congress in 1946 and signed into law by President Truman, the legislation sought to address the shortage of hospital beds and lack of inpatient care options in America’s small towns. Hill-Burton funding fueled the growth of community hospitals starting in the 1950s. Fundamental to Hill-Burton was the belief that residents of rural and low-population areas were best served by local community hospitals, no matter how small.

For a long time, that belief made sense. Medical treatment in the ’60s and ’70s was fairly straightforward. Hospitals provided mainly supportive or uncomplicated care to patients who survived heart attacks, strokes or other acute medical problems.

But times changed and so has medicine. Doctors can now unblock blood vessels to the heart and brain, reversing life-threatening acute episodes. Today’s highly trained specialists are now successfully treating medical problems once considered untreatable. But these improvements aren’t universal.

Studies consistently show medical specialists who regularly perform complex procedures achieve better clinical outcomes than physicians who only do them occasionally. What’s more, distance no longer has the same negative impact on medical care it once did. Advances in video conferencing and data gathering allow specialists to provide immediate expertise, regardless of their location. Additionally, transportation options — once considered slow and unreliable in rural settings — are faster and safer, making ambulance or helicopter transport feasible for all but the most critically ill patients.

These advancements in medical practice further widen the gap between America’s leading hospitals and the kind of care available to many of the 60 million people living in rural areas. The truth is many hospitals serving low-population areas don’t have the patient volume or specialists to manage the breadth of complex medical conditions they encounter today.

That’s why it’s time to change the way rural Americans receive care. And it’s time to seek solutions in unexpected places.

Lessons from the battlefield

There was a time when rural doctors and military doctors took similar approaches to caring for patients with life-threatening conditions. Both attempted to do as much as possible for a patient as close as possible to where the injury or illness occurred. Both believed that providing definitive treatment sooner would yield better results than transporting the patient to another facility.

While many doctors in rural hospitals still uphold this line of thinking, military medicine has changed course.

Military physicians discovered that they could dramatically improve clinical outcomes by stabilizing wounded soldiers before transporting them to high-volume hospitals with state-of-the-art facilities and specialty physicians. And despite the added time needed to transport patients, the military reduced soldier mortality from 24 percent during the Vietnam War to 10 percent in Afghanistan.

This positive experience raises the question: Why can’t rural hospitals adopt this approach?

What if rural facilities were used for the kind of routine care and simple procedures that generalist physicians and nurses can safely provide while designating regional hospitals for more complex, specialty care?

With today’s video technology, a remote specialist can immediately evaluate a patient and initiate care prior to transport, minimizing delays in treatment. After preliminary testing and stabilization, patients could be safely transported to an operating room in a regional hospital for treatment mere minutes after arrival.

Unfortunately, there’s no 100 percent safe way to address the needs of soldiers on the battlefield or people living in rural areas. Some patients will die in transit who might have survived with more immediate care. But overall, more patients will die in sub-optimal hospitals than during transport to state-of-the-art facilities with the best doctors and nurses.

If our goal is to save more lives, we as a country should invest in 21st century technology, communication, and transportation — not in building more rural hospitals. If we follow the military’s lead, the death rate in rural areas is likely to decline just as it did for soldiers on the battlefield.

Robert Pearl is a physician and CEO of The Permanente Medical Group. This article originally appeared on Forbes.com.
