Ebola is in the United States! Everybody (please don’t) panic! Quarantine all Texans! Though that might be a good idea anyway (just kidding). More on Ebola in general in another post if I have time.
First off though, we've found out more about the sequence of events leading to the hospitalization of the patient, Thomas Duncan. Apparently, he came to the hospital, told people he'd come back from Liberia, and was still discharged home. He was only admitted 2 days later, when his nephew called the CDC. The sequence is a fascinating case study in medical quality and system error.
A brief primer on system errors: a mistake causes harm only when a whole series of failures align to let it through. This is called the Swiss Cheese Model: each safeguard is a slice of cheese, and something bad happens only when the holes in every slice line up and something gets through.
Let me explain with a story: Imagine that you are driving on a freeway. The driver in front of you is in a new part of the state, and realizes that they need to get off at an exit that's rapidly coming up and wasn't properly marked. They brake suddenly. Now we come to you. Normally, you leave the appropriate 4-second following distance that we were all taught (ha ha ha), but today you're running late and you're just a few car lengths behind the car in front of you, going 60 mph on the freeway. And your mom just called, and you're distracted trying to answer. You end up crashing into the car ahead of you.
What happened? Multiple events aligned.
1. You were distracted with answering a call from your Mom. If you weren’t, you might have braked or merged into another lane in time.
2. You were late, meaning that you weren’t giving as much following distance as you usually do. If you had more time, the distraction of your Mom’s call wouldn’t have mattered.
3. The driver in front of you braked suddenly, without realizing how close the car behind him/her was. If they’d been paying attention to their rear view mirror, they might have been able to avoid the accident by not braking. But, they were likely distracted by being in a new part of the state.
4. The driver in front of you didn’t realize that the exit was coming up, because he/she was new to the area. If they’d known, they’d have merged comfortably much earlier and been fine. Again, they were new to the area.
5. The exit wasn't well marked, and the road the exit leads to has two names. The driver didn't realize that the signs 2 miles out referred to the exit they wanted; only the sign a quarter mile out (15 seconds at 60 mph) had both names.
The ultimate cause is that both drivers were distracted. The numbered points above are the many individual failures that stacked up, and underneath them sit the "root causes," like the poorly marked exit, that made those failures possible.
Of course, in a court of law, all of this is irrelevant; fault basically falls on your shoulders, since you rear-ended the car in front of you. But still, you can see how the issue is more complicated than that verdict suggests, and how it points to some possible solutions, starting with better signage.
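One way to see why layered defenses matter is to put rough numbers on that story. This is just a toy sketch; the probabilities below are invented purely for illustration, not estimates of real driving risk:

```python
# Toy illustration of the Swiss Cheese Model: the crash happens only if every
# independent layer of defense fails at the same time.
# All probabilities here are invented for illustration.

layers = {
    "driver ahead checks the mirror before braking": 0.70,  # chance this layer blocks the crash
    "you keep the full 4-second following distance": 0.90,
    "you are not distracted by a phone call": 0.80,
    "the exit is clearly signed well in advance": 0.95,
}

# The crash gets through only if every layer has a "hole" today.
p_crash = 1.0
for name, p_blocks in layers.items():
    p_crash *= (1 - p_blocks)

print(f"Chance that all four layers fail at once: {p_crash:.4%}")
# About 0.03% -- rare, which is why crashes feel like bad luck even though each
# individual lapse (running late, glancing at the phone) is common on its own.
```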
So, back to the sequence of events that led to Ebola in Dallas.
1. Thomas Duncan flew back to the U.S. from Liberia, and got through the airport screening by lying about having contact with sick relatives — and because he didn’t have any symptoms of Ebola at the time the airport screeners were checking. Here’s the first mistake. Now, we can talk about how Mr. Duncan lied, but that’s irrelevant: Of course people will lie to return to their home countries and get out of a partially quarantined nation. How can we make the system stronger? Why not require all people leaving the country to spend 4 to 5 days at the airport in isolation, being monitored for symptoms, before getting on a flight? In that time, fevers will (hopefully) develop, and travelers with Ebola can be detected.
Next, Mr. Duncan arrived home, and went to the hospital with fevers, nausea, and abdominal pain. He told the nurse that he'd been to Liberia, but again denied contact with sick people. The nurse documented the travel in the nursing history. The doctor examining Mr. Duncan didn't see that part of the history, because it was in the nursing section of the computerized medical record, which doctors don't routinely check. He or she thought Mr. Duncan's symptoms weren't too severe, and sent him home.
Many failures here.
2. Mr. Duncan lied again about sick contacts. His incentive for doing so is less clear this time. Perhaps a campaign to persuade the Liberian community and recent arrivals from infected areas that being honest and forthright is the fastest way to get the best Ebola treatment would be of use.
3. The nurse took the travel history, heard that the patient had recently returned from Liberia, and didn't inform the doctor personally. Here's the thing: nurses are not required to inform doctors about everything they do and document, but important findings are generally communicated to the physician in person. There is a lot of judgment and skill here; some nurses call the doctor about everything, slowing care down and distracting the physician from the important things. Some don't communicate enough, and the patient suffers because important symptoms are missed. The nurse here didn't recognize the importance of the travel history in light of the patient's symptoms. More education here might help.
4. The travel history didn't show up in the doctor's version of the EMR. EMRs in general are, quite frankly, terrible; almost no one likes the one they have, and nearly everyone has serious complaints. In this specific case though, the error was one of choice: not all the information that nursing collects has to be reviewed by the doctor. It can be, but it doesn't have to be. Vital signs, for example, are always shown. But the last thing the patient ate? Usually not relevant or useful. It can be checked if the patient needs to be intubated, but it's not routinely reviewed by the doctor. Every hospital and EMR company makes choices about what information shows up routinely where, and this hospital chose not to have the travel history show up. The hospital has since announced that it has changed this, and that travel history will now be routinely displayed.
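To make that "error of choice" concrete, here is a minimal sketch of how a field-visibility setting might work. This is hypothetical code, not the hospital's actual EMR; the field names and the PHYSICIAN_VIEW_FIELDS setting are invented for illustration.

```python
# Hypothetical sketch of an EMR display configuration. Nursing documents many
# fields; a per-hospital setting decides which of them surface in the physician
# view. None of these names come from a real EMR product.

nursing_documentation = {
    "vital_signs": "T 103.0 F, HR 96, BP 132/78",
    "last_meal": "rice and soup, ~6 hours ago",
    "travel_history": "returned from Liberia 4 days ago",
    "chief_complaint": "fever, nausea, abdominal pain",
}

# The original configuration: travel history is charted but never displayed.
PHYSICIAN_VIEW_FIELDS = ["vital_signs", "chief_complaint"]

def physician_view(record, visible_fields):
    """Return only the nursing fields the physician view is configured to show."""
    return {field: record[field] for field in visible_fields if field in record}

print(physician_view(nursing_documentation, PHYSICIAN_VIEW_FIELDS))
# travel_history never appears, even though the nurse documented it correctly.

# The announced fix amounts to a single configuration change:
PHYSICIAN_VIEW_FIELDS.append("travel_history")
print(physician_view(nursing_documentation, PHYSICIAN_VIEW_FIELDS))
```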
The patient went home, and exposed the people around him (including children) to Ebola for two more days.
So, what do we have here? An object lesson in system errors. Any one of four things could have stopped this: different procedures at the airport in Liberia, more openness on the part of people like Mr. Duncan, better training of nurses, or an EMR that put the travel history in front of the doctor.
Now, what "saved" the day and ultimately got Mr. Duncan admitted and his contacts isolated? A nephew, who heard what was happening and called the CDC, which then came down like a ton of bricks on the situation. Time will tell how much damage was caused by the 2 days of delay, but regardless, it's better than if Mr. Duncan had died at home with no care, and no one had known about his Ebola infection until the next patient (or ten) got sick.
This was a protective event, a slice of cheese where the hole didn't line up, and it is important to analyze these too. In the hospital every day, a doctor might prescribe a medication that is dangerous for a certain patient to receive. For example, aspirin is pretty safe for adults, but in kids it can cause Reye's syndrome, a dangerous disease. The pharmacist dispensing the medication will call the doctor to make sure that the physician really wanted that drug at that dose for that patient. Usually, the physician will confirm that the benefits outweigh the risks and tell the pharmacist to go ahead (aspirin is the best treatment for a condition called Kawasaki's disease in kids; otherwise, it's never given to them). But every now and then, the pharmacist's call will reveal an error and cause the doctor to change the order.
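For those curious about the mechanics, here is a minimal sketch of that kind of independent check, assuming a made-up rule table and order format rather than any real pharmacy system:

```python
# Hypothetical sketch of a pharmacist-style safety check: a second, independent
# layer that questions an order instead of silently filling it.
# The rule and data formats below are invented for illustration.

CONTRAINDICATION_RULES = [
    # (drug, applies_if, reason)
    ("aspirin", lambda patient: patient["age_years"] < 18,
     "risk of Reye's syndrome in children"),
]

def review_order(order, patient):
    """Return the concerns a pharmacist would call the prescriber about."""
    concerns = []
    for drug, applies_if, reason in CONTRAINDICATION_RULES:
        if order["drug"] == drug and applies_if(patient):
            concerns.append(reason)
    return concerns

order = {"drug": "aspirin", "dose_mg": 81}
patient = {"age_years": 6, "diagnosis": "Kawasaki disease"}

for concern in review_order(order, patient):
    # In real life this is a phone call, not a print statement. Usually the
    # physician confirms that the benefits outweigh the risks (as with Kawasaki
    # disease); every now and then, the call catches a genuine error.
    print(f"Call the prescriber before dispensing: {concern}")
```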
In this case, Mr. Duncan's nephew's phone call was a safety factor. How can we encourage similar actions by others? Why not a campaign in the Liberian and Sierra Leonean communities that encourages people to call on behalf of sick relatives recently returned from infected areas, to get them the best care faster?
This kind of error analysis is critical to quality improvement in any process, and nowhere more so than in medicine. While tragic and scary, this Ebola scare provides a valuable lesson in applying systems thinking to a practical problem, along with some suggestions for preventing something similar from happening again.
Vamsi Aribindi is a medical student who blogs at The Medical Intellectual.
Image credit: Shutterstock.com