Somehow, with the changeover to the latest release of our electronic medical record, something happened to some of our patients' demographic information. Suddenly, an enormous number of patients were labeled as choosing not to disclose their gender.
When I first noticed this, it was flagged in bright yellow on the banner in a patient’s chart when I was seeing her for a routine office visit. I’ve been taking care of her for over 20 years, and know for a fact that her gender has never been in question, and she was listed appropriately as female during her last office visit with me. It was right there in all the prior notes.
I asked her during the appointment if someone at the front desk had asked her what her gender was, or any other new questions, and whether she had chosen not to disclose for some reason, but she looked at me like I was from another planet. “Why would I not tell them I was a female?”
In her late 90s, with several fully grown children and many grandchildren and great-grandchildren, she actually laughed at the concept that somehow our electronic health record (EHR) suddenly didn’t know what her gender was.
We tried to figure out what happened, since this seems to have occurred sporadically but widely across our practice with the changeover, and it doesn't seem to have anything to do with the new fields for gender identity, sexual orientation, or preferred pronouns, or even with a new way of entering these items into the updated system. It just happened.
In our faculty meeting earlier this week, many providers reported that this hiccup had in fact caused glitches in the functionality of the EHR, including many of their male patients now being listed as (long) overdue for mammograms and Pap smears. One of my partners even got a message from a patient who was quite upset to have received a notification through the patient portal that he was overdue for his Pap smear. In the long run, this likely won't be a big deal, and I'm sure someone somewhere is working on cleaning this up, retrieving the previously entered data, and making the records right again.
One of the challenges we've had with the electronic health record when it comes to our patients' demographic data has been getting the right information entered, in a way that actually lets us take better care of them.
In older versions of the EHR, sexual orientation and gender identity (SOGI) data could not be collected in any systematic way; there was no place to enter preferred pronouns or names; and only certain people had access to the system areas for entering race or ethnicity data.
Something as simple as preferred appellations has always defaulted to “Mr.,” “Ms.,” and “Mrs.,” teed up in correspondence by a link to the gender data field. The staff will often try to find workarounds, adding “Dr.” or “MD” onto people’s names, but this just looks like a mess when the computer spits it out: “Dear Mr. Dr. Fred N. Pelzman MD.”
Having the correct information is important not only for a patient’s dignity, but also for making sure that we are able to take the best care of them possible. When researchers are trying to study social determinants of health and their impact on patients’ health and the health care delivered to populations, having the right information matters.
This has long been a challenge, because patients are often quite sensitive when asked at the doctor’s office about their race, ethnicity, preferred language, pronouns, sexual orientation, gender, and various socioeconomic indicators. Asking in the right place, at the right time, with the right motivations, and reassuring our patients that this is being done for the right reasons, is the only way to gather the information needed to further improve the care of our diverse patient populations.
If a patient’s preferred language is entered incorrectly, then they’re going to start getting announcements, results, educational materials, and reminders from the electronic medical record in a language they might not understand. And researchers studying certain populations might include people who shouldn’t be included, thus diluting the results of their research and its potential impact on those who might need it the most.
This is obviously delicate ground, and we need to make sure that we do this in the most sensitive and appropriate way, always respecting our patients’ privacy, and only doing what’s absolutely necessary to further their own health, not our interests.
So when the system flips a switch somewhere and everything changes, it makes me feel like we didn’t approach this the right way; we were off once again, heavy-handed and trying to take a shortcut that only ended up muddying the waters.
I know for a fact that many interested parties in our community and among the software developers are, with the best of intentions, working incredibly hard to ensure that the care our patients receive is equitable and respectful, and that to the best of our abilities we always do right by them.
We can, and we must, do better.