Artificial intelligence (AI) is all the rage right now in medical news media, and that has many practicing physicians, and even medical students, concerned. Will AI make diagnosis-heavy specialties such as dermatology and radiology obsolete? Can AI give rise to new medical specialties? How many tasks traditionally done by doctors will now be handled by AI?
I believe there is reason to be worried. We do not have any regulatory laws for AI. Moreover, once hospital systems across the nation start implementing AI in medical practice, who will oversee it? How will physicians who have practiced medicine for decades feel about incorporating AI into their day-to-day practice? Will this become an unstated requirement? Or will it be something like the electronic health record, where a doctor can opt out at first and adopt it later if they feel it will make patient care and operations more efficient?
Some of us are fearful of AI stepping into the human-driven world of medicine. Our emotional and cognitive intelligence is the accumulation of years, even decades, of experience since the day we were born. AI can acquire knowledge and information in a fraction of that time and, most of all, can quickly adapt to changing circumstances. AI can learn, like us. That’s terrifying, but at the same time, it is fascinating.
If you are a med student or a physician, take a step back and look at what’s around you. Your cell phone holds medical apps you use every day: UpToDate, Epocrates, MDCalc, Doximity, Figure 1, Human Dx, Medscape, and UWorld. At many hospitals, you will see the occasional robot moving from room to room and floor to floor, delivering medications, lab samples, and other essential items to medical staff. There are even portable, handheld ultrasound devices that connect to your smartphone via Bluetooth. The point is that we are entrenched in technology, and we are already comfortable with the assistance these resources give us. AI will be the ultimate resource, and it is here to stay.
I remember when Elon Musk discussed the risks of AI on Joe Rogan’s podcast; he has also warned Congressional leaders about those risks time and time again. I think Elon is right: there are always risks when something new and powerful is set to transform how things have always been done. Here, of course, the question is whether the relationship between AI and medicine will be a symbiotic one.
I firmly hold that AI will enrich health care and ultimately help fulfill the quadruple aim of medicine: enhancing the patient experience, improving population health, reducing costs, and improving the work life of health care providers. In doing so, AI will help minimize burnout, reduce the rate at which residents and practicing physicians suffer moral injury, and lower the rate of physician suicide. Let that sink in.
As a society, we picture doctors being paid handsome salaries, driving nice cars, and living in homes only a kid could dream of. We hold doctors to be impervious to the stresses of medicine and expect that, when confronted with difficulties and obstacles, they should not complain and just push through. No! The process of burnout starts as early as medical school and accelerates in residency training. Between 300 and 400 doctors end their lives every year, and the physician suicide rate is notably higher than that of the general population. According to figures presented at the American Psychiatric Association’s 2018 annual meeting, the suicide rate in the general population is 12.3 per 100,000; among physicians it is 28 to 40 per 100,000. This is outrageous and damaging to the medical profession, and we need to help doctors now as much as they have helped you and me in our most trying times.
This is where AI comes in. When we want to treat and have face time with our patients but instead must complete an endless number of tasks to satisfy hospital administration, is that right? No. We need to streamline our health care system, and if AI is what it’s supposed to be, tasks such as ordering labs and imaging, scheduling appointments, and calling services for consults will all be handled by an AI-driven system. The physician’s only step would be to review the AI’s suggestions and approve or reject them. That’s simplifying things, but I hope you get my point.
I am in no way saying that AI is a cure-all for burnout, because it certainly is not, but it would be a monumental step forward.
My call to action is this: I urge hospital administrators and those in leadership roles to carefully consider every aspect of integrating AI into health care. It must not only benefit patients significantly, improving their care in and out of the hospital, but also contribute to the sanity and well-being of the physicians who provide that care.
Ton La, Jr. is a medical student and student editor, The New Physician.