I had an upsetting encounter the other day with a 22-year-old woman who mentioned, as an aside to the actual purpose of her visit, that she was pretty sure she had breast cancer.
Why did she think that?
She’d found a lump in her breast.
(Somewhat unusually for the specific setting, she let me do a breast exam. All I felt was a small area of lumpy breast tissue, possibly a fibroadenoma at worst. Of course, I would recommend ultrasound and possibly excision, but I wasn’t acting in the capacity of her primary care physician.)
Had she seen a doctor about it?
No.
Why not?
The answer that sent my jaw to the floor:
Cancer means you’re going to die, and the only thing the doctors want to do is give you chemotherapy to make you as miserable as possible for as long as possible, so they can make as much money as they can off you.
No, she was not being sarcastic, or ironic, or anything but sadly sincere. I was taken completely aback.
I pulled my chair around to the other side of the desk and asked her to give me her hands, which I held tightly while gazing straight into her eyes. As sincerely and seriously as I possibly could, I told her that no, that is not what doctors do. At least not any of the ones I know, and I know a lot of cancer doctors, every last one of whom is a shining example of the warmth, caring, and compassion to be found in the medical profession.
And saddest of all: I don’t think she believed me.
So yes, apparently in this day and age, there is still a widespread belief that:
- Cancer is uniformly and universally fatal.
- Doctors inflict suffering for the sole reason of making money.
- Making money is the only reason doctors treat cancer.
What can one little dinosaur do? Other than hold someone’s hands, gaze deeply into their eyes, and pour my soul into every word, not much. I can only hope.
Lucy Hornstein is a family physician who blogs at Musings of a Dinosaur, and is the author of Declarations of a Dinosaur: 10 Laws I’ve Learned as a Family Doctor.