What is the best way to choose residents?

On Twitter a while ago, a medical student asked me how surgical program directors select new residents. Then a discussion arose among some academic surgeons on the same topic. Someone suggested that medical school grades were the best way to tell whether an applicant would be a successful resident.

The fact is that we aren’t really sure what the best way to choose residents is.

First, here’s what we really do.

A 2011 paper from the Journal of Surgical Education reported on a survey of general surgery program directors, associate program directors, and department chairs, with 262 (65%) responding.

USMLE Step 1, used by 37% of programs, was the most common applicant screening criterion; USMLE Step 2 was second at 24%, and graduation from an LCME-accredited US medical school was third at 15%. The least important criteria were previous research experience and publications.

Final selection criteria were assessed using a Likert scale. The number one factor was the interview followed by the USMLE Step 1 score, letters of recommendation, and USMLE Step 2 score. The least important factor by far was whether an applicant had done a preliminary year. Research, publications, and a previous rotation at the institution also ranked near the bottom. Class ranking, the dean’s letter, and surprisingly, Alpha Omega Alpha status were in the middle.

Responses came from university programs (49%), university-affiliated hospital programs (38%), and independent community hospital programs (13%). The average number of applicants per program was 571.

The problem is that proof of the value of the above methods of selection is lacking. A paper from Academic Medicine in 2011 reviewed nine studies of USMLE scores and resident performance and found no correlation between those scores and the acquisition of clinical skills by students, residents, or fellows.

A meta-analysis of 80 studies and over 41,000 participants from the journal Medical Education in 2013 found that USMLE Step 1 scores and medical school grades were associated with better resident performance. However, if you eliminate studies showing that better USMLE scores led to better scores on in-training exams and passing licensing tests, only two studies found that USMLE scores correlated well with subjective ratings of residents.

What about grades?

The authors pointed out that, “These data could potentially be more useful to program directors if grading systems across medical schools were standardized.”

I said the same thing in a previous post.

A study of 348 categorical general surgery residents at six West Coast residency programs looked at resident attrition or need for remediation. The need for remediation was associated with receiving a grade of honors in the medical school surgery clerkship and with slightly but statistically significantly lower USMLE Step 1 scores. For example, PGY-1 residents needing remediation averaged 225 on USMLE Step 1 vs. 232 for those not needing remediation.

A major issue is the fact that we don’t have good data on the clinical performance of surgical residents or graduates of training programs. I know from personal experience that a good USMLE score or a high score on the surgery in-training exam had a “halo effect” when it came time for faculty to evaluate overall resident performance.

According to the Wall Street Journal, some businesses are asking job applicants of all ages “to provide SAT or ACT scores, results from graduate-school entrance tests and grade-point averages along with their work history.”

Google, a successful company by any measure, does not care much about grades. Here’s what Laszlo Bock, senior vice president of people operations, had to say in the New York Times about employee performance: “One of the things we’ve seen from all our data crunching is that G.P.A.’s are worthless as a criteria [sic] for hiring, and test scores are worthless — no correlation at all except for brand-new college grads, where there’s a slight correlation. Google famously used to ask everyone for a transcript and G.P.A.’s and test scores, but we don’t anymore, unless you’re just a few years out of school. We found that they don’t predict anything.”

But the relationship between college grades and performance at Google isn’t the same as the relationship between med school grades and performance as a surgeon. Or maybe it is.

I suppose one could argue that applicants for residencies, who are recent graduates of medical schools, would fall into Google’s “slight correlation” category.

I know one thing: I don’t know how to select applicants who will become good surgeons. Do you?

“Skeptical Scalpel” is a surgeon who blogs at his self-titled site, Skeptical Scalpel.


  • rbthe4th2

    Would a manual dexterity test help? The ability to concentrate and lead thru a lot of craziness?

    • Skeptical Scalpel

No. Here’s a link to an article about a recent paper showing that a manual dexterity test (soap carving) was not effective at predicting surgical performance. It’s on Medscape, which requires registration, but it’s free. http://www.medscape.com/viewarticle/819921

      • rbthe4th2

I would think soap carving is more brute force than trying to manipulate a scalpel. More like how do you deal with human skin, some sort of dissection type of exercise?

        • Skeptical Scalpel

          Soap carving is done with a scalpel. A bar of Ivory soap is used. A reproducible model of skin or dissection has not been studied. Anyway, manual dexterity tests do not predict resident performance. There’s much more to becoming a surgeon than manual dexterity.

  • EmilyAnon

    If a trainee is observed to rank low in manual dexterity or fine motor skills during residency, is it assumed they can improve?

    • Skeptical Scalpel

I think everyone can improve with practice. The question is, can they improve enough? My experience is that most can if they put in the effort.

      • rbthe4th2

        What about doing open vs. lap? How can one improve on that?

        • Skeptical Scalpel

          Both can be improved upon with practice. Supposedly simulation works well for laparoscopic practice. I’m not 100% convinced.

  • Patient Kit

    Is there any correlation between the generation who grew up playing complex video games from an early age and the ability to master the more technological aspects of surgery like robotic surgery? Not that I’m implying in any way that that’s all it takes or that every kid who played video games can become a good surgeon. But is it helpful for those who have other qualities it takes? I imagine it would be helpful to have the personality of Mariano Rivera if you want to be a surgeon. Totally clutch and able to perform calmly and excellently under extreme pressure. There are only so many Mariano Riveras. Only one, actually. Yes, I am a Yankees fan. I appreciate excellence in both my surgeons and my baseball team. ;-)

    • Skeptical Scalpel

      There have been papers claiming that video gamers make better laparoscopic surgeons. Yes, to have an attitude like Mariano Rivera would be a good quality for a surgeon. Mariano never got angry, got over his rare bad outings quickly, and usually put in a masterful performance.
