Will video replays lead to safer surgeons?

Just as elite athletes are born with exceptional talent, elite surgeons and doctors in other procedure-based specialties possess innate abilities that others do not.  Yet surgical skill is difficult to quantify.  Certainly, outcomes data can be collected and reputations form over time, and years of training allow truly gifted surgeons to develop and perfect their craft.

However, all surgeons are not created equal.  During training, residents and fellows learn by watching the senior staff.  As they progress, they begin to perform procedures under guidance, and by the end of training they are working independently with minimal oversight.  Once training is over, most surgeons have little or no structured opportunity to continue improving their skills.  So, how can we best evaluate surgeons and allow patients to make more informed decisions?

For patients, it can be difficult to choose a competent surgeon.  The New York Times recently discussed how a patient might best evaluate a particular surgeon’s skill.  Surgery can be life-saving in certain situations, but every procedure carries inherent risks.  The complications associated with a particular procedure are issues that patients must consider when choosing a doctor.  The best physicians have learned to minimize complications and are also adept at dealing with them quickly and effectively when they do occur.  Certainly, metrics such as board certifications and memberships in professional organizations (such as the American College of Cardiology) can provide some guidance.

However, most measures of surgical ability are purely indirect: board exams containing multiple-choice questions and oral exams just aren’t enough.  In residency and fellowship, a trainee can complete all of the requirements of the ACGME and be declared graduated, even with substandard surgical skills.

Now, a new study published in the New England Journal of Medicine explores a more direct and objective way to evaluate surgical skill.  Previous studies focused on what surgeons do before or after surgery in the care of their patients; very little attention was paid to what actually happens in the operating room.  In the new study, researchers assembled a panel of expert surgeons to evaluate 20 other surgeons’ ability using videotapes of the same surgical procedure recorded in the operating room.

The researchers found a large variance in skill: the evaluators noted that the lowest-rated surgeons had skills similar to those of trainees, while those at the highest end of the ratings were considered “masters.”  For the first time, a study demonstrates what has been intuitive for years: the dexterity of a surgeon makes all the difference in outcome.  Surgeons rated in the lowest quartile took 40 percent longer to complete their procedures and had much higher complication and mortality rates.  Moreover, those in the highest-rated quartile had much lower rates of readmission and re-operation.

In addition to evaluating skill through video review, another very reliable source of information is the opinion of the nurses and support staff who work with surgeons on a daily basis.  Experienced OR nurses are very good at rating the talent of the operating physician; they quickly recognize gifted hands and can just as easily identify those who lack them.  However, there is no mechanism in place for staff to provide this feedback to a particular surgeon.

As we continue to work towards health care reform, assessing the skill and effectiveness of physicians will be an important part of cost containment.  Significant complications and negative outcomes are costly to both the patient and the health care system as a whole.  Objectively evaluating surgical ability may transform the way patients and insurers choose physicians to care for themselves and their families.

As physicians, we have a responsibility to provide the very best care for our patients.  We must use every tool possible to ensure that we continue to improve our skills as we progress in our careers.  Evaluations such as video observation should be incorporated into training programs and may also play a role in continuing education for physicians throughout their careers.  Ultimately, we must protect patients and improve outcomes: primum non nocere.

Kevin R. Campbell is a cardiac electrophysiologist who blogs at his self-titled site, Dr. Kevin R. Campbell, MD.


  • Thomas D Guastavino

    The best way to incentivize a surgeon to maintain their skills is to maintain the patients’ freedom of choice, something that is systematically being taken away for financial and political reasons.
    On the lighter side, if I had to choose a surgeon I would choose one with the most abrasive personality. That surgeon has to be getting by on their skills and not their charm.

  • Daniel McDevitt

    Is this being proposed as a replacement for, or an addition to, current “quality” indicators? As surgeons, we are vetted, licensed, monitored, overseen, vexed, harassed, stymied, annoyed, and otherwise abused by a variety of entities that have the stated mission of “improving” our performance. Yet, somehow, we still need more of it. The initial premise that surgeons have little opportunity to improve after training ignores the fact that people learn from repetition and honing of skills over a lifetime. No one thinks that high performing musicians, for instance, never improve after their final lesson.

    I am not against making better surgeons. There is absolutely no level 1 evidence that board certification and state licensure predict any competence whatsoever. If this method produces better results, then it should replace something that doesn’t. Like MOC, for instance.

    • rbthe4th2

      There is a surgeon I know of who was capable of doing easier surgeries, but when it came to the harder stuff, didn’t improve with age. Sadly … was a great guy. So maybe a personalized approach would help, rather than the straight licensing?

      • Daniel McDevitt

        That’s a fair point. I think we all know a surgeon who isn’t quite up to par. In fact, if you don’t know one, maybe it’s you! (humor)

        The fact is that surgery as a profession has been inundated of late by seemingly elegant but poorly validated processes for identifying quality. But quality itself is difficult to measure. What exactly constitutes a well done appendectomy, for instance?

        When we convert limited data into general rules, we potentially narrow our scope of investigation for better outcomes. Are we really deciding what the “correct” way to do something is or just the “acceptable” way? There is a fundamental difference between the two.

        History is full of “best evidence” and consensus dogma that have later been proven wrong. Obviously we do not have perfect information and must make choices as best we can given our human limitations, but I grow weary of evermore demands to prove myself.
