Like most medical trainees my age, I step into the hospital with the entire knowledge base of the world in my pocket. I have apps on my phone to look up disease symptoms and diagnoses, reference drug doses and side effects, estimate the 10-year risk of having a heart attack, determine the correct timing and type of vaccine administration and screening tests, and even display the risks, benefits, and statistics of various types of contraception.
Importantly, this incredible ability to access and share information quickly has changed not only how we practice medicine, but also how we teach and learn it.
The difference that technology has made in modern medical training was evident from the first day of medical school. Lectures are video-captured and can be played back over the internet at any time. Furthermore, our upper-class colleagues advised us of a breathtaking proliferation of internet-based resources, which often take advantage of videos, animations, and other multimedia to disseminate content. No longer are students routinely attending a lecture while taking notes in the margins of a thick, dense textbook; instead, we design our own learning plans using a buffet of resources.
This culture shift has led to something of a generational rift between current trainees and those who learned medicine in an earlier era.
Many of our outstanding faculty and physician-mentors trained in an era when a more hierarchical transfer of knowledge was the norm. I hesitate to speak for my senior mentors, but I can imagine that without the internet at your fingertips, the best sources of knowledge are the teacher at the front of the room and the textbook on your desk. Any deemphasis of the lecture could, unfortunately, also be seen as a devaluation of the expertise and years of training that our senior professors have achieved.
In contrast, the entire premise of crowdsourcing is that everybody has the right and ability to contribute on an equal level — any hierarchical barriers are disrupted. In the age of Google and Wikipedia, my generation is accustomed to accessing information ourselves, quickly and in summarized form, and sharing it with others. The result is that there is less tolerance of the traditional lecture-plus-textbook format of learning; why place full reliance on only one person and one textbook, when the wisdom of millions of people is available at the click of a button?
The good news is that I firmly believe that this gap can, and will, be bridged.
We are seeing more and more examples of this here at Stanford and, from what I hear and read, at medical schools throughout the country. In-class sessions are beginning to avoid drilling material that can be referenced on a smartphone app or easily memorized using a deck of electronic flashcards; that material sits squarely in the wheelhouse of millennials' self-study habits. Instead, sessions are being crafted around conceptual explanations rather than facts, interactive learning that immediately challenges students to apply their knowledge, and chalk talks that avoid relying on stale PowerPoint slides.
On our end, my colleagues and I need to be careful not to close our minds prematurely to certain curriculum material just because it might be accessible to us in some other form. The internet is a great supplement, but not a perfect substitute, for a human being who has accumulated years of experience — and even the most self-sufficient learners can benefit from a trusted and experienced guide.
Nathaniel Fleming is a medical student who blogs at Scope, where this article originally appeared.