We developed and implemented an active learning session for Internal Medicine clerkship students on antibiotic utilization, with deliberate incorporation of student feedback to modify future sessions. In our study, the use of a PDSA cycle, with iterative modifications based on qualitative student feedback, led to more favorable student ratings of the session format, self-assessed understanding of antibiotics, and their appropriate clinical application. Narrative comments also improved, with associated changes in the constructive themes. Even at baseline, student satisfaction with the “active learning” format of the presentation was high (97% of respondents strongly agreed or agreed that they enjoyed the “active learning” format of the session). Our findings are consistent with the growing literature on gaming in medical education, which suggests high levels of enjoyment and engagement [12], particularly for sessions designed to include opportunities for accomplishment, discovery, and bonding [13] – all of which were incorporated into the instructional design for this session.
Despite the high baseline satisfaction, we introduced intentional modifications to the session format based on student feedback. We removed the first part of the session (student completion of an antibiotic chart) in order to allow more time for the clinical case discussions, altered the means by which groups submitted their answers (raising their hands rather than placing index cards in a paper bag), and improved the organization of the handouts. Following each intervention, student evaluations were used to assess whether the changes improved the session and to identify additional areas for improvement. Of note, Intervention 1 was purposefully implemented only after six sessions of observation. Because this was a novel educational session, we felt it was important to ensure a robust baseline analysis in the “Study” phase before reengaging in the “Act” phase of the PDSA cycle.
The quantitative data showed statistically significant improvements over time for both learner enjoyment of the session format (baseline vs. Intervention 3; p = 0.043), and learner-rated understanding of antibiotics and their appropriate clinical use (baseline vs. Intervention 3; p = 0.012). Moreover, the content of the qualitative comments evolved over time, such that the only constructive theme identified in the final phase was a request for the session to be longer. Both the quantitative and qualitative data suggest that the iterative modifications, driven by student feedback, were successful in addressing student concerns and improving the quality and impact of the session.
Of note, the quantitative data did show a decline during the Intervention 2 phase; the data from this phase represent a cohort of students who were on their first clinical rotation and may thus have felt less comfortable with the session format and content than students who were further along in their clinical training. For example, one student commented, “Some of the content was a little above my level of training; however this could be b/c of my lacking in understanding.” The qualitative data from this group nonetheless indicated a high level of satisfaction and reflected the successful incorporation of the previous session improvements. As one student commented: “Thanks for putting together the chart- super helpful” – directly reflecting the impact of Intervention 1 (removal of the chart completion exercise).
Several student comments centered on the time allotted for the session, requesting more time for case discussions; however, the instructors were not able to lengthen the session time allotted due to the limitations of the Internal Medicine clerkship didactic structure. Overall, the strengths of this study include the high response rate and multimodal data collection (quantitative and qualitative), which allowed the faculty to implement and assess the impact of iterative changes to the session format based on student feedback.
This study has several limitations. Each student participated in the session only once (as part of the Internal Medicine clerkship), so the evaluation data reflect different cohorts of students; the change in results over time may therefore be due to respondent variability rather than to the interventions. Finally, assessment of the session's impact in achieving the learning objectives relied on learner self-ratings rather than an independent knowledge assessment.
Future directions for this work may include the incorporation of additional sources of data, such as a standardized assessment of knowledge and application of core concepts in antibiotic utilization, and expanding the time for the session as suggested by learners in multiple phases of the study.