This study examined whether additional practice with written clinical cases, combined with reflection and immediate feedback (experimental group), during a traditional dermatology elective could improve medical students' ability to evaluate skin lesions, compared with a didactic lecture-based approach (lecture group) and the dermatology elective alone (no intervention group). We found that our educational intervention increased diagnostic accuracy among students enrolled in dermatology electives for the training set only. The effect sizes were large: 1.2 SD and 1.5 SD compared with the lecture and no intervention groups, respectively.
Most studies in medical education have compared an educational intervention with no intervention. However, effect sizes tend to be overestimated under these conditions [19]. Our results suggest that practicing written clinical cases with reflection and feedback yields additional benefits beyond a traditional dermatology elective. According to a meta-analysis, a dermatology elective is the most effective way to improve diagnostic accuracy for skin lesions [15]. However, dermatology electives are not available to all medical students; for example, only half of the medical students at SNUCM can attend them. Furthermore, students' abilities after electives often remain unsatisfactory [20]. A study in the UK showed that medical students were unable to identify 67% of common skin lesions after completing dermatology electives, even though overall diagnostic accuracy substantially improved from 11% to 33% [8]. Therefore, more effective educational strategies to improve clinical reasoning in dermatology are needed, even for students who have the opportunity to attend dermatology electives.
In this study, our experimental intervention improved scores for the training set only. In other words, students in the experimental group appeared to develop clinical reasoning skills only for the diseases they had practiced in the training session. These findings are consistent with the view that clinical reasoning skills develop in a disease-specific manner rather than through improvement of a general clinical reasoning ability [10].
We did not find any additional effect of the didactic lecture compared with the dermatology elective alone (no intervention group). The clinical cases used in the lecture group were identical to those used in the experimental group, and the duration of the educational intervention did not differ between the groups (2 hours each). In this study, students attended the outpatient clinic when they were not receiving the extra educational intervention. Therefore, any difference between the groups can be attributed to the difference in educational interventions.
A lecture (didactic approach) is the most frequently used educational intervention for enhancing participants' ability to diagnose skin lesions, and it has a medium effect size compared with no intervention [15]. For the control set, we also found that students' final scores improved compared with their baseline scores. However, students in the lecture group did not achieve better diagnostic accuracy than students in the no intervention group. These findings could be partly explained by the following: (1) students assigned to the no intervention group still received an educational intervention, because all students in this study attended dermatology electives; and (2) all students had already taken 10 hours of dermatology lectures before attending the electives and therefore already had basic knowledge of dermatology.
In medicine, reflection is defined as the consideration of the larger context, the meaning, and the implications of an experience or action [14]. However, in clinical teaching it is probably employed less frequently than it should be, even though it is essential for helping physicians build illness scripts [14]. To develop diagnostic competence, students must construct their own illness scripts based on the patient care they have experienced, because illness scripts cannot be transmitted directly from teachers to students [6, 21]. However, experience alone is not sufficient for this process. The experience must be interpreted and integrated into existing knowledge structures, through reflection, to become new or expanded knowledge for the students [22]. The role of clinical teachers is to facilitate students' reflection, and for this to be effective it requires both support and challenge [22].
In this study, we used reflection, wherein students were encouraged to reflect on the case findings that were essential to reaching the correct diagnosis. The clinical teacher should point out diagnostically meaningful information in the case data, identify redundant or irrelevant findings, and highlight the discriminating features, including their relative weight or importance, for drawing conclusions about the correct diagnosis [12]. Some students assigned to the other groups (no intervention or lecture groups) may have used reflection at their own discretion. In contrast, all students in the experimental group were required to reflect on every case.
Reflection enables students to improve their clinical reasoning skills even when they do not receive feedback [6]. However, feedback is necessary for effective reflection [22]. Feedback is defined as specific information regarding the comparison between a trainee's observed performance and a standard, given with the intent of improving the trainee's performance [23]. Feedback has a beneficial effect on clinical reasoning [12, 13] and skill acquisition [24]. Unlike physical skills, which an instructor can observe directly, it is difficult to give appropriate feedback on students' mental processes, such as clinical reasoning, without knowing their thoughts. Therefore, we asked students to verbalise their thoughts rather than thinking internally, so that they could receive relevant feedback to help them refine their thought processes.
There are two types of feedback: process-oriented and outcome-oriented [25]. The feedback used in this study was process-oriented. The goal of process-oriented feedback is to provide new instruction about how to reach a standard, rather than informing the student solely about correctness [26]. In contrast, outcome-oriented feedback provides performance-specific information (i.e., information that focuses solely on correctness) [27]. Different types of feedback can have varying influences on learning. Research on the training of medical students supports the beneficial effects of process-oriented feedback compared with outcome-oriented feedback [26, 27]. For example, process-oriented feedback had a more positive influence than outcome-oriented feedback on medical students' efficiency in learning laparoscopic knot-tying [26].
This study had some limitations. First, we chose the sample size based on practical constraints (the number of students participating in dermatology electives) rather than on a formal calculation to ensure adequate statistical power. Potential differences in students' experiences during the elective periods were inevitable and might have influenced the results; however, controlling students' experiences throughout the electives was not feasible. To distribute potential confounders across groups, we designed a randomized controlled trial. Nevertheless, despite random allocation, our sample size might not have been large enough to eliminate this problem completely.
Second, the diagnostic difficulty of the control and training sets might have differed. Diseases in the control set were relatively easier to diagnose than those in the training set, as they are generally diagnosed from morphologic characteristics alone. In this study, final scores for the control set improved without inter-group differences. By contrast, diseases in the training set usually require information in addition to morphologic characteristics; indeed, the cases in the training set had initially been misdiagnosed by non-dermatologists. Nevertheless, the key finding that only students in the experimental group showed improvement in the training set remains valid.
Lastly, we did not examine the long-term effects of the intervention. One study evaluating the effectiveness of dermatology electives showed that half of the initial effects had disappeared after 12 months [8]. Because only a limited number of students receive further dermatology education after completing the dermatology electives, further study is needed to evaluate the long-term effects of our educational intervention.