Feedback refers to information describing a student's performance in a given activity that is intended to guide future performance, and is hence a key step in the acquisition of clinical skills (12). In our study, OSCE feedback in either FTF or EW format was generally perceived to be beneficial, and both methods were well received by students and examiners alike. When the two methods were compared, there was a stronger perception that EW feedback was more beneficial in improving future performance, and students felt more comfortable receiving feedback in written form than FTF.
The utility of written feedback for the OSCE has been described in several studies. In one of the first publications on OSCE feedback, students were given their marked checklists immediately after the OSCE, followed by either observing another student's performance or watching a videotape of an experienced examiner (7). Taylor and Green compared two methods of written feedback: the first was skills-based, where students' performance across all OSCE stations was collapsed into seven skill areas (e.g. history taking, examination) and graded accordingly (13). The second intervention (station-based), consisting of examiners' comments on students' performance on each task, was rated higher in terms of student satisfaction, although subsequent performance did not differ between the two groups. The authors concluded that skills-based feedback may not be specific enough for students to improve upon, whereas examiners' comments may have been too evaluative of the student rather than of the task itself (13).
Students appeared to value detailed information about the individual OSCE stations, and this was reflected in our study, where EW feedback was preferred. The answer guide was regarded as the most valuable item in the written feedback, as it provided the expected answers and thus enabled students to identify their gaps with ease. Answer guides may be disseminated in a formative OSCE as a means of aiding students' learning, but this practice may be more restricted in a high-stakes, summative OSCE owing to test security restrictions and institutional policies. In addition, the release of answer guides can be both beneficial and problematic: on one hand, it may encourage deep learning if students attempt to fully understand the context; on the other, it may encourage some students to memorise a collection of isolated facts without engaging in an active learning process.
Our students found examiners' free-text comments quite useful, and this form of feedback was ranked second after "provision of answer guide". As feedback should be specific and document the student's strengths and areas for improvement, the comments by our examiners may have been lacking, as 44% of students perceived a need for examiner training in this respect. It is also possible that some examiners faced challenges putting their thoughts into written form within the time frame, as about a quarter of examiners felt that the time allocated was inadequate. In this respect, audio feedback, in which voice recordings of the examiners are delivered to students after the OSCE (8), enabling students to receive personalised and detailed feedback, is a viable alternative.
The rich information in the scoring matrix used for the OSCE can be harnessed as a form of feedback. Harrison described an interactive website used to deliver summative OSCE feedback to students in various forms: either station by station or by domains common across stations, with information on the students' performance and a graphical comparison with the entire cohort (14). Whether this form of information-rich feedback improves performance should be evaluated because, compared with feedback through comments, the provision of marks or grades alone may have little influence on future performance (15, 16). Our students rated the provision of their grades in the scoring matrix as the least useful type of written feedback, probably because it lacked the specific detail needed for them to identify their gaps. In fact, the provision of marks in addition to comments has been reported to reduce the value students place on the comments provided by examiners (1, 16).
Face-to-face feedback has been used in the OSCE (4–6). Time constraints may be prohibitive: in our study, adding 2 minutes to each 12-minute cycle prolonged testing time by 16%. In the study by Hodder, 2 minutes of immediate feedback significantly improved competency in the performance of criterion-based tasks, at least over the short term, where retesting occurred immediately after the feedback (4). However, the challenges of retaining and integrating information from FTF feedback may be compounded by factors such as the anxiety it may provoke and the distraction of moving to another station with new tasks. Poor recall of the content of immediate verbal feedback has also been reported. In a study where residents were given two minutes of verbal feedback during their OSCE, they recalled very few feedback points immediately after the OSCE and 1 month later, and what they recalled was not reflective of the actual feedback given (6). Poor recall of verbal feedback received during the OSCE was also reported among dental residents retested 2 months later (17).
Poor recall of verbal feedback does not necessarily imply that it is of limited value. In fact, immediate face-to-face feedback is still highly valued, especially by "Western" learners, for the opportunity to have an interactive exchange with examiners and gain a deeper understanding of one's performance (11). As the majority of our students were Malaysians (96%), cultural influences on learning may have contributed to the preference for written over face-to-face feedback. Studies have shown that Asian learners prefer teacher-centred learning, in which teachers provide the necessary information and students' communication with teachers is implicit and indirect, whereas learners from Western countries prefer student-centred learning and value the opportunity for explicit verbal communication and independent learning (10, 18). The EW feedback we provided fits the Asian students' learning approaches, which may have contributed to students' perceptions of its benefits.
Interestingly, the minority of students in our study who preferred face-to-face feedback performed better in the OSCE. There are several potential explanations for this observation:
- Self-confident, strong performers did not receive challenging FTF feedback.
- The OSCE examination format matched the active, student-centred learning approaches of those who preferred FTF feedback.
- In general, Asian students are "face conscious" and fear embarrassment, especially when their knowledge gaps are exposed (19); hence poorer performers preferred to avoid an FTF discussion.
Analysis of the survey results among our staff examiners showed no significant difference in their views of either method of feedback, although inferences are limited by the small number of respondents. It is interesting to note, however, that a greater proportion of examiners (92.9%) felt prepared to provide FTF feedback compared with students' perception of examiners' preparedness (75%). Similarly, only a small percentage of examiners (21.4% for FTF, 14% for EW) felt that they required further training to provide feedback, in contrast to a larger percentage of students who perceived that examiners required training (53.1% for FTF, 44% for EW). Although we did not audit the quality of feedback, the incongruence between students' and examiners' perceptions suggests that formal training for examiners may be beneficial in the long run.
Our study has several limitations. We measured only students' and examiners' perceptions of both methods of feedback, which may not translate into actual future performance. Reviewing performance in the summative OSCE might gauge the impact of feedback but would not have distinguished which method worked better, as students received both forms of feedback. This study was conducted at a single site, which may limit its generalizability. That said, our findings contribute to the current pool of data on OSCE feedback from an Asian country, where the literature is limited. Focus group discussions could be useful to explore students' perceptions further. Lastly, it is important to study how students utilised the feedback provided to them. Although feedback is the single most powerful moderator of achievement, the self-strategies that students develop can significantly alter its consequences (20).