Our study demonstrates the feasibility and effectiveness of e-OSCE as a valuable tool in technology-driven assessment. Although it could not assess psychomotor and procedural skills, e-OSCE enabled us to evaluate the cognitive and metacognitive domains of final-year medical students. Furthermore, this work endorses the feasibility and effectiveness of e-OSCE using examiner-based simulation scenarios as an attractive alternative to face-to-face OSCEs. In the current tech-savvy digital world, our e-OSCE allowed us to assess students' essential clinical skills via distance learning platforms. Lastly, recordings of all e-OSCE stations provided us with a permanent digital repository of the entire exam and the opportunity to review any missed or unclear event. Although a majority of survey respondents were satisfied with the planning, organization, and implementation of e-OSCE, we observed some challenges, including a restricted blueprint, financial expenditure, time- and labour-intensive preparation, and the training of students, examiners and administrative staff within a short span. The themes of e-OSCE structure and technology were based on digital literacy, training, a paperless exam, coherence, sound organization, and a secure digital repository of exam data. However, there were concerns about station contents, time allocation, rest between cycles, and the relay of instructions.
Using the first level of Kirkpatrick's model of event evaluation, we reported the 'reaction' of students, examiners and the e-OSCE team using online questionnaires (14). Our research underpins the feasibility and effectiveness of e-OSCE by virtually assessing final-year medical students' clinical skills while taking all precautions to reduce the risk of infection during the COVID-19 pandemic. Virtual OSCE and e-OSCE are interchangeable terms and, although initial reports have described early success, very few e-OSCEs have been conducted in the COVID-19 era in the MENA region. Worldwide, investigators have reported a mix of successes and challenges from virtually organized OSCEs. In the virtual OSCE run by Courteille and colleagues, the authors delivered a virtual patient's history via texts and videos, used video-based examinations, and devoted some stations to data analytics (15). Likewise, Sartori et al. have reported encouraging results from a pilot virtual OSCE for telehealth trainees (16). Similarly, the effectiveness of virtual OSCEs has been documented in nursing students and in a range of other medical disciplines (17, 18).
Technical feasibility depends on the availability of the essential elements required for e-OSCE (19). In our college, the availability of an online platform, a stable internet connection, and an adequate number of computers, headsets, microphones and cameras allowed us to plan and conduct e-OSCEs successfully. A previous study showed that the concurrent use of faculty as simulated patients and assessors was acceptable to both students and faculty (20). In our study, examiners simulated and role-played the given scenarios in all history stations. In contrast to the standard OSCE, the examiners rotated between students' channels while students remained in their respective channels throughout. To maintain the security and confidentiality of the examination, all virtual movements of students were continuously monitored and recorded via MST. Synchronous monitoring by the examiners in all stations provided another layer of security for the e-OSCE.
In our study, most of the survey responses from students, examiners and organizers recognized the smooth organization of e-OSCE. However, we faced challenges in training examiners and students in a short time, as they were unfamiliar with the new e-OSCE format. Several actions were taken to acquaint them with it and to minimize technology-related issues. Hopwood et al. (2020) recommended a minimum of three hours of training for students and examiners to acquaint them with their roles and with the logistics of the OSCE (21). We conducted mock exams that helped us develop a structured protocol for e-OSCE. The results of the mock exams differed slightly from those of the real exams and, by and large, revealed issues with the internet, equipment familiarity and space for social distancing. The literature underscores the value of an initial training session to identify any major problems related to the technology, equipment or format of the e-OSCE, with follow-up training near the assessment time (22). Accordingly, we not only organized early training sessions but also, to ensure standardization in assessment, trained examiners of the same stations in different panels to reduce interrater variability.
An outright benefit of the e-OSCE was its successful completion in less than half of the time required for a standard OSCE. Based on our experience of conducting face-to-face OSCEs, the average time needed to examine a similar number of students on a similar number of stations is approximately five hours, whereas each e-OSCE cycle lasted approximately two and a half hours. Moreover, the use of electronic Google Forms instead of paper-based sheets for evaluations saved the time spent adding, auditing, and aggregating marks for each student. Additionally, examiners could submit electronic evaluation forms only when fully completed, and organizers ensured that each examiner had submitted their grades for each student before the start of the next student's evaluation. These features eliminated the potential for missing data. In this regard, Pauline et al. introduced an online OSCE management information system (OMIS) that integrates stations, an item bank, an OSCE planning and administration system, and results analysis software (23). The OMIS system offers the additional feature of a feedback facility that allows students to receive timely evaluation of their performance. We did not use a customized ready-made platform for e-OSCE. Nevertheless, such platforms offer a range of innovative products that would be valuable to medical educators assessing the clinical competencies of medical students.
The remarks by the e-OSCE team, "e-OSCE is time saving, easier for both examiners and students, and save time when dealing with markings", endorsed our efforts. Although 83.6% of students and 94.1% of examiners acknowledged that e-OSCE ran smoothly, only 21.3% of students and 20% of examiners preferred e-OSCE over the face-to-face OSCE. This may be due to the inability of e-OSCE to test physical examination and procedural skills. Some of the responses and comments, such as "thank you for organizing OSCE", "I can see your precious efforts in arranging and preparing the exam as if we are doing it physically in the college", and "I believe e-OSCE is an acceptable option in terms of crisis (pandemics) but in normal time I do prefer the traditional face to face", seem to support the utilization of e-OSCE as an attractive substitute during the pandemic and beyond. In previous studies, learners who participated in web-based clinical skill assessments reported high satisfaction but demonstrated a preference for the face-to-face OSCE, as they were concerned about not being able to 'touch' patients (24, 25). Our evaluations showed similar concerns, which reflect an essential element of human behaviour.
In our e-OSCE, we assessed students' communication, history-taking skills and problem-solving ability, while physical examination and procedural skills were not part of the assessment. This important shift in the assessment of clinical skills will necessitate adjusting our future pedagogical strategies to accommodate new skills such as telemedicine consultation, which involves limited physical examination and focuses mainly on distant disease management (26). In this way, we would be preparing future doctors for a changing healthcare delivery landscape in which telemedicine is becoming an essential means of delivering cost-effective, timely and efficient care.
Another advantage of using technology in e-OSCE was the option of reviewing the recorded exam sessions in case of a complaint, dispute or incomplete submission of grades by examiners. Video recording during the OSCE has been reported in the literature to serve both as a means of facilitating the exam and as part of the exam itself (27). Videotaped OSCEs have considerable benefits, including quality assurance. However, in a face-to-face OSCE, video recording would require additional equipment and expertise to obtain good-quality recordings. In our study, students' performance was monitored throughout the e-OSCE using MST teleconferencing, without the need for additional equipment. After the e-OSCE, the videos were downloaded and saved for future reference; recorded OSCE stations would give students the opportunity to watch and learn from their performance. This may also be useful for examiner training and virtual OSCE station development (27).
Incorporating technology into teaching and assessment has many advantages but can also pose several barriers. The challenges faced by the e-OSCE team arose mainly in the preparatory phase: designing the exam, training, securing resources, and implementing precautionary measures against the spread of COVID-19. Written responses such as "having an alternative platform to conduct OSCE exams was at first something farfetched, but we could achieve it as a team" substantiated the challenge faced by the e-OSCE organizing team. Although the online platform MST had been used by members of the e-OSCE team to conduct synchronous clinical skill sessions, transferring the set-up of a traditional face-to-face OSCE to a virtual set-up required iterative online meetings and trials with the support of the IT team. Prettyman et al. have emphasized close collaboration between the e-OSCE team and the IT department to ensure the availability and proper functioning of the computer hardware and software as well as the cloud-based systems needed for e-OSCE (27). A similar degree of collaboration occurred in our study, which ensured the feasibility and effectiveness of the e-OSCE. Previous research on web-based OSCEs has shown that students were dissatisfied with the technology and faced technical difficulties such as dropped calls and poor video and audio quality (28). During our e-OSCE, internet interruptions were reported by students, examiners and organizers, yet these interruptions were transient and did not disturb the flow of the e-OSCE. All students were assessed as scheduled.
In our study, students' performance in the e-OSCE was lower than in the traditional face-to-face OSCE in the CoM across all clinical clerkships, although the difference was not statistically significant. Our results do not align with those of Lara et al., who found no difference in mean scores and failure rates between e-OSCE and face-to-face OSCEs (12). We cannot postulate a scientific reason for the lower scores in the e-OSCE. At the same time, since the variation was not significant, the reliability and validity of the e-OSCE remain acceptable. To upskill examiners, structured faculty development programmes on supervising, organizing and conducting e-OSCEs using technologies and digital platforms are essential (29, 30). In our study, most examiners applauded the planning and implementation of the e-OSCE and admired the students' skills in the digital realm. Conversely, students did not comment on the use of innovative technology-enhanced assessment in the e-OSCE, although they raised minor concerns about the e-OSCE structure and the training of examiners. This is consistent with the understanding that today's medical students are digital natives and did not perceive e-OSCE as an advanced intervention. As a quality control measure, a post-exam item analysis by the assessment committee did not find any issue with the quality of the contents or the blueprint. Overall, e-OSCE proved to be a feasible, paperless, synchronous, digitally documented, and safe approach in the COVID-19 era.
The e-OSCE model used in our study could not evaluate the physical examination and procedural skills of medical students, which limited the scope of the clinical skills assessment.