The structured oral examination (SOE) has been a standard component of medical student education for many years. Arguments for its long-standing use include the potential to evaluate higher-order processing and conceptual organization [1,2]. Examiners are able to ask students a series of related questions that test both their knowledge base and their ability to apply that knowledge in a clinical setting [3,4,5]. Additionally, students receive immediate feedback on their responses and can be asked questions tailored to their abilities and needs [4,6].
However, the SOE has fallen out of favor in recent years amid criticisms regarding low reliability, rater bias, lack of standardization within and between examinations, poor cost-effectiveness, and substantial time requirements [9-14]. Additionally, there is anecdotal evidence that the SOE can cause unnecessary stress and underperformance on the exams. Given this variety of concerns, most medical licensing boards in the United States have abandoned the SOE over the last 30 years.
There is tremendous variation in what constitutes the SOE, which can differ by both format and number of examiners. Examination types include (1) interviewing examinees by quizzing them on general topics, (2) asking questions regarding diagnoses and treatment plans in a clinical style, (3) utilizing a cognitive style that involves problem solving around specific cases, and (4) role playing, in which the student assumes various roles to demonstrate their knowledge base. Historically, the Obstetrics and Gynecology Department at this institution used a combination of cognitive and clinical styles to assess third-year medical students in the OB/GYN clerkship.
Surprisingly, there have been very few studies comparing the SOE to other assessment modalities. The most recent literature focuses primarily on subspecialty board examinations, and we have yet to find a study that compares or assesses the potential of oral examinations as a medical school teaching tool [16]. As mentioned previously, there have been studies testing the reliability and validity of scoring oral exams [10,17,18]. Additionally, the Department of Obstetrics and Gynecology at this institution has for many years incorporated the SOE as an integral part of evaluating third-year OB/GYN clerkship students, without any formal assessment of whether it adds value to student education.
The goal of this study was to determine the usefulness of the SOE in medical student assessment by evaluating the exam's relationship to student performance on other required clerkship examinations. These included the Objective Structured Clinical Examination (OSCE), which uses actors to simulate patient encounters in order to assess trainee clinical reasoning, and the NBME subject examination, a national, discipline-specific, standardized test. These data would assist the OB/GYN department in determining whether the oral exam should remain an integral part of the third-year medical student curriculum.