A Study of General Medical Students' Clinical Competency in Problem Solving, Communication Skills, Procedures, History Taking and Physical Examination, and Critical and Non-Critical Indicators in the Objective Structured Clinical Examination (OSCE)

DOI: https://doi.org/10.21203/rs.3.rs-206569/v1

Abstract

Introduction:

Clinical education is the basic pillar and heart of medical education. It is one of the most important manifestations of teaching and learning in the health professions and leads to the clinical competency of learners. Assessing new physicians before they enter the field of clinical activity can be a reliable criterion for evaluating the quality of their clinical skills. The current study was conducted to investigate the knowledge, practice, and general clinical competency of general medicine graduates before entering the field of clinical activities, in accordance with the document of the minimum competencies expected from general practitioners in Iran.

Methods

In this descriptive cross-sectional study, the scores of the different stations of the national Objective Structured Clinical Examination (OSCE), which is held at the end of the general medicine course in Iran, were collected at Mashhad University of Medical Sciences. The scores were subdivided into four specific areas and into critical and non-critical indicators of capability, extracted from the exam assessors' checklists. In total, 266 students who participated in six administrations of the clinical competency test at the end of the general medicine course at Mashhad University of Medical Sciences were included in the study by the census method. The clinical competency of general medicine graduates was assessed by the OSCE in the areas of problem solving, communication skills, practical action (procedures and critical skills), history taking, and physical examination, as well as by critical and non-critical indicators. Data were analyzed using descriptive statistics (frequency, mean, and standard deviation) and inferential statistics, including the independent t-test and one-way and two-way analysis of variance, in SPSS.

Results

The results showed that the effect of the different areas of the OSCE (F(3,5652) = 7.022, P = 0.001) and of participants' performance on critical versus non-critical indicators (T = 1.976, P = 0.04) was significant at the 95% confidence level.

Conclusion

The OSCE can improve the standards of clinical competency of new physicians and bring about beneficial changes in clinical education at medical schools.

Introduction:

Today, quality is one of the most critical issues in higher education systems. Educational evaluation, in the sense of "judging the learner's knowledge and performance," has a history of nearly five centuries (1). Creating an appropriate and efficient mechanism for evaluating students is one of the ways to improve the quality of education; in practice, the importance of evaluation lies in determining the quality of what we want to achieve (2). Notably, clinical education is the basis of medical education and is undoubtedly one of the most important manifestations of teaching and learning in the medical sciences professions, leading to the clinical competence of learners. In past decades, various studies have been conducted on clinical education in the medical sciences, and several approaches have been proposed, including the planned clinical experience approach. The main purpose of this approach is to ensure that learners' clinical learning activities are consistent with the educational goals of the course; it attempts to plan each of the learners' clinical activities as a clinical experience focused on a definite goal (3).

Clinical competencies in real-world scenarios are defined as achievements at a pre-determined level of effectiveness, so evaluating students before they enter the clinical wards can be a reliable criterion for assessing the quality of their clinical skills (4). Competency-based education, as a result-oriented education paradigm, significantly changes students' educational pathways; meanwhile, the Objective Structured Clinical Examination (OSCE) is considered a paradigm for assessing clinical competence (5, 6). According to Miller's pyramid, clinical competency consists of four levels, "Knows, Knows How, Shows How, and Does," and the skills test addresses the "Shows How" level, or performance competency (7). According to studies that have evaluated the clinical competency and performance of students and residents in various disciplines based on Miller's pyramid, a set of assessment methods, especially clinical skills and competency assessments, can significantly predict student performance after graduation (8). Moreover, testing style may improve students' learning and examination approaches and enable them to retrieve information in the future, so that they gain greater understanding of and reflection on the clinical sciences and the application of the basic sciences at the patient's bedside (9). Continuous formative assessment (CFA) with appropriate feedback is a pillar of clinical education and learning; therefore, the OSCE can be used to provide feedback to students about their clinical skills, with the aim of improving their performance and helping them overcome their weaknesses (10).

In most cases, students in the clinical fields, including medical students, are assessed with tests that are unable to measure the abilities expected of a physician. Evaluating the performance of medical students, especially their clinical competencies, is one of the main tasks of medical universities. Clearly, assessing clinical competencies is a complex process and a combination of different evaluation stages in which students' skillful performance, their correct use of clinical skills for solving patient problems, and their performance in implementing treatment and care programs are evaluated.

We strive to assess clinical competency using valid and reliable licensing methods for physicians who will be entering the clinical field. Behavioral evaluation is inherently difficult and each method has its own limitations; nevertheless, given the research background and the experience of other countries, the OSCE appears to be a valid method for assessing the clinical competence of applicants (11). The OSCE is a clinical competency assessment test organized objectively in the form of different stations. Questions are designed for the purpose of clinical competency assessment, and students are assessed under the same standard conditions. This test is a tool for the objective and fair assessment of medical students' clinical competencies and is widely used in medical education around the world (12). In other words, the OSCE is a reliable method for assessing students' clinical examination skills; it was developed to measure educational goals effectively and is now widely accepted as a tool for assessing the clinical competence of medical students (13-15).

The clinical competency test at the end of the medical course has been emphasized by Iran's Ministry of Health and Medical Education since 2015 and is held nationally, and registration in the residency course and graduation of general medical students are conditional on achieving the required passing score in this exam. Therefore, improving training and intra-departmental evaluations can have a significant impact on the success of universities' general medical students in this exam. Since the most important purpose of assessment and evaluation is to improve the quality of educational programs, and since the OSCE is one of the effective evaluation methods in medical education, the present study aimed to review and evaluate the knowledge, practice, and clinical competence of general medicine graduates before entering the field of clinical activities. In line with the document of the minimum competencies expected from general practitioners and the competency axes of general medical graduates in Iran, the study covered the areas of problem solving, communication skills, practical action (procedures and critical skills), history taking, physical examination, and critical and non-critical indicators, and was performed at the medical faculty of Mashhad University of Medical Sciences in 2018 and 2019.

Methods:

The OSCE includes a series of test stations at which students perform specific clinical skills such as taking a history, performing a physical examination, interpreting a laboratory or paraclinical test result, or performing a clinical procedure (16, 17). At each station, the supervisor evaluates the student's performance using a specific checklist that covers the required standards for the essential skills (18). To create the same clinical experience at each station, an actor-patient provides verbal and behavioral responses to the students who arrive at the station for evaluation (19). Such an approach can, in practice, provide performance appraisal and assurance that future physicians achieve minimum professional standards, and it tests the student's ability through interaction with standard supervisors and patients in clinical skills centers (20). Iranian medical students are assessed for clinical skills in the areas of problem solving, communication skills, practical action (procedures and critical skills), history taking, and physical examination. The questions of this exam are designed in the form of stations, and students' clinical competence is evaluated by the assessor while they perform a "complex clinical task" at each station.

It is necessary to explain that clinical competence refers to the prudent application of technical and communication skills, knowledge, clinical reasoning, emotions, and values in clinical settings. Clinical competence is the capability to perform an activity, consisting of knowledge (coded applied knowledge and experience-based tacit knowledge), skill (technical and cognitive), and conceptual ability. This capability is habitual, stable, task-dependent, observable and measurable, independent, knowledge-based, and context-dependent, and it is a key component of the medical profession. In the current study, clinical competence refers to participants' performance in problem solving, communication skills, practical action (procedures and critical skills), history taking, and physical examination.

Problem solving is the recognition and application of knowledge and skills that lead to correct responses to a situation or to achieving the desired goal; its key point is applying knowledge and skills already learned to new situations. Decision-making, reasoning, and problem-solving skills mean that, when facing a problem, learners must be able to identify the problem and its dimensions, collect and evaluate relevant information from the best available sources, identify and evaluate different solutions, estimate the probable consequences of each, and finally choose the most appropriate option under conditions of uncertainty. To make the final decision, they must be able to integrate this capability with their information in other areas, such as the priorities and values accepted by clients and society and the cost-effectiveness of possible solutions.

Communication skills, which exist in both verbal and non-verbal forms, can be categorized into skills needed for interpersonal, work-related, and correspondence situations. These learnable skills improve relationships and the quality of life in different situations. Interpersonal communication is a purposeful, multidimensional, irreversible, and often inevitable exchange process. A medical student must be able to communicate effectively with patients, patients' companions, and colleagues, and must be able to demonstrate competence in communicating orally, in writing, electronically, or by telephone. A procedure is a treatment or medical operation (21). The general medicine graduate must have the necessary competence in a wide range of clinical skills and must perform practical procedures and conduct laboratory tests according to the set standards. History taking and physical examination involve the knowledge needed to understand and recognize the medical signs and symptoms of disease that can be detected with the ordinary five senses at the patient's bedside, without special medical equipment, and that a physician can identify at the patient's first examination. It should be noted that taking the clinical history is the most important part of the examination.
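
To make the structure of the data analysed below more concrete, the following is a purely illustrative sketch, in Python, of how a single station's checklist record could be represented and rolled up into a 0-20 station score, either overall or restricted to critical or non-critical checklist items. The class and field names are hypothetical and do not reflect the official examination software.

```python
# Illustrative only: hypothetical representation of an assessor's checklist
# record for one OSCE station; not the official exam software.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ChecklistItem:
    description: str     # e.g. "washes hands before the examination"
    max_points: float    # points available for this item
    awarded: float       # points the assessor awarded
    critical: bool       # True if the item is a critical indicator


@dataclass
class StationRecord:
    student_id: str
    section: str         # e.g. "problem solving", "procedure"
    items: List[ChecklistItem]

    def score(self, critical_only: Optional[bool] = None) -> float:
        """Station score scaled to 0-20, optionally restricted to
        critical (True) or non-critical (False) checklist items."""
        items = self.items
        if critical_only is not None:
            items = [i for i in items if i.critical == critical_only]
        possible = sum(i.max_points for i in items)
        earned = sum(i.awarded for i in items)
        return 20 * earned / possible if possible else 0.0
```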

In the current descriptive cross-sectional study, the data of 266 general medical students who participated in the last six administrations of the national clinical competency test, conducted in May 2018, July 2018, August 2018, November 2018, February 2019, and May 2019, were collected by the census sampling method at Mashhad University of Medical Sciences. The data were extracted by analyzing the scores of the different exam stations from the assessors' recorded checklists, and the confidentiality of the information was ensured. It should be noted that the clinical competency test is conducted nationally, simultaneously, and centrally in qualified centers across the health regions of Iran; these centers are accredited according to the standards approved by the General Medical Board and by the Secretariat of the General Medical Education Council.

We assessed and compared the knowledge, practice, and clinical competency of the general medical students before entering the field of clinical activities in terms of problem solving, communication skills, practical action (procedures and critical skills), history taking, physical examination, and critical and non-critical indicators. The data were analyzed using descriptive statistics (frequency, mean, and standard deviation) and inferential statistics, including the independent t-test and one-way and two-way analysis of variance, in SPSS.
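
The analyses themselves were run in SPSS. As a rough, non-authoritative sketch of the same descriptive and inferential steps, they could be reproduced in Python along the following lines, assuming a hypothetical long-format file with one row per scored station record and columns named score, section, and indicator.

```python
# Minimal sketch of the analyses described above (the original work used SPSS).
# The file name and column names below are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("osce_station_scores.csv")  # one row per scored station record

# Descriptive statistics per OSCE section, as summarised in Table 1.
print(df.groupby("section")["score"].agg(["count", "mean", "std", "min", "max"]))

# Independent t-test: critical vs. non-critical indicator scores.
critical = df.loc[df["indicator"] == "critical", "score"]
non_critical = df.loc[df["indicator"] == "non-critical", "score"]
t_stat, t_p = stats.ttest_ind(critical, non_critical)
print(f"t = {t_stat:.3f}, p = {t_p:.3f}")

# One-way ANOVA across the four OSCE sections.
groups = [g["score"].to_numpy() for _, g in df.groupby("section")]
f_stat, f_p = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, p = {f_p:.3f}")
```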

Results:

Table 1 presents the descriptive statistics and the results of the independent t-test comparing critical and non-critical indicators, together with the one-way analysis of variance examining differences among the problem solving, communication skills, procedure, and history and physical examination sections of the OSCE.

Table 1

Results of the independent t-test evaluating the difference between critical and non-critical indicators and of the one-way ANOVA comparing the OSCE sections.

| Section | Frequency (N) | Mean | SD | Min. Score | Max. Score | T | F | P-value |
|---|---|---|---|---|---|---|---|---|
| Problem solving | 1143 | 15.89 | 3.64 | 0 | 20 | - | 7.022 | 0.001 |
| Communication skills | 377 | 16.07 | 3.01 | 2 | 20 | | | |
| Procedure | 2367 | 15.97 | 3.41 | 0 | 20 | | | |
| History and physical examination | 1769 | 15.52 | 3.33 | 3 | 20 | | | |
| Critical indicators | 3767 | 15.76 | 3.48 | 0 | 20 | 1.976 | - | 0.04 |
| Non-critical indicators | 1889 | 15.94 | 3.28 | 0 | 20 | | | |
| Total | 5656 | 15.82 | 3.41 | 0 | 20 | - | | |

The findings indicated that the effects of the OSCE sections (problem solving, communication skills, procedure, and history and physical examination; F(3,5652) = 7.022, P = 0.001) and of the critical and non-critical indicators (T = 1.976, P = 0.04) were significant at the 95% confidence level.

Table 2

Bonferroni post hoc test results for pairwise comparisons between the OSCE sections.

| Section | Other Sections | Mean Difference | Standard Error | P-value |
|---|---|---|---|---|
| Problem solving | Communication skills | -0.17500 | 0.20247 | 1.000 |
| Problem solving | Procedure | -0.08129 | 0.12279 | 1.000 |
| Problem solving | History and physical examination | 0.37302* | 0.12938 | 0.024 |
| Communication skills | Problem solving | 0.17500 | 0.20247 | 1.000 |
| Communication skills | Procedure | 0.09372 | 0.18904 | 1.000 |
| Communication skills | History and physical examination | 0.54803* | 0.19338 | 0.028 |
| Procedure | Problem solving | 0.08129 | 0.12279 | 1.000 |
| Procedure | Communication skills | -0.09372 | 0.18904 | 1.000 |
| Procedure | History and physical examination | 0.45431* | 0.10714 | 0.001 |
| History and physical examination | Problem solving | -0.37302* | 0.12938 | 0.024 |
| History and physical examination | Communication skills | -0.54803* | 0.19338 | 0.028 |
| History and physical examination | Procedure | -0.45431* | 0.10714 | 0.001 |

The differences between section means were examined using the Bonferroni post hoc test. The results indicate significant differences between problem solving and history and physical examination (p = 0.024), between communication skills and history and physical examination (p = 0.028), and between procedure and history and physical examination (p = 0.001) (Table 2).
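
As a hedged illustration of this post hoc step, pairwise comparisons with a Bonferroni correction could be computed as follows; note that SPSS's Bonferroni procedure uses the pooled ANOVA error term, so a sketch based on separate pairwise t-tests, as below, would give slightly different standard errors.

```python
# Sketch of Bonferroni-adjusted pairwise comparisons between OSCE sections,
# using the same hypothetical long-format table as in the Methods sketch.
from itertools import combinations
import pandas as pd
from scipy import stats

df = pd.read_csv("osce_station_scores.csv")  # hypothetical placeholder
pairs = list(combinations(df["section"].unique(), 2))  # 6 pairs for 4 sections

for a, b in pairs:
    x = df.loc[df["section"] == a, "score"]
    y = df.loc[df["section"] == b, "score"]
    t_stat, p_raw = stats.ttest_ind(x, y)
    p_bonf = min(p_raw * len(pairs), 1.0)  # Bonferroni adjustment, capped at 1
    print(f"{a} vs {b}: mean difference = {x.mean() - y.mean():.3f}, "
          f"adjusted p = {p_bonf:.3f}")
```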

Furthermore, a two-way analysis of variance was used to examine the interaction between the OSCE sections and the critical and non-critical indicators. The first factor, OSCE section, has four levels: problem solving, communication skills, procedure, and history and physical examination. The second factor has two levels: critical and non-critical indicators. The results showed that the interaction (combined) effect of these factors was not significant (F(3,5648) = 1.961, P = 0.118).
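
For this factorial step, a minimal sketch of a two-way ANOVA with a section-by-indicator interaction, again assuming the hypothetical long-format table described above, could use statsmodels as follows.

```python
# Sketch of the two-way ANOVA (section x indicator, with interaction).
# Column names are hypothetical; the original analysis was run in SPSS.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("osce_station_scores.csv")  # hypothetical placeholder

# C() marks categorical factors; '*' expands to both main effects plus
# the section-by-indicator interaction term.
model = smf.ols("score ~ C(section) * C(indicator)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # ANOVA table with Type II sums of squares
```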

Discussion:

One of the main challenges in medical education is that the expectations and competencies graduates should acquire by the end of the general medicine course are unclear. Clearly, we face different approaches in educational planning. What meets the needs of medical faculties and can resolve the competency gaps pointed out by medical education professionals is the Outcome-Based Education (OBE) approach. In OBE, attention is paid to capability and the final output of the educational program; furthermore, the basic skills that learners need to have acquired by the end of the course are considered. An exit outcome is a performance defined as the learners' level of mastery in the program. Competencies and capabilities typically include all three components of knowledge, skills, and attitudes; it should be noted that knowledge alone is not considered a competency, and an important point is that these competencies and capabilities are also measurable. Several advantages of the OBE approach have been identified, such as the relevance of the educational program to future job tasks, defining the goals and mission of the university in general practitioner training, acceptability, clarity and comprehensibility, making learners responsible for their learning, flexibility, providing appropriate guidance for evaluation, group participation in planning, suitability as a tool for evaluation, and the possibility of program continuity and its relationship with higher-level programs (21).

Every country has its own rules for issuing medical licenses, but in any case, to obtain a license, practitioners must prove their competence. At the beginning, experience with the OSCE was somewhat limited to the United States, Canada, the United Kingdom, and Australia, but today other countries such as Taiwan, Korea, and Japan require medical graduates to take a medical licensing exam, and other reputable medical schools have similar rules (20). As explained, the OSCE is a convenient method for evaluating medical students that can overcome the deficiencies and limitations of traditional assessment methods (22). A study of 232 students participating in a Group OSCE (GOSCE) indicated that students and clinical educators considered the test a valuable experience and believed that it helped students identify gaps and share their knowledge and skills among group members and peers (10). Marshall and Harris concluded that the OSCE is a very effective method for assessing the clinical competence of medical imaging students (radiographers) (23). Also, Al Nazzawi's findings suggest that the OSCE is a significant and equitable tool for assessing clinical skills (24). Saunders et al. stated that 89% of the 332 participants in their study believed the OSCE would lead to mastery of clinical skills (25). The study by Hsieh et al. showed that designing a tool for measuring skills provides an opportunity to better understand and apply the outcomes of clinical practice in medical students; indeed, the types of cases portrayed in the OSCE are common problems that students face in clinics and hospitals (26). Furthermore, the OSCE provides an excellent opportunity to improve the curriculum while assessing students' competencies in the essential areas (27).
Studies have indicated that the OSCE is an effective tool for assessing areas that are critical to health professionals' performance, such as the ability to obtain information from the patient, communication, data interpretation, and problem solving (28).

In Korea, the National Health Personnel Licensing Examination Board (NHPLEB), which is similar to the National Board of Medical Examiners (NBME) in the USA, is responsible for issuing medical licenses through skills and attitudes examinations. The Korean Society of Medical Education has approved the OSCE as part of medical licensure in Korea, believing that a written examination alone is not appropriate or adequate for examining the medical skills and attitudes of medical school graduates. In a center in Seoul, Korea, two sets of parallel stations are offered for the OSCE, comprising 12 stations in total. Procedural techniques and skills (such as simple sutures, blood sampling, and blood pressure measurement) are tested in six short stations, and in six long stations participants are confronted with standardized patients, so that clinical skills such as communication skills and clinical reasoning are assessed. At each station there is a staff member, either a Science Committee member or a trained physician, with a structured checklist, and applicants are evaluated on a variety of competencies such as communication skills, interviewing skills, history taking, brief physical examination, and ordering laboratory tests (11). Similarly, in Taiwan the OSCE Committee decided to include this exam in the medical curriculum to assess the clinical competencies of medical students (12). In the study by Yang et al., the qualifications of residents in the six core competencies were assessed using an OSCE. That study found a significant difference in residents' performance across the different core clinical competencies (F = 49.8, P < 0.05). Residents had the highest pass rate in interpersonal and communication skills (81%), while the lowest pass rate was observed in professionalism (53%) (20). The alignment of Spanish medical school curricula with the European Higher Education Area (EHEA) caused curriculum design to focus on competence. Because qualifying involves achieving a predetermined level of efficiency in real-world scenarios, students need to gain more realistic healthcare experiences. Until recently, medical education in most medical schools assessed only learners' knowledge, whereas the medical education program had to be based on defined competencies. The main goals of the Spanish program included acquiring basic skills in history taking, physical examination, and report writing, and developing clinical reasoning (4). According to Kamarudin et al., studies show that the OSCE has excellent face validity and construct validity, such that both students and trainers recommend it as a clinical assessment tool. The OSCE is designed to assess the clinical skills of final-year medical students and, over the past 30 years, has gradually replaced the long case assessment for investigating the psychomotor domains of students in medical schools around the world. Moreover, their study indicated that the OSCE can be used for assessing procedural and communication skills (29). In addition, studies have shown that providing feedback during clinical training can improve interview and communication skills, physical examination skills, procedural skills, problem-based learning, team building, and personal and professional behaviors (30). Also, simulated patients can be used in clinical practice to assess students' empathy (31).
Wimmers and Schauer state that, as part of the California Consortium for the Assessment of Clinical Competence (CCACC) clinical performance examination at UCLA, the OSCE consists of a set of short clinical encounters with standardized patients in which four main clinical skills are assessed: history taking, physical examination, information sharing, and patient-provider interaction (32). Although studies indicate that training and discussion among examiners are necessary to reduce differences in the scores assessors award in the OSCE (33), it is hoped that the OSCE will improve the standards of physicians' qualifications and bring about beneficial changes in clinical education at medical schools.

Conclusion

Training expert physicians is one of the most important goals of medical universities and faculties; therefore, assessing the clinical competency of new physicians, although a complex process, is essential to producing professional physicians and general practitioners. According to the background of the studies and the experience of other countries, the OSCE appears to be a valid method and paradigm for assessing clinical competency. Nevertheless, further investigations are needed to determine the OSCE's advantages, disadvantages, and limitations and to compare it with similar approaches; such studies may lead to a better approach for assessing the clinical competencies of new physicians entering the field of clinical activities at the end of their medical studies.

List Of Abbreviations

OSCE: Objective Structured Clinical Examination

CFA: Continuous formative assessment

OBE: Outcome-Based Education

GOSCE: Group OSCE

NHPLEB: National Health Personnel Licensing Examination Board

NBME: National Board of Medical Examiners

EHEA: European Higher Education Area

CCACC: California Consortium for the Assessment of Clinical Competence

Declarations

Ethics approval: This study was approved by Mashhad University of Medical Sciences Ethics Committee (IR.MUMS..REC.1398.152). Informed consent was obtained from all participants. All methods were carried out following relevant guidelines and regulations.

Consent to participate: Not applicable

Consent for publication: Not applicable

Availability of data and material: The datasets generated and/or analysed during the current study are not publicly available due to maintaining the confidentiality and privacy of data but are available from the corresponding author on reasonable request.

Competing interests: The authors declare that they have no competing interests

Funding: The study was funded by Mashhad University of Medical Sciences, Mashhad, Iran.

Authors' contributions: All authors reviewed and accepted the manuscript. H.M: wrote the study proposal, conceived the idea of the study, performed the data analysis, and was the main reviewer of the study. A.E: wrote the study proposal, conceived the idea of the study, supervised the work, and contributed to data interpretation and data gathering. AA.M: contributed to data interpretation and wrote the main manuscript.

Acknowledgements: Not applicable

References

  1. Saif A. Measurement, assessment and educational evaluation. Tehran. Doran Press; 2007.
  2. Bazargan A. Introduction to assessing quality in higher medical education in Iran: Challenges and perspectives. Quality in higher education. 1999;5(1):61-7.
  3. Haghani F, Alavi M. An Introduction to some new approaches in clinical education. Iranian Journal of Medical Education. 2011;10(5).
  4. Soler-González J, Buti M, Boada J, Ayala V, Peñascal E, Rodriguez T. Do primary health centres and hospitals contribute equally towards achievement of the transversal clinical competencies of medical students? Performance on the Objective Structured Clinical Examination (OSCE) in competency acquisition. Atencion primaria. 2016;48(1):42-8.
  5. Szasz P, Louridas M, Harris KA, Grantcharov TP. Strategies for increasing the feasibility of performance assessments during competency-based education: subjective and objective evaluations correlate in the operating room. The American Journal of Surgery. 2017;214(2):365-72.
  6. Ataro G. Methods, methodological challenges and lesson learned from phenomenological study about OSCE experience: Overview of paradigm-driven qualitative approach in medical education. Annals of Medicine and Surgery. 2020;49:19-23.
  7. Miller GE. The assessment of clinical skills/competence/performance. Academic medicine. 1990;65(9):S63-7.
  8. Rethans J, Norcini J, Baron-Maldonado M, Blackmore D, Jolly BC, Laduca T, Lew S, Page G, Southgate L. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36:901-9.
  9. Southwick F, Katona P, Kauffman C, Monroe S, Pirofski L-a, Del Rio C, et al. Commentary: IDSA guidelines for improving the teaching of preclinical medical microbiology and infectious diseases. Academic Medicine. 2010;85(1):19-22.
  10. Sulaiman ND, Shorbagi SI, Abdalla NY, Daghistani MT, Mahmoud IE, Al-Moslih AM. Group OSCE (GOSCE) as a formative clinical assessment tool for pre-clerkship medical students at the University of Sharjah. Journal of Taibah University Medical Sciences. 2018;13(5):409-14.
  11. Lee Y-s. OSCE for the medical licensing examination in Korea. The Kaohsiung Journal of Medical Sciences. 2008;24(12):646-50.
  12. Huang YS, Liu M, Huang CH, Liu KM. Implementation of an OSCE at Kaohsiung Medical University. The Kaohsiung journal of medical sciences. 2007;23(4):161-9.
  13. Vickers NJ. Animal communication: when i’m calling you, will you answer too? Current biology. 2017;27(14):R713-R5.
  14. Davis MH. OSCE: the Dundee experience. Medical teacher. 2003;25(3):255-61.
  15. Townsend A, Mcllvenny S, Miller C, Dunn E. The use of an objective structured clinical examination (OSCE) for formative and summative assessment in a general practice clinical attachment and its relationship to final medical school examination performance. Medical education. 2001;35(9):841-6.
  16. Smee S. Skill based assessment. Bmj. 2003;326(7391):703-6.
  17. Boursicot K, Roberts T. How to set up an OSCE. The clinical teacher. 2005;2(1):16-20.
  18. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Medical education. 2004;38(2):199-203.
  19. Adamo G. Simulated and standardized patients in OSCEs: achievements and challenges 1992-2003. Medical teacher. 2003;25(3):262-70.
  20. Yang Y-Y, Lee F-Y, Hsu H-C, Huang C-C, Chen J-W, Cheng H-M, et al. Assessment of first-year post-graduate residents: usefulness of multiple tools. Journal of the Chinese Medical Association. 2011;74(12):531-8.
  21. Authors AGo. Capabilities of General Medicine Graduates of Tehran University of Medical Sciences [persian]. 2011.
  22. Taylor D, Quick S. Students' perceptions of a near-peer Objective Structured Clinical Examination (OSCE) in medical imaging. Radiography. 2020;26(1):42-8.
  23. Marshall G, Harris P. A study of the role of an objective structured clinical examination (OSCE) in assessing clinical competence in third year student radiographers. Radiography. 2000;6(2):117-22.
  24. Al Nazzawi AA. Dental students' perception of the objective structured clinical examination (OSCE): The Taibah University experience, Almadinah Almunawwarah, KSA. Journal of Taibah University Medical Sciences. 2018;13(1):64-9.
  25. Saunders A, Say R, Visentin D, McCann D. Evaluation of a collaborative testing approach to objective structured clinical examination (OSCE) in undergraduate nurse education: A survey study. Nurse education in practice. 2019;35:111-6.
  26. Hsieh M-C, Cheng W-C, Chen T-Y. Objective Structured Clinical Examination (OSCE) including critical simulation: Evaluation of medical student competence. Tzu Chi Medical Journal. 2014;26(1):40-3.
  27. Baharin S. Objective structured clinical examination (OSCE) in operative dentistry course-its implementation and improvement. Procedia-Social and Behavioral Sciences. 2012;60:259-65.
  28. Mitchell M, Henderson A, Groves M, Dalton M, Nulty D. The objective structured clinical examination (OSCE): Optimising its value in the undergraduate nursing curriculum. Nurse Education Today. 2009;29(4):398-404.
  29. Kamarudin MA, Mohamad N, Halizah MNABH, Yaman MN. The relationship between modified long case and objective structured clinical examination (OSCE) in final professional examination 2011 Held in UKM Medical Centre. Procedia-Social and Behavioral Sciences. 2012;60:241-8.
  30. Deane RP, Joyce P, Murphy DJ. Team Objective Structured Bedside Assessment (TOSBA) as formative assessment in undergraduate Obstetrics and Gynaecology: a cohort study. BMC medical education. 2015;15(1):1-12.
  31. Zhou Y, Hartemink AE, Shi Z, Liang Z, Lu Y. Land use and climate change effects on soil organic carbon in North and Northeast China. Science of the Total Environment. 2019;647:1230-8.
  32. Wimmers PF, Schauer GF. Validating OSCE Performance: The Impact of General Intelligence. Health Professions Education. 2017;3(2):79-84.
  33. van der Want AC, Bloemendaal PM, van der Hage JA. Examiners’ Perceptions in Surgical Education: The Blind Spot in the Assessment of OSCEs. Journal of Surgical Education. 2020.