Assessing Clinical Reasoning Ability in Fourth-year Medical Students via an Integrative Group History-taking Workshop With an Individual Reasoning Teaching Programme

DOI: https://doi.org/10.21203/rs.3.rs-957725/v1

Abstract

Background: Strong clinical reasoning ability, which leads to correct principal diagnoses, is central to evaluating a physician’s competence. The clinical reasoning process includes history taking, physical examination, validation of medical records, and determination of a final diagnosis. In this study, we designed a teaching programme to evaluate the clinical reasoning competence of fourth-year medical students.

Methods: We created five patient scenarios for our standardised patients: haemoptysis, abdominal pain, fever, anaemia, and chest pain. A group history-taking workshop with individual reasoning training was implemented to teach and evaluate students’ abilities to take histories, document key information, and arrive at the most likely diagnosis. Residents were trained to act as teachers, and a post-study questionnaire was used to evaluate the students’ satisfaction with the training programme.

Results: A total of 76 students, five teachers, and five standardised patients participated in this clinical reasoning training programme. The average history-taking score was 64%, the average number of key information items was 7, the average number of diagnoses was 1.1, and the average correct diagnosis rate was 38%. The scenarios of abdominal pain (8.3%) and anaemia (18.2%) had the lowest correct diagnosis rates. Students rated the anaemia scenario as the most difficult for history taking (3.5/5) and clinical reasoning (3.5/5), and rated their own ability lowest in the abdominal pain scenario (history taking 2.9/5; clinical reasoning 2.7/5). We found significant differences between the correct and incorrect most-likely-diagnosis groups across the clinical reasoning process (group history-taking score, p=0.045; number of key information items, p=0.009; number of diagnoses, p=0.004). The post-study questionnaire indicated high satisfaction with the teaching programme (4.7/5) and with the quality of teacher feedback (4.9/5).

Conclusions: The clinical reasoning skills of fourth-year medical students benefited from this training programme. The lower rates of correct most likely diagnoses for abdominal pain, anaemia, and fever may reflect the system-based teaching programme in the fourth year; cross-system remedial reasoning training is therefore recommended for fourth-year medical students in the future.

Introduction

Differential diagnosis is important for clinicians and involves a higher-order thinking process, evaluating the history, physical examination, laboratory data, and diagnostic images, that goes beyond memorising facts and concepts.1 The ability to formulate a case-specific principal diagnosis from a list of possibilities is an important clinical skill because the principal diagnosis determines the diagnosis-related group (DRG) code assigned to a patient and, in turn, healthcare billing. Koenemann et al. reported that clinical case discussions, a peer-taught, physician-supervised collaborative learning format, could promote clinical reasoning in medical students.2 At our school, however, clinical reasoning for fourth-year medical students is trained through problem-based learning (PBL) within an organ-system module framework, which can easily restrict students’ thinking to a single organ system.

Assessing a medical student’s differential diagnosis ability is important; however, there is no consensus on the most effective approach to evaluating these reasoning skills. Many efforts have been made to develop valid and reliable measures of clinical reasoning ability, including key feature questions and script concordance tests.3–4 Simulation-based testing methods have also been developed to meet the need for assessment procedures that are both authentic and well structured. Sutherland et al. had students watch a video trigger and discuss their diagnostic reasoning with an examiner, demonstrating that diagnostic reasoning can be assessed in this way.5 Fürstenberg et al. developed a clinical reasoning indicators history-taking scale to quantitatively assess clinical reasoning indicators during history taking in undergraduate medical students,6 which we deemed suitable for assessing fourth-year medical students’ clinical reasoning ability.

The objective structured clinical examination (OSCE) has proven to be a reliable and valid method for assessing the six competencies defined by the Accreditation Council for Graduate Medical Education (ACGME) in surgery. The competencies assessed by a well-constructed OSCE include patient care and medical knowledge as well as skills such as data synthesis and the ability to list differential diagnoses.7 This evaluation method identified significant deficiencies in the musculoskeletal examination skills and diagnostic reasoning of fourth-year medical students based on principles of the hypothesis-driven physical examination.8 A group OSCE format makes it possible to assess the individual ability of a large number of students without the time and expense usually required.9 Because our fourth-year medical students had no clinical reasoning curriculum apart from PBL, in this study we aimed to train their clinical reasoning through a group history-taking workshop in which standardised patients presented scenarios mimicking clinical conditions; each student then individually proposed a most likely diagnosis with supporting data, allowing us to assess their clinical diagnostic ability.

Materials and Methods

Problem identification and targeted needs: fourth-year medical students’ clinical reasoning ability before clinical practice

The fourth-year curriculum at Mackay Medical College is integrated into eight organ-system modules, each based on concise, clinically relevant anatomical structure, function, and the behaviour of abnormal and diseased states, together with clinical knowledge and skills, including clinical reasoning training. PBL is embedded and integrated into the semesters within these system modules, and how to assess the students’ clinical reasoning ability was the problem and targeted need we identified.

Educational objectives and assessment method

Accurate diagnostic reasoning is fundamental to patient care and safety; its development is therefore a key component of undergraduate medical education. We created a group objective structured clinical examination (GOSCE) teaching programme for clinical reasoning at Mackay Medical College, an elective pre-clerkship course for fourth-year medical students, with the aim of training and assessing students’ clinical reasoning ability. To evaluate clinical reasoning ability, Haring et al. developed an observation tool consisting of an 11-item observation rating form and a post-encounter rating tool, both of which are feasible, valid, and reliable for assessing students’ clinical reasoning skills in clinical practice.10 They also reported that, by observing and assessing clinical reasoning during medical students’ history taking, general and specific phenomena could be used as indicators, including taking control; recognising and responding to relevant information; specifying symptoms; asking specific questions that point to pathophysiological thinking; placing questions in a logical order; checking agreement with patients; summarising; and body language.11 We modified these methods for our GOSCE workshop to cover both history taking and clinical reasoning: each time, we let four to five students visit a standardised patient presenting a clinical case to collect sufficient data, after which each student individually wrote down key information and made a differential diagnosis. To assess clinical reasoning and train principal diagnosis ability, students were asked to list 15 items of key information supporting their differential diagnoses and to list the three most likely diagnoses, beginning with the most likely one.

Study setting of the GOSCE workshop

Five first-year (R1) internal medicine residents at Mackay Memorial Hospital were recruited as teaching faculty and asked to familiarise themselves with the educational content, assess the trainees’ GOSCE performance, and provide immediate feedback. We created five clinical scenarios: haemoptysis, abdominal pain, fever, anaemia, and chest pain. The participants were divided into small groups of four to five trainees per team. During GOSCE training, the trainees were expected to perform history taking, gathering clinical information from the standardised patients, and to recognise the symptoms and signs of the disease in each scenario. Documenting key words from the history, up to 15 clues, is also crucial and reflects students’ comprehension of the patient’s problems. From the patient’s information, the trainees were asked to work through the differential diagnosis and finally reach a most likely diagnosis along with two tentative diagnoses. After GOSCE training, the residents who had observed the trainees’ performance shared their comments and experience and resolved the trainees’ questions. The post-GOSCE questionnaire, designed to measure satisfaction with the teaching programme, gave the students an opportunity to review and self-evaluate their improvement in clinical reasoning and history taking through the GOSCE programme. The process and learning objectives of the GOSCE training programme for fourth-year medical students are presented in Table 1.

Table 1
Group OSCE workshop for M4 students’ individual clinical reasoning training programme

| Process | Subject steps | Learning outcome focus |
|---|---|---|
| 1. Include faculty | Training five residents as teachers | Post-workshop teaching ability assessment |
| 2. Select teaching materials | Create five clinical scenarios for group training | Scenarios including haemoptysis, abdominal pain, fever, anaemia, chest pain, and body weight loss |
| 3. Group OSCE with individual reasoning training | Use the workshop to train M4 students’ history taking and clinical reasoning | Trainees write down 15 items of key information for clinical reasoning and one most likely diagnosis with two tentative diagnoses, including reasons |
| 4. Group feedback | Post-workshop immediate feedback and clinical reasoning discussion | Residents share their experience in class and resolve trainees’ questions |
| 5. Questionnaire | Post-workshop feedback questionnaire | Post-assessment: questionnaire to review improvement |

Participants and educational content

Fourth-year medical students from Mackay Medical College were enrolled electively in this system-based teaching programme. We also trained five residents from Mackay Memorial Hospital as teachers to evaluate the trainees’ performance against GOSCE checklists and then provide post-GOSCE feedback. Additionally, five standardised patients (SPs), who simulated the symptoms and signs in the teaching scenarios, offered post-study remarks.

The GOSCE was designed to give fourth-year medical students the opportunity to perform history taking, gathering clinical information from the SPs, and to recognise the symptoms and signs of the disease in each scenario. Five patient scenarios were created: lung cancer presenting with haemoptysis, acute pancreatitis presenting with abdominal pain, acute pyelonephritis presenting with fever, uterine myoma bleeding presenting with anaemia, and acute coronary syndrome presenting with chest pain. The haemoptysis and chest pain cases were regarded as requiring system-based thinking, whereas the abdominal pain, fever, and anaemia cases were multisystem-related. Each scenario highlighted different key words in the patient’s history, which served as clues for approaching the final and tentative diagnoses. The educational strategies for clinical reasoning in each scenario are presented in Table 2.

Table 2
Educational strategies of clinical reasoning in each scenario

| Scenario | Educational strategies (1. Content; 2. Methods) |
|---|---|
| 1. Haemoptysis | 1. Understand the symptoms, including vital signs, colour of haemoptysis, timing, coagulopathy, and associated conditions. 2. Practise differentiating lung cancer (the most likely diagnosis), pulmonary TB, and bronchiectasis; then recommend initial examinations such as CXR, CBC, PT, PTT, platelet count, and sputum smear, culture (AFB), and cytology. |
| 2. Abdominal pain | 1. Understand the presentation, including pain location, quality, provoking/palliating factors, region/radiation, timing, and associated symptoms. 2. Practise differentiating acute pancreatitis (the most likely diagnosis), acute cholecystitis, and acute cholangitis; then recommend initial examinations such as abdominal echo, white cell count, liver function tests, and amylase/lipase. |
| 3. Fever | 1. Understand the symptoms, including the fever pattern; exclude upper airway, GI tract, and GU tract infection and associated muscle, skin, or autoimmune disease. 2. Practise differentiating acute pyelonephritis (the most likely diagnosis), acute viral hepatitis, and pneumonia; then recommend initial examinations such as UA, white cell count, hepatitis work-up, and CXR. |
| 4. Anaemia | 1. Understand the aetiology of anaemia, including poor production, destruction, and blood loss. 2. Practise differentiating uterine myoma bleeding (the most likely diagnosis), GI tract bleeding, and vitamin B12 deficiency; then recommend initial examinations such as CBC, MCV, and serum iron and ferritin; arrange UGI endoscopy and colonoscopy; and consult GYN for evaluation. |
| 5. Chest pain | 1. Understand the presentation, including pain location, quality, provoking/palliating factors, region/radiation, timing, and associated risk factors. 2. Practise differentiating acute coronary syndrome (the most likely diagnosis), pleuritis, and pneumothorax; then recommend initial examinations such as 12-lead EKG, cardiac enzymes, CXR, and white cell count. |

Assessment and feedback

In this study, we focused on four major components: (1) the trainee’s GOSCE score, (2) the number of key information items, (3) the number of diagnoses and the rate of correct most likely diagnoses, and (4) the feedback questionnaire. The residents rated each trainee’s GOSCE score using the checklist, based on observation of the trainee’s overall performance. By collecting the record papers at the end of the teaching programme, we counted the key information items documented by each trainee; from the same record papers, we checked the number of diagnoses and whether they were correct. In addition, a post-study questionnaire was administered covering satisfaction with the GOSCE programme and self-evaluated ability and difficulty in clinical reasoning and history taking. The rubrics and questionnaire used a 5-point Likert scale for level of satisfaction.
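To make the four components concrete, the sketch below shows one way the per-trainee records could be tallied. It is a minimal illustration in Python; the record structure, field names, and sample values are hypothetical and are not taken from the study’s actual forms.

```python
# Hypothetical per-trainee record covering the four assessment
# components; all values shown are illustrative only.
from dataclasses import dataclass

@dataclass
class TraineeRecord:
    gosce_score: float          # checklist-based group history-taking score (%)
    key_information: list[str]  # key words documented (up to 15)
    accurate_diagnoses: int     # how many of the three listed diagnoses were accurate
    correct_most_likely: bool   # the first (most likely) diagnosis was correct

records = [
    TraineeRecord(66.8, ["epigastric pain", "radiation to back", "alcohol use"], 2, True),
    TraineeRecord(61.5, ["fever", "flank pain"], 1, False),
]

# Aggregate the measures reported in the Results section.
n = len(records)
print(f"average key information number: {sum(len(r.key_information) for r in records) / n:.1f}")
print(f"average diagnosis number: {sum(r.accurate_diagnoses for r in records) / n:.1f}")
print(f"correct most likely diagnosis rate: {sum(r.correct_most_likely for r in records) / n:.0%}")
```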

Statistical analysis

Data from the post-course feedback questionnaire and GOSCE scores are shown as mean ± standard deviation (SD). Differences between the correct and incorrect most-likely-diagnosis groups were analysed using Student’s t-test. Statistical analyses were performed with the SPSS 23.0 statistical package (IBM Corp., Armonk, NY, USA). All statistical analyses were based on two-sided hypothesis tests with a significance level of p < 0.05.
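For readers without SPSS, the same two-sided Student’s t-test can be reproduced in Python. The sketch below is a minimal illustration; the score arrays are hypothetical placeholders standing in for the per-student data.

```python
# Two-sided Student's t-test between the correct and incorrect
# most-likely-diagnosis groups (hypothetical score arrays).
from scipy import stats

correct_group = [68.0, 72.5, 61.0, 70.0]    # GOSCE scores, correct group
incorrect_group = [60.0, 55.5, 63.0, 58.5]  # GOSCE scores, incorrect group

# equal_var=True selects the classic pooled-variance Student's t-test.
t_stat, p_value = stats.ttest_ind(correct_group, incorrect_group, equal_var=True)
print(f"t = {t_stat:.3f}, two-sided p = {p_value:.4f}")  # significant if p < 0.05
```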

Results

A total of 76 fourth-year medical students participated in this study, and five Mackay Memorial Hospital R1 residents were recruited as teachers for this clinical reasoning training programme. Regarding the trainees’ scores in the GOSCE with individual reasoning workshop, the best group history-taking score was for the chest pain scenario (76.9%) and the worst for haemoptysis (50.1%), with an average GOSCE history-taking score of 63.5%. The number of key words the medical students wrote down ranged from five to eight, with an average of seven; the fever scenario yielded the most key words (8) and abdominal pain the fewest (5). The average diagnosis number in each scenario was one to two, and the rate of correct most likely diagnoses ranged from 8.3% to 87.5%, highest for chest pain and lowest for abdominal pain (Table 3).

Table 3 Trainees’ scores in GOSCE with individual reasoning workshop

Tn, trainee number; key information number, the number of the 15 key words written down; diagnosis number, the number of accurate diagnoses among the three diagnosis answers.

Concerning the post-GOSCE and individual reasoning workshop feedback questionnaire, the students’ overall satisfaction with the GOSCE ranged from 4.6/5 to 4.8/5, and the teachers’ feedback and teaching ability were rated at approximately 4.7/5 to 4.9/5. In the participants’ self-evaluations across the five scenarios, anaemia was rated the most difficult for history taking (3.4/5) and clinical reasoning (3.5/5). Self-evaluated history-taking ability was lowest for the abdominal pain (2.9/5) and anaemia (2.8/5) scenarios, and self-evaluated clinical reasoning ability was likewise lowest for the abdominal pain (2.7/5) and anaemia (2.8/5) scenarios (Table 4).

Table 4 Post GOSCE and individual reasoning workshop feedback questionnaire

The rubrics and questionnaire were based on 5-point Likert scales:
Level of satisfaction: very satisfied (5); satisfied (4); unsure (3); dissatisfied (2); very dissatisfied (1)
$ Level of agreement: strongly agree (5); agree (4); neither agree nor disagree (3); disagree (2); strongly disagree (1)
# Level of quality: excellent (5); very good (4); good (3); fair (2); poor (1)

Regarding the most likely diagnosis after the GOSCE and individual reasoning, we divided the participants into a correct most-likely-diagnosis group (n=29) and an incorrect most-likely-diagnosis group (n=47) and found significant differences between the two groups in GOSCE history-taking score (p=0.045), number of key words written down (p=0.009), and diagnosis number (p=0.004) (Table 5).

Table 5 Comparison between the correct and incorrect most-likely-diagnosis groups in clinical reasoning (n=76)

 

| | Correct most-likely-diagnosis group (n=29) | Incorrect most-likely-diagnosis group (n=47) | P value |
|---|---|---|---|
| Group history-taking score | 66.8 ± 12.2 | 61.5 ± 10.5 | 0.045 |
| Key information number | 7.6 ± 1.9 | 6.4 ± 1.9 | 0.009 |
| Diagnosis number | 1.4 ± 0.6 | 0.9 ± 0.8 | 0.004 |
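Because Table 5 reports means, SDs, and group sizes, the pooled two-sample t-test can be recomputed directly from these summary statistics; the sketch below does so with SciPy. Small deviations from the reported p-values are expected because the table values are rounded.

```python
# Recompute the Table 5 p-values from the reported summary statistics.
from scipy.stats import ttest_ind_from_stats

# (measure, correct-group mean, SD, incorrect-group mean, SD)
rows = [
    ("Group history-taking score", 66.8, 12.2, 61.5, 10.5),
    ("Key information number",      7.6,  1.9,  6.4,  1.9),
    ("Diagnosis number",            1.4,  0.6,  0.9,  0.8),
]
for label, m1, s1, m2, s2 in rows:
    # Pooled-variance (Student's) t-test, n=29 vs n=47, two-sided.
    t, p = ttest_ind_from_stats(m1, s1, 29, m2, s2, 47, equal_var=True)
    print(f"{label}: t = {t:.2f}, p = {p:.3f}")
```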

Discussion

Clinical reasoning skills may help students focus on the efficient history taking and physical examination required to make a correct diagnosis.12 Williams et al. reported that a simulated clinic model, in which one student performs history taking and physical examination and a small group then collaboratively develops a prioritised differential diagnosis, could support medical students’ integration of basic and clinical science.13 In our elective individual clinical reasoning training programme for M4 medical students, a group OSCE based on educational strategies for five key scenarios, with R1 residents as teachers, facilitated clinical reasoning learning before clinical practice. This teaching and assessment programme demonstrated that M4 students need remedial reasoning training; the GOSCE workshop earned good student satisfaction, and we also recorded the training courses as interactive e-learning material for other medical students.

According to previous literature, Rencic et al. reported that most students at US medical schools enter clerkship with only poor (29%) to fair (55%) knowledge of key clinical reasoning concepts; developing clinical reasoning curricula to improve diagnostic accuracy and reduce diagnostic error is therefore important.14 Bonifacino et al. reported that exposure to a clinical reasoning curriculum was associated with both superior reasoning knowledge and written demonstration of clinical reasoning skills by third-year medical students on an internal medicine clerkship.15 Furthermore, generating explicit options improves clinical reasoning skills, and longitudinal implementation of clinical reasoning in the curriculum appears worthwhile.16 For our future M4 clinical reasoning curriculum, we may construct self-paced e-learning materials that include cross-system symptoms and signs.

Does the OSCE score reflect the clinical reasoning ability of medical students? Park et al. demonstrated that the clinical reasoning score was significantly correlated with diagnostic accuracy and grade point average but not with the OSCE score or clinical knowledge score, and that some students might attain a high OSCE score simply by asking and checking memorised items without adequate reasoning.17 In our study, we found that the ability to reach the correct most likely diagnosis is very important, given its association with history taking, identification of key information, and reasoning ability (Table 5). The purpose of the group OSCE is to gather more key information through the training group, which can improve individual clinical reasoning; however, reasoning also depends on each student’s background knowledge and critical thinking ability.

Does the OSCE clinical reasoning score reflect medical students’ performance? Falcone reported that third-year medical students are generally accurate in diagnosing acute abdominal pain in an OSCE, but that the surgical faculty’s Likert-based assessments of students’ data-analysis ability do not correspond with OSCE performance; the length of exposure to patients before the OSCE might be a confounding variable.18 Weinstein et al. successfully implemented a faculty development workshop on diagnosing and remediating clinical reasoning difficulties to help clinical teachers improve their skills.19 In our study, we created a teaching faculty of R1 residents, and the R1 teachers gave students 30 minutes of immediate group feedback after the workshop, which earned high satisfaction.

This study has several limitations. First, it was performed at a single institution with only 76 fourth-year students; the results may therefore not generalise to other institutions, which may have different clinical clerkship programmes and student evaluation systems. Second, we tested only five case scenarios, which may not be sufficient to generalise these conclusions to a wider population. Third, the GOSCE score used in the present analysis was based only on group patient encounters, not individual encounters.

In conclusion, our school uses an organ-system module teaching programme, which may restrict students’ cross-system reasoning ability. We found that students’ reasoning was worse in the abdominal pain, fever, and anaemia scenarios, so we included R1 residents as clinical reasoning teaching faculty and used a group OSCE with individual reasoning training, which achieved high satisfaction. We have further created e-learning scenario videos to improve other students’ cross-system reasoning ability.

List of Abbreviations

DRG, diagnosis-related group; PBL, problem-based learning; OSCE, objective structured clinical examination; ACGME, Accreditation Council for Graduate Medical Education; GOSCE, group objective structured clinical examination; SP, standardised patient

Declarations

Ethics approval and consent to participate:

This study was approved by the Mackay Memorial Hospital Institutional Review Board (21MMHIS281e), Taipei, Taiwan.

Consent for publication:

Not applicable.

Availability of data and materials:

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Competing interests:

"The authors declare that they have no competing interests"

Funding:

This study received no funding.

Authors' contributions:

Dr. Lin made substantial contributions to the conception and design of the study; Dr. Lail and Dr. Cheng acquired, analysed, and interpreted the data; Dr. Lin, Dr. Lain, and Dr. Cheng drafted the article; and Dr. Lin and Dr. Wu revised it critically for important intellectual content and approved the final version to be published.

Acknowledgements:

"Not applicable"

References

  1. Cook CE, Décary S. Higher order thinking about differential diagnosis. Braz J Phys Ther. 2020;24(1):1–7.
  2. Koenemann N, Lenzer B, Zottmann JM, Fischer MR, Weidenbusch M. Clinical Case Discussions – a novel, supervised peer-teaching format to promote clinical reasoning in medical students. GMS J Med Educ. 2020 Sep 15;37(5):Doc48. doi: 10.3205.
  3. Hrynchak P, Glover TS, Nayer M. Key-feature questions for assessment of clinical reasoning: a literature review. Med Educ. 2014;48:870–83.
  4. Humbert AJ, Besinger B, Miech EJ. Assessing clinical reasoning skills in scenarios of uncertainty: convergent validity for a Script Concordance Test in an emergency medicine clerkship and residency. Acad Emerg Med. 2011;18:627–34.
  5. Sutherland RM, Reid KJ, Chiavaroli NG, Smallwood D, McColl GJ. Assessing diagnostic reasoning using a standardized case-based discussion. J Med Educ Curric Dev. 2019 May 20;6:2382120519849411.
  6. Fürstenberg S, Helm T, Prediger S, Kadmon M, Berberat PO, Harendza S. Assessing clinical reasoning in undergraduate medical students during history taking with an empirically derived scale for clinical reasoning indicators. BMC Med Educ. 2020 Oct 19;20(1):368.
  7. Sidhu RS, Grober ED, Musselman LJ, Reznick RK. Assessing competency in surgery: where to begin? Surgery. 2004;135:6–20.
  8. Stansfield RB, Diponio L, Craig C, Zeller J, Chadd E, Miller J, et al. Assessing musculoskeletal examination skills and diagnostic reasoning of 4th year medical students using a novel objective structured clinical exam. BMC Med Educ. 2016 Oct 14;16(1):268.
  9. Elliot DL, Fields SA, Keenen TL, Jaffe AC, Toffler WL. Use of a group objective structured clinical examination with first-year medical students. Acad Med. 1994 Dec;69(12):990–2.
  10. Haring CM, Klaarwater CCR, Bouwmans GA, Cools BM, van Gurp PJM, van der Meer JWM, et al. Validity, reliability feasibility of a new observation rating tool and a post encounter rating tool for the assessment of clinical reasoning skills of medical students during their internal medicine clerkship: a pilot study. BMC Med Educ. 2020 Jun 19;20(1):198.
  11. Haring CM, Cools BM, van Gurp PJM, van der Meer JWM, Postma CT. Observable phenomena that reveal medical students' clinical reasoning ability during expert assessment of their history taking: a qualitative study. BMC Med Educ. 2017 Aug 29;17(1):147.
  12. Higgs J, Jones MA, Loftus S, Christensen N. Clinical reasoning in the health professions. 3rd ed. Edinburgh: Elsevier (Butterworth Heinemann); 2008.
  13. Williams DM, Bruggen JT, Manthey DE, Korczyk SS, Jackson JM. The GI simulated clinic: a clinical reasoning exercise supporting medical students' basic and clinical science integration. MedEdPORTAL. 2020 Aug 5;16:10926.
  14. Rencic J, Trowbridge RL Jr, Fagan M, Szauter K, Durning S. Clinical reasoning education at US medical schools: results from a national survey of internal medicine clerkship directors. J Gen Intern Med. 2017 Nov;32(11):1242–6.
  15. Bonifacino E, Follansbee WP, Farkas AH, Jeong K, McNeil MA, DiNardo DJ. Implementation of a clinical reasoning curriculum for clerkship-level medical students: a pseudo-randomized and controlled study. Diagnosis (Berl). 2019 Jun 26;6(2):165–72.
  16. Harendza S, Krenz I, Klinge A, Wendt U, Janneck M. Implementation of a clinical reasoning course in the internal medicine trimester of the final year of undergraduate medical training and its effect on students' case presentation and differential diagnostic skills. GMS J Med Educ. 2017 Nov 15;34(5):Doc66.
  17. Park WB, Kang SH, Lee YS, Myung SJ. Does objective structured clinical examinations score reflect the clinical reasoning ability of medical students? Am J Med Sci. 2015 Jul;350(1):64–7.
  18. Falcone JL, Watson GA. Differential diagnosis in a 3-station acute abdominal pain objective structured clinical examination (OSCE): a needs assessment in third year medical student performance and summative evaluation in the surgical clerkship. J Surg Educ. 2011 Jul-Aug;68(4):266–9.
  19. Weinstein A, Gupta S, Pinto-Powell R, Jackson J, Appel J, Roussel D, et al. Diagnosing and remediating clinical reasoning difficulties: a faculty development Workshop. MedEdPORTAL. 2017 Nov 6;13:10650. doi: 10.15766.