Clinical Reasoning and Basic Science Knowledge: Assessing the Relationship in Medical Students

DOI: https://doi.org/10.21203/rs.3.rs-1976175/v1

Abstract

Background

Various studies have investigated the impact of basic science knowledge on clinical diagnosis for skilled and novice physicians, but it remains a matter of controversy why and how clinical diagnosis and clinical reasoning are related to basic science knowledge.

Objective

This study aimed to explore the correlation between basic science knowledge and clinical reasoning skills by evaluating the correlation between basic sciences course scores and the score of a clinical reasoning exam.

Methods

A clinical reasoning exam was used to evaluate the clinical reasoning skills of all medical interns of two internal medicine rotations. The scores were tested for correlation with the basic sciences course scores and the basic sciences national comprehensive exam results. Statistical analysis was performed using an independent t-test, stepwise regression, and the Pearson correlation coefficient.

Results

All of the basic sciences course scores and the basic sciences national comprehensive exam results correlated positively and significantly with the total score of the clinical reasoning exam. The clinical reasoning exam score was most closely correlated with the average score of the basic sciences course rather than with any course individually, and the average score of the basic sciences course was the only predictive factor for clinical reasoning skills. No significant gender differences were observed in the data analysis.

Conclusion

The results of this study suggest that learning basic science knowledge correlates positively with clinical reasoning skills, although the correlation is small to moderate depending on the course. Our findings support the view that learning basic science knowledge contributes to the formation of clinical reasoning skills.

1. Background

Generally, from the 1980s onwards, cognitive scientists have held that the key differentiator between novices and experts is the experts' superior domain-specific knowledge. In medicine, this knowledge is divided into two categories: basic science knowledge, which describes the mechanisms of the human body, and clinical knowledge, which contains information about the signs and symptoms of diseases and their relations(1). Abraham Flexner was one of the first scholars to describe academically the importance of the basic sciences in learning medical knowledge, creating a binary model in the medical education curriculum. He believed that an evidence-based approach to teaching medicine requires teaching the basic sciences before clinical training(2, 3).

Clinical reasoning is a process of logical thinking that leads to goal-directed diagnostic and therapeutic decisions by physicians and is present in all stages of patient evaluation(4-6). It is one of the most critical skills a physician acquires; it starts to develop early in medical school and continues to develop throughout practice(7). Clinical reasoning is complex and consists of two forms of reasoning. Analytical reasoning breaks each clinical case into smaller parts in order to reach a diagnosis and make decisions about the patient, and is often used by novice physicians. Non-analytical reasoning judges a clinical case as a whole, using the patterns and illness scripts the physician has accumulated over years of practice; this form is used by experts in combination with analytical reasoning, especially in complicated clinical cases(7, 8). Today, clinical reasoning tests have been developed as an alternative assessment that evaluates both analytical and non-analytical reasoning(9-11), so clinical reasoning skills can be evaluated indirectly with written tests.

The most important concerns regarding the basic sciences courses are that their content is forgotten over time and that it is difficult to apply to learning clinical knowledge and to clinical practice(12). The relation between basic science knowledge and the clinical reasoning of physicians is an important issue, because since Flexner pointed out the importance of the basic sciences in medical practice, insufficient evidence has been collected in this regard(4).

Although various studies have investigated the impact of basic science knowledge on clinical diagnosis for skilled and novice physicians, it is still a matter of controversy why and how clinical diagnosis and clinical reasoning are related to basic science knowledge(13-16). Some studies suggest that basic science knowledge directly affects clinical reasoning(17), while others suggest that physicians make little use of their basic science knowledge in the clinical diagnosis process(1, 17-19). Other researchers have criticized these earlier theories and concluded that basic science knowledge affects clinical reasoning indirectly(20-25). Moreover, few studies have considered the relations of the different basic science disciplines to clinical reasoning, or whether basic science knowledge can predict clinical reasoning skills.

The main purpose of the present study is to clarify the relation between learning different basic science disciplines and the clinical reasoning of physicians by assessing the correlation of basic sciences course scores with the clinical reasoning exam scores of medical interns. In addition, we intended to investigate whether any basic sciences course score is a predictive factor of clinical reasoning skills.

2. Methods And Materials

A cross-sectional study was designed including all medical interns of two internal medicine rotations at Isfahan University of Medical Sciences from 2020 to 2021. The protocol of the current study was approved by the ethics committee of Isfahan University of Medical Sciences.

2.1. Sampling and data collection

A total of 130 medical interns participated in this study. The collected data consist of the participants' basic sciences course scores (individually and as an average course score) in addition to their basic sciences national comprehensive exam results. The basic sciences national comprehensive exam consists of 200 questions covering basic pathology, anatomy, physiology, immunology, histology, epidemiology, public health, biochemistry, medical nutrition, embryology, parasitology, psychology, genetics, medical physics, mycology, microbiology, virology, and Islamic theology. The basic sciences disciplines whose scores were collected and analyzed include histology, biochemistry, anatomy, embryology, physiology, microbiology, immunology, pathology, genetics, and parasitology-mycology. In addition, the clinical reasoning of the participants was measured by a comprehensive clinical reasoning exam.

After data gathering, the basic sciences course scores and basic sciences national comprehensive exam results were tested for correlation with the total score of the clinical reasoning exam (as a proxy for clinical reasoning skills). In addition, we examined whether any of the basic sciences variables predicts clinical reasoning skills.

2.2. Measure of clinical reasoning skills

In contrast to the common evaluations in medical education, clinical reasoning exams were designed to evaluate medical practice skills; these tests are therefore used to evaluate medical graduates in many parts of the world(9-11). Such tests must have two important characteristics: they must be “process-oriented” and provide “expert-novice discrimination”(26). Today, the “multiple-instrument for multiple role approach”(26-28) is used to measure clinical reasoning skills, owing to the multifaceted nature of clinical reasoning. This approach is based on a model that divides clinical reasoning into three major components: information gathering, hypothesis formation, and hypothesis evaluation(26). Accordingly, the “Key Features test (KF)”, “Script Concordance test (SC)”, and “Comprehensive Integrative Puzzle test (CIP)” were applied as a clinical reasoning exam for a multidimensional assessment of clinical reasoning skills. Each test was designed to evaluate a different component of reasoning: KF for information gathering(11), and SC and CIP for hypothesis evaluation and formation(9, 29).

The clinical reasoning exam was used as part of the students' final exam after approval by the internal medicine group's educational council. The questions covered common diseases of internal medicine, and the goal was to evaluate diagnostic (not therapeutic) reasoning. The exam included 10 questions of each question type (30 questions in total) with different allotted times (40 min for KF, 30 min for SC, 50 min for CIP), designed by expert panels of internists and cardiologists. To keep participants from becoming exhausted, which would decrease the reliability of the tests, the tests were held at three different times, one for each question type.

2.3. Scoring system

The KF and SC scoring systems are based on the frequency of the expert panel's opinions on each option, so that each option is weighted according to the extent of agreement on it among the panel members. In the KF test, each question contained 12 options of which the participants had to choose 4, and the score of each question was calculated as the total score of the options selected, out of 1 in the present study. In the SC test, the options formed a 5-point Likert-type scale on which responders specify how a new piece of information affects a hypothesis: (-2) the hypothesis is almost eliminated; (-1) the hypothesis becomes less probable; (0) the information has no effect on the hypothesis; (+1) the hypothesis becomes more probable; (+2) it can only be this hypothesis. The total score of the SC test was calculated as the sum of the scores of each question, out of 1 in the present study.
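As an illustration, the frequency-based weighting described above can be sketched in a few lines of Python (hypothetical panel data and function names; this is not the authors' actual scoring key):

```python
from collections import Counter

def option_weights(panel_choices):
    """Weight each option by the fraction of expert panelists who endorsed it."""
    counts = Counter(panel_choices)
    n = len(panel_choices)
    return {option: count / n for option, count in counts.items()}

# If 3 of 5 panelists pick option "A", its weight is 3/5 = 0.6, so a
# participant who selects "A" earns 0.6 toward that question's score.
weights = option_weights(["A", "A", "B", "A", "C"])
```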

The CIP test was not weighted; the expert panels reached a common agreement on the answers. A complete answer in the CIP test was a correct matching between the components of patient history, physical examination, paraclinical data, and clinical judgment. The total score of the CIP test was calculated as the sum of the scores of each question, out of 2 in the present study.

Therefore, the total score of the clinical reasoning exam was calculated as the sum of the three test scores (KF, SC, CIP), out of 4.
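The aggregation amounts to a simple sum with range checks; a minimal sketch (assumed function name, not the authors' code):

```python
def total_clinical_reasoning_score(kf: float, sc: float, cip: float) -> float:
    """Sum the sub-test scores: KF and SC are each out of 1, CIP out of 2."""
    assert 0.0 <= kf <= 1.0, "KF score is out of 1"
    assert 0.0 <= sc <= 1.0, "SC score is out of 1"
    assert 0.0 <= cip <= 2.0, "CIP score is out of 2"
    return kf + sc + cip  # total out of 4

total = total_clinical_reasoning_score(0.7, 0.6, 1.2)
```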

2.4. Statistics

Statistical analysis was performed using an independent t-test, stepwise regression, and the Pearson correlation coefficient. The software used was the Statistical Package for the Social Sciences (SPSS), version 21, and the level of statistical significance was defined as p < 0.05.
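The Pearson coefficient reported throughout the Results can be illustrated with a small pure-Python sketch (for illustration only; the study's analysis was run in SPSS):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Perfectly linear data give r = 1.0; values of 0.1-0.5 are conventionally
# read as small-to-medium positive correlations, as in this study.
r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
```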

3. Results

Of the 130 medical interns who participated in the study, eighteen were excluded due to missing basic sciences course scores, and the data of 112 participants (54 males, 48.2%; 58 females, 51.8%) were analyzed. Almost all of them were in the same age group. The means and standard deviations of the individual basic sciences course scores, the average score of the basic sciences course, the basic sciences national comprehensive exam results, and the total score of the clinical reasoning exam are shown in Table 1.

 
Table 1

The mean and standard deviation of study variables

 

Variable                                     Gender   N    Mean     Std. Deviation
Histology                                    male     54   15.74    2.28
                                             female   58   15.96    2.23
Biochemistry                                 male     54   14.28    2.18
                                             female   58   14.32    2.08
Anatomy                                      male     54   16.17    2.26
                                             female   58   16.21    2.04
Embryology                                   male     53   14.68    2.28
                                             female   58   14.79    2.42
Physiology                                   male     54   15.59    1.64
                                             female   58   15.77    1.52
Microbiology                                 male     54   15.21    1.79
                                             female   58   15.68    1.55
Immunology                                   male     54   15.39    2.21
                                             female   58   15.79    1.87
Pathology                                    male     54   15.42    1.66
                                             female   58   15.86    1.56
Genetics                                     male     54   14.11    2.16
                                             female   58   14.55    1.65
Parasitology and mycology                    male     54   15.32    2.03
                                             female   58   16.25    2.00
Average Score of Basic Sciences Course       male     54   16.25    1.43
(out of 20)                                  female   58   16.52    1.17
Average Score of Basic Sciences National     male     54   125.94   17.03
Comprehensive Exam (out of 200)              female   58   134.62   19.17
Total Score of Clinical Reasoning Exam       male     54   2.10     0.30
(out of 4)                                   female   58   2.15     0.36

Note. The scores of all basic sciences courses are out of 20.

As shown in Table 2, all the variables correlated positively and significantly with the total score of the clinical reasoning exam. The highest correlation was between the total score of the clinical reasoning exam and the average score of the basic sciences course, and the lowest was with the embryology course. However, none of the scores showed a large correlation with the clinical reasoning exam, since r values between 0.1 and 0.5 indicate small to medium positive correlations.

 
Table 2

Pearson correlation between Total score of clinical reasoning exam with basic sciences data

 

Variable                                     Pearson Correlation (r)   Sig. (2-tailed, p)
Histology                                    .35*                      .00
Biochemistry                                 .33*                      .00
Anatomy                                      .28*                      .00
Embryology                                   .19*                      .03
Physiology                                   .31*                      .00
Microbiology                                 .26*                      .00
Immunology                                   .22*                      .01
Pathology                                    .23*                      .01
Genetics                                     .29*                      .00
Parasitology and mycology                    .25*                      .00
Average Score of Basic Sciences Course       .37*                      .00
Average Score of Comprehensive Exam          .32*                      .00

All correlations are with the Total Score of the Clinical Reasoning Exam.

*. Correlation is significant at the 0.05 level.

According to the stepwise regression analysis, the only predictive factor of clinical reasoning skills was the average score of the basic sciences course. This regression model explains 13.5% of the variance of the total score of the clinical reasoning exam and was statistically significant (Table 3).

 
Table 3

Stepwise regression analysis

Model 1                                     B     Std. Error   Beta   t      Sig.   R Square   Adjusted R Square
(Constant)                                  .56   .38                 1.47   .14    .13        .12
Average Score of Basic Sciences Course      .09   .02          .36    4.12   .00

Note. B and Std. Error are unstandardized coefficients; Beta is the standardized coefficient. F = 17.01, P-Value < 0.01.
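With a single predictor, the model's R Square equals the squared Pearson correlation, which is consistent with the numbers reported here: the average basic sciences score correlates with the clinical reasoning score at r = .37 (Table 2), and .37 squared is about 0.137, close to the 13.5% of variance explained. A minimal sketch of a one-predictor least-squares fit and its R Square (illustrative data, not the study's; the actual analysis used SPSS):

```python
def ols_r_squared(x, y):
    """R^2 of a one-predictor ordinary-least-squares fit y = a + b*x."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((a - mean_x) ** 2 for a in x)
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - mean_y) ** 2 for b in y)
    return 1 - ss_res / ss_tot

# With one predictor, R^2 is the squared Pearson r; e.g. r = 0.37
# implies R^2 = 0.37 ** 2 = 0.1369, matching the reported 13.5%.
```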

It should be noted that there were no significant gender differences in clinical reasoning scores or in their correlations with the basic sciences course scores.


4. Discussion

The main findings of the present study are: (1) there is a positive correlation between basic science knowledge and clinical reasoning skills; (2) clinical reasoning skills are most closely correlated with the average score of the basic sciences course rather than with each course individually; (3) the only predictive factor for clinical reasoning skills is the average score of the basic sciences course. We also found no significant difference between genders in clinical reasoning skills.

The result of the current study contrasts with the study of Patel and colleagues(30), who concluded that medical experts mainly use their clinical knowledge, rather than their basic science knowledge, in clinical diagnosis. They believed that basic science knowledge and clinical science knowledge are two “distinct worlds”. However, Patel and colleagues only evaluated the direct relation between basic science knowledge and the clinical reasoning process; the two can correlate indirectly through the organization and interpretation of clinical findings.

Our finding that basic science knowledge correlates with clinical reasoning skills confirms previous reports by Schmidt and colleagues(31) framed as “knowledge encapsulation theory”. They suggested that basic science knowledge becomes encapsulated in clinical concepts during the medical practice of experts. However, Schmidt's encapsulation theory does not clearly describe the cause-and-effect relationship between basic science knowledge and clinical knowledge, so the value of this encapsulation remains unclear.

One criticism of most previous studies is that they did not examine the effect of the passage of time on the relationship between basic science knowledge and clinical reasoning. This issue was addressed in our study, since there is a gap of at least four years between our basic sciences course data and the clinical reasoning exam. It was also considered in the study of Woods and colleagues(32), who concluded that physicians who understand basic science mechanisms are better at retaining and retrieving clinical diagnostic data over time (after one week in that study). However, they did not find a significant difference in performance in the immediate diagnostic test.

One of the key points of the current study is examining the relationship of each basic sciences discipline, in addition to the average score, with clinical reasoning skills. The correlations between clinical reasoning skills and the basic sciences scores, ordered from highest to lowest, are: average score of the basic sciences course, histology, biochemistry, average score of the basic sciences national comprehensive exam, physiology, genetics, anatomy, microbiology, parasitology and mycology, pathology, immunology, and embryology. The highest correlation, with the average score of the basic sciences course, can be explained by the larger number of evaluations involved in forming the average score: more evaluations lead to a higher correlation. The lowest correlation, with the embryology course, can be explained by its lesser application in clinical knowledge. However, none of the scores had a large correlation with clinical reasoning skills. This confirms the report of de Bruin and colleagues(1), who suggested that although basic science knowledge correlates with diagnostic reasoning, it does so through its relation with clinical knowledge, and that basic science knowledge plays a less prominent role in diagnostic reasoning.

That the average score of the basic sciences course is the only predictive factor for clinical reasoning skills can be explained by the cumulative effect of the scores, which provides more comprehensive information. In other words, the accumulation of basic sciences data predicts the relation more accurately.

The study has some potential limitations. The unfamiliarity of students and faculty members with clinical reasoning tests presented many challenges in designing and implementing the tests; we tried to diminish this negative effect by preparing a proper and detailed test guide. Since the clinical reasoning exam was held as part of the final exam, we were limited in the number of questions and the time available for its implementation, which may affect the validity and reliability of the tests. In addition, the students of the two internal medicine rotations had taken different basic sciences national comprehensive exams, which may affect the accuracy of the correlation between the comprehensive exam results and the clinical reasoning exam. Finally, the long period of the internal medicine rotation limited enrollment, so the limited number of students may affect the reliability of the results.

The authors also suggest that other researchers apply this method to physicians (instead of students) to differentiate between experts and novices.

5. Conclusion

The results of this study suggest that, within its limitations, learning basic science knowledge correlates positively with clinical reasoning skills, although the correlation is small to moderate depending on the course. Clinical reasoning skills were more closely correlated with the average score of the basic sciences course than with any course alone, and the average score of the basic sciences course was the only predictive factor of clinical reasoning skills among the basic sciences data. Our findings support the view that learning basic science knowledge contributes to the formation of clinical reasoning skills.

Declarations

Acknowledgments

The authors gratefully acknowledge the ongoing support of the Internal Medicine department at the Isfahan University of Medical Sciences.

Ethics approval and consent to participate

Ethical approval has been granted by the ethics committee of Isfahan University of Medical Sciences for this study, IR.MUI.RESEARCH.REC.1399.114, on May 27, 2020. All methods were carried out under the relevant guidelines and regulations. Verbal and written informed consent were obtained from all participants. Participants were informed about the anonymized use of their data before the start of the study.

Consent for publication

Not applicable.

Availability of data and materials

The datasets generated and/or analyzed during the current study are not publicly available because they would compromise the privacy of the participants, but they are available from the corresponding author on reasonable request.

Competing interests

The authors declare that no competing interests exist.

Funding

The study was supported by the Isfahan University of Medical Sciences (Grant ID: 398927).

Authors' contributions

AO and MRG are responsible for the conception and design of the research study, though all of the authors contributed to the study design. FM and KH were in charge of the expert panels for designing the clinical reasoning exam, and FM was responsible for conducting the clinical reasoning exam. MRG contributed to data collection and was responsible for drafting the primary manuscript and additional revisions. AN was responsible for data analysis and interpretation of data. AO coordinated and supervised the implementation of the study. All authors participated in a critical review of the primary manuscript; they have read and approved the final manuscript.

Authors' information

1 Medical student, Isfahan University of Medical Sciences, Isfahan, Iran

2 Associate professor of nephrology, Isfahan Kidney diseases research center, Isfahan University of Medical Sciences, Isfahan, Iran

3 Assistant professor of cardiology, Heart Failure Research Center, Cardiovascular Research Institute, Isfahan University of Medical Sciences, Isfahan, Iran.

4 Assistant professor, Department of Medical Education, Medical Education Research Center, Education Development Center, Isfahan University of Medical Sciences, Isfahan, Iran.

5 Associate professor, Medical Education Research Center, Department of Medical Education, Isfahan University of Medical Sciences, Isfahan, Iran

References

  1. de Bruin AB, Schmidt HG, Rikers RM. The role of basic science knowledge and clinical knowledge in diagnostic reasoning: a structural equation modeling approach. Acad Med. 2005;80(8):765–73.
  2. Flexner A. Medical education in the United States and Canada. From the Carnegie Foundation for the Advancement of Teaching, Bulletin Number Four, 1910. Bull World Health Organ. 2002;80(7):594–602.
  3. Wimmers P, Mentkowski M. Assessing Competence in Professional Performance across Disciplines and Professions. 2016.
  4. Woods NN. Science is fundamental: the role of biomedical knowledge in clinical reasoning. Medical education. 2007;41(12):1173–7.
  5. Audetat MC, Laurin S. Supervision of clinical reasoning: methods and a tool to support and promote clinical reasoning. Canadian family physician Medecin de famille canadien. 2010;56(3):e127-9, 294–6.
  6. Elizondo-Omana RE, Morales-Gomez JA, Morquecho-Espinoza O, Hinojosa-Amaya JM, Villarreal-Silva EE, Garcia-Rodriguez Mde L, et al. Teaching skills to promote clinical reasoning in early basic science courses. Anatomical sciences education. 2010;3(5):267–71.
  7. Blunk DI, Tonarelli S, Gardner C, Quest D, Petitt D, Leiner M. Evaluating Medical Students' Clinical Reasoning in Psychiatry Using Clinical and Basic Science Concepts Presented in Session-level Integration Sessions. Med Sci Educ. 2019;29(3):819–24.
  8. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory. Med Educ Online. 2011;16.
  9. Charlin B, Roy L, Brailovsky C, Goulet F, van der Vleuten C. The Script Concordance test: a tool to assess the reflective clinician. Teach Learn Med. 2000;12(4):189–95.
  10. Groves M, Scott I, Alexander H. Assessing clinical reasoning: a method to monitor its development in a PBL curriculum. Medical teacher. 2002;24(5):507–15.
  11. Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad Med. 1995;70(3):194–201.
  12. Emami SM, Rasouli Nejad M, Changiz T, Afshin Nia F, Zolfaghari B, Adibi P. Interns' View About Basic Medical Sciences: Their Knowledge And Attitude To National Comprehensive Exam And Basic Medical Courses In Isfahan University Of Medical Sciences. Iranian Journal of Medical Education. 2000;1(1):21–5.
  13. Bordage G, Zacks R. The structure of medical knowledge in the memories of medical students and general practitioners: categories and prototypes. Medical education. 1984;18(6):406–16.
  14. Johnson PE, Durán AS, Hassebrock F, Moller J, Prietula M, Feltovich PJ, et al. Expertise and error in diagnostic reasoning. Cognitive Science. 1981;5(3):235–83.
  15. Norman GR, Brooks LR, Allen SW. Recall by expert medical practitioners and novices as a record of processing attention. Journal of Experimental Psychology: Learning, Memory, and Cognition. 1989;15(6):1166–74.
  16. Patel VL, Arocha JF, Kaufman DR. Expertise and tacit knowledge in medicine. Tacit knowledge in professional practice: Researcher and practitioner perspectives. Mahwah, NJ, US: Lawrence Erlbaum Associates Publishers; 1999. p. 75–99.
  17. Lesgold A, Rubinson H, Feltovich P, Glaser R, Klopfer D, Wang Y. Expertise in a complex skill: Diagnosing x-ray pictures. The nature of expertise. Hillsdale, NJ, US: Lawrence Erlbaum Associates, Inc; 1988. p. 311–42.
  18. Higgs J, Jones MA. Clinical Reasoning in the Health Professions: Butterworth-Heinemann; 2000.
  19. Patel VL, Evans DA, Groen GJ. Reconciling basic science and clinical reasoning. Teaching and Learning in Medicine. 1989;1(3):116–21.
  20. Donnon T, Violato C. Medical students' clinical reasoning skills as a function of basic science achievement and clinical competency measures: a structural equation model. Acad Med. 2006;81(10 Suppl):S120-3.
  21. Boshuizen HP, Schmidt HG. On the role of biomedical knowledge in clinical reasoning by experts, intermediates and novices. Cognitive Science. 1992;16(2):153–84.
  22. Schmidt HG, Boshuizen HP. On the origin of intermediate effects in clinical case recall. Memory & cognition. 1993;21(3):338–51.
  23. Schmidt HG, Norman GR, Boshuizen HP. A cognitive perspective on medical expertise: theory and implication. Acad Med. 1990;65(10):611–21.
  24. Krasne S, Stevens C. Does basic science knowledge correlate with clinical reasoning in assessments of first-year medical students? Procedia - Social and Behavioral Sciences. 2010;2:1287–94.
  25. Rikers RM, Loyens S, te Winkel W, Schmidt HG, Sins PH. The role of biomedical knowledge in clinical reasoning: a lexical decision study. Acad Med. 2005;80(10):945–9.
  26. Monajemi A, Adibi P, Soltani Arabshahi K, Arbabi F, Akbari R, Custers E, et al. The battery for assessment of clinical reasoning in the Olympiad for medical sciences students. Iranian Journal of Medical Education. 2011;10(5):1056–67.
  27. Berner ES, Hamilton LA, Jr., Best WR. A new approach to evaluating-problem-solving in medical students. Journal of medical education. 1974;49(7):666–72.
  28. van der Vleuten CP, Newble DI. How can we test clinical reasoning? Lancet (London, England). 1995;345(8956):1032–4.
  29. Ber R. The CIP (comprehensive integrative puzzle) assessment method. Medical teacher. 2003;25(2):171–6.
  30. Patel VL, Groen GJ. Knowledge based solution strategies in medical reasoning. Cognitive Science. 1986;10(1):91–116.
  31. Rikers RMJP, Schmidt HG, Moulaert V. Biomedical knowledge: encapsulated or two worlds apart? Applied Cognitive Psychology. 2005;19(2):223–31.
  32. Woods NN, Brooks LR, Norman GR. The value of basic science in clinical diagnosis: creating coherence among signs and symptoms. Medical education. 2005;39(1):107–12.