DOI: https://doi.org/10.21203/rs.3.rs-1976175/v1
Background
Various studies have investigated the impact of basic sciences knowledge on clinical diagnosis for skilled and novice physicians, but it remains a matter of controversy why and how clinical diagnosis and clinical reasoning are related to basic sciences knowledge.
Objective
This study aimed to explore the relation between basic sciences knowledge and clinical reasoning skills by evaluating the correlation between basic sciences course scores and clinical reasoning exam scores.
Methods
A clinical reasoning exam was used to evaluate the clinical reasoning skills of all medical interns of two internal medicine rotations. Scores were evaluated for correlation with basic sciences course scores and basic sciences national comprehensive exam results. Statistical analysis was performed using an independent t-test, stepwise regression, and the Pearson correlation coefficient.
Results
All basic sciences course scores and basic sciences national comprehensive exam results correlated positively and significantly with the total score of the clinical reasoning exam, and the clinical reasoning exam score was most closely correlated with the average basic sciences course score rather than with any individual course. The only predictive factor for clinical reasoning skills was the average basic sciences course score. No significant gender differences were observed in the data analysis.
Conclusion
The results of this study suggest that basic sciences knowledge is positively correlated with clinical reasoning skills, although the correlation is small to moderate depending on the course. Our findings support the view that learning basic sciences knowledge contributes to the formation of clinical reasoning skills.
Since the 1980s, cognitive scientists have held that the key differentiator between novices and experts is the experts' superior domain-specific knowledge. In medicine, this knowledge is divided into two categories: basic sciences knowledge, which describes the mechanisms of the human body, and clinical knowledge, which covers the signs and symptoms of disease and their relations(1). Abraham Flexner was one of the first to academically describe the importance of basic sciences in learning medical knowledge by creating a binary model for the medical education curriculum. He believed that an evidence-based approach to teaching medicine requires teaching basic sciences before clinical training(2, 3).
Clinical reasoning is a process of logical thinking that drives physicians' goal-directed diagnostic and therapeutic decisions and is present at all stages of patient evaluation(4–6). It is one of the most critical skills a physician acquires; it begins to develop early in medical school and continues throughout practice(7). Clinical reasoning is complex and consists of two forms of reasoning. Analytical reasoning breaks each clinical case into smaller parts to reach a diagnosis and a decision about the patient, and is often used by novice physicians. Non-analytical reasoning judges a clinical case as a whole, using the patterns and illness scripts the physician has accumulated over years of practice; experts use this form in combination with analytical reasoning, especially in complicated clinical cases(7, 8). Today, clinical reasoning tests have been developed as alternative assessments that evaluate both analytical and non-analytical reasoning(9–11), so clinical reasoning skills can be evaluated indirectly by written tests.
The most important concerns regarding basic sciences courses are that their content is forgotten over time and that it may not be applicable to learning clinical knowledge and clinical practice(12). The relation between basic sciences knowledge and physicians' clinical reasoning is an important issue because, since Flexner pointed out the importance of basic sciences in medical practice, insufficient evidence has been collected in this regard(4).
Although various studies have investigated the impact of basic sciences knowledge on clinical diagnosis for skilled and novice physicians, it remains a matter of controversy why and how clinical diagnosis and clinical reasoning are related to basic sciences knowledge(13–16). Some studies suggest that basic sciences knowledge directly affects clinical reasoning(17), while others suggest that physicians make little use of their basic sciences knowledge in the clinical diagnosis process(1, 17–19). Other researchers have criticized these theories and concluded that basic sciences knowledge affects clinical reasoning indirectly(20–25). Besides, few studies have considered the relations of the different basic sciences disciplines with clinical reasoning, or whether basic sciences knowledge can predict clinical reasoning skills.
The main purpose of the present study is to clarify the relation between learning different basic sciences disciplines and physicians' clinical reasoning by assessing the correlation of basic sciences course scores with medical interns' clinical reasoning exam scores. In addition, we intended to investigate whether any basic sciences course is a predictive factor for clinical reasoning skills.
A cross-sectional study was designed including all medical interns of two internal medicine rotations at Isfahan University of Medical Sciences from 2020 to 2021. The protocol of the current study was approved by the ethics committee of Isfahan University of Medical Sciences.
A total of 130 medical interns participated in this study. The collected data consisted of participants' basic sciences course scores (individually and as an average course score) and their basic sciences national comprehensive exam results. The basic sciences national comprehensive exam consists of 200 questions covering basic pathology, anatomy, physiology, immunology, histology, epidemiology, public health, biochemistry, medical nutrition, embryology, parasitology, psychology, genetics, medical physics, mycology, microbiology, virology, and Islamic theology. The basic sciences discipline scores collected and analyzed include histology, biochemistry, anatomy, embryology, physiology, microbiology, immunology, pathology, genetics, and parasitology-mycology. In addition, participants' clinical reasoning was measured by a comprehensive clinical reasoning exam.
After data gathering, the basic sciences course scores and basic sciences national comprehensive exam results were evaluated for correlation with the total score of the clinical reasoning exam (as a representative of clinical reasoning skills). In addition, we examined whether any of the basic sciences variables predict clinical reasoning skills.
Unlike common evaluations in medical education, clinical reasoning exams are designed to evaluate medical practice skills, and such tests are used to evaluate medical graduates in many parts of the world(9–11). These tests must have two important characteristics: they must be process-oriented and must discriminate between experts and novices(26). Today, the "multiple-instrument for multiple role" approach(26–28) is used to measure clinical reasoning skills, owing to the multifaceted nature of clinical reasoning. This approach is based on a model that divides clinical reasoning into three major components: information gathering, hypothesis formation, and hypothesis evaluation(26). Accordingly, the Key Features test (KF), the Script Concordance test (SC), and the Comprehensive Integrative Puzzle test (CIP) were applied as a multidimensional clinical reasoning exam. Each test evaluates different components of reasoning: KF for information gathering(11), and SC and CIP for hypothesis evaluation and formation(9, 29).
The clinical reasoning exam was administered as part of the students' final exam after approval by the internal medicine group's educational council. The questions covered common diseases of internal medicine, and the goal was to evaluate diagnostic (not therapeutic) reasoning. The exam included 10 questions of each question type (30 questions in total), designed by expert panels of internists and cardiologists, with an appropriate time allotted to each type (40 min for KF, 30 min for SC, 50 min for CIP). To prevent participant exhaustion from decreasing the reliability of the tests, the three question types were administered at three different times.
The KF and SC scoring systems are based on the frequency of the expert panels' opinions on each option: each option is weighted according to the extent of agreement on it among the panel members. In the KF test, each question contained 12 options, of which participants had to choose 4; the score of each question was the sum of the weights of the selected options, and the KF total was out of 1 in the present study. In the SC test, the options formed a 5-point Likert-type scale on which responders specify how a new piece of information affects a hypothesis: (-2) the hypothesis is almost eliminated; (-1) the hypothesis becomes less probable; (0) the information has no effect on the hypothesis; (+1) the hypothesis becomes more probable; (+2) it can only be this hypothesis. The total SC score was the sum of the question scores and was out of 1 in the present study.
The CIP scoring system was not weighted; the expert panels reached a common agreement on the answers. A complete answer in the CIP test was a correct matching between the components of patient history, physical examination, paraclinical data, and clinical judgment. The total CIP score was the sum of the question scores and was out of 2 in the present study.
Therefore, the total score of the clinical reasoning exam was calculated as the sum of the three test scores (KF, SC, CIP), out of 4.
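The aggregation described above can be sketched as follows. This is a hypothetical illustration only: the function and variable names, and the assumption that each question's raw maximum is 1, are ours, not the study's actual scoring code.

```python
# Hypothetical sketch of the exam score aggregation: KF and SC sub-tests are
# each rescaled to a total of 1, CIP to a total of 2, giving an exam total of 4.

def rescale(question_scores, out_of):
    """Rescale a sub-test's raw question scores so its total is `out_of`,
    assuming each question's raw maximum is 1."""
    return sum(question_scores) / len(question_scores) * out_of

def total_exam_score(kf, sc, cip):
    """KF (out of 1) + SC (out of 1) + CIP (out of 2) = exam total out of 4."""
    return rescale(kf, 1) + rescale(sc, 1) + rescale(cip, 2)

# A perfect performance on all 30 questions yields the maximum total of 4.0
print(total_exam_score([1.0] * 10, [1.0] * 10, [1.0] * 10))
```

Rescaling each sub-test to a fixed maximum keeps the three instruments' contributions to the total in the 1:1:2 proportion regardless of how many questions each contains.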
Statistical analysis was performed using an independent t-test, stepwise regression, and the Pearson correlation coefficient. The software used was the Statistical Package for the Social Sciences (SPSS), version 21, and the level of statistical significance was defined as p < 0.05.
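The Pearson correlation and independent t-test named above can be reproduced in a few lines of Python. This is a minimal sketch on synthetic data; the numbers drawn below are illustrative placeholders, not the study data.

```python
# Minimal sketch of the analyses named above, run on synthetic data
# (the scores below are illustrative draws, not the study data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 112
basic_avg = rng.normal(16.4, 1.3, n)                    # average basic sciences score (out of 20)
reasoning = 0.09 * basic_avg + rng.normal(0.7, 0.3, n)  # clinical reasoning score (out of 4)
gender = rng.integers(0, 2, n)                          # 0 = male, 1 = female

# Pearson correlation between the basic sciences average and the reasoning score
r, p = stats.pearsonr(basic_avg, reasoning)

# Independent t-test for a gender difference in reasoning scores
t, p_gender = stats.ttest_ind(reasoning[gender == 0], reasoning[gender == 1])

print(f"r = {r:.2f} (p = {p:.3f}); gender difference: t = {t:.2f} (p = {p_gender:.3f})")
```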
Of the 130 medical interns who participated, 18 were excluded due to missing basic sciences course scores, leaving 112 participants (54 males, 48.2%; 58 females, 51.8%) for analysis. Almost all were in the same age group. The means and standard deviations of the individual basic sciences course scores, the average basic sciences course score, the basic sciences national comprehensive exam results, and the total clinical reasoning exam score are shown in Table 1.
Table 1. Means and standard deviations of scores by gender.

| Score | Gender | N | Mean | Std. Deviation |
|---|---|---|---|---|
| Histology | male | 54 | 15.74 | 2.28 |
| | female | 58 | 15.96 | 2.23 |
| Biochemistry | male | 54 | 14.28 | 2.18 |
| | female | 58 | 14.32 | 2.08 |
| Anatomy | male | 54 | 16.17 | 2.26 |
| | female | 58 | 16.21 | 2.04 |
| Embryology | male | 53 | 14.68 | 2.28 |
| | female | 58 | 14.79 | 2.42 |
| Physiology | male | 54 | 15.59 | 1.64 |
| | female | 58 | 15.77 | 1.52 |
| Microbiology | male | 54 | 15.21 | 1.79 |
| | female | 58 | 15.68 | 1.55 |
| Immunology | male | 54 | 15.39 | 2.21 |
| | female | 58 | 15.79 | 1.87 |
| Pathology | male | 54 | 15.42 | 1.66 |
| | female | 58 | 15.86 | 1.56 |
| Genetics | male | 54 | 14.11 | 2.16 |
| | female | 58 | 14.55 | 1.65 |
| Parasitology and mycology | male | 54 | 15.32 | 2.03 |
| | female | 58 | 16.25 | 2.00 |
| Average Score of Basic Sciences Course (out of 20) | male | 54 | 16.25 | 1.43 |
| | female | 58 | 16.52 | 1.17 |
| Average Score of Basic Sciences National Comprehensive Exam (out of 200) | male | 54 | 125.94 | 17.03 |
| | female | 58 | 134.62 | 19.17 |
| Total Score of Clinical Reasoning Exam (out of 4) | male | 54 | 2.10 | 0.30 |
| | female | 58 | 2.15 | 0.36 |

Note. All basic sciences course scores are out of 20.
As shown in Table 2, all variables correlated positively and significantly with the total score of the clinical reasoning exam. The highest correlation was between the total clinical reasoning exam score and the average basic sciences course score, and the lowest was with the embryology course. However, none of the scores showed a large correlation with the clinical reasoning exam: r values of 0.1 to 0.5 indicate small to medium positive correlations.
Table 2. Pearson correlations between basic sciences scores and the total score of the clinical reasoning exam.

| Score | Pearson Correlation (r) | Sig. (2-tailed, p) |
|---|---|---|
| Histology | .35* | .00 |
| Biochemistry | .33* | .00 |
| Anatomy | .28* | .00 |
| Embryology | .19* | .03 |
| Physiology | .31* | .00 |
| Microbiology | .26* | .00 |
| Immunology | .22* | .01 |
| Pathology | .23* | .01 |
| Genetics | .29* | .00 |
| Parasitology and mycology | .25* | .00 |
| Average Score of Basic Sciences Course | .37* | .00 |
| Average Score of Comprehensive Exam | .32* | .00 |

*. Correlation is significant at the 0.05 level.
According to the stepwise regression analysis, the only predictive factor of clinical reasoning skills was the average score of the basic sciences course. This regression model explains 13.5% of the variance in the total clinical reasoning exam score, and the model was statistically significant (Table 3).
Table 3. Stepwise regression predicting the total score of the clinical reasoning exam.

| Model 1 | B | Std. Error | Beta | t | Sig. |
|---|---|---|---|---|---|
| (Constant) | .56 | .38 | | 1.47 | .14 |
| Average Score of Basic Sciences Course | .09 | .02 | .36 | 4.12 | .00 |

R Square = .13; Adjusted R Square = .12; F = 17.01, p < 0.01.
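The stepwise selection used here (SPSS's stepwise method) can be approximated with a short forward-selection routine. The sketch below is a simplification under our own assumptions: it runs on synthetic data with illustrative variable names, and it omits the backward-removal step that SPSS's full stepwise procedure includes.

```python
# A minimal forward-stepwise selection sketch: greedily add the predictor that
# most improves R-squared, keeping it only while its partial F-test is
# significant. Data and variable names below are synthetic/illustrative.
import numpy as np
from scipy import stats

def forward_stepwise(X, y, names, alpha=0.05):
    """Return the names of predictors retained by forward selection."""
    n = len(y)
    selected, remaining = [], list(range(X.shape[1]))

    def r2(cols):
        # R-squared of an OLS fit of y on an intercept plus the given columns
        A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
        resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
        return 1.0 - resid.var() / y.var()

    while remaining:
        base = r2(selected)
        best = max(remaining, key=lambda c: r2(selected + [c]))
        new = r2(selected + [best])
        df2 = n - len(selected) - 2             # residual df with the new predictor
        F = (new - base) / ((1.0 - new) / df2)  # partial F for the added predictor
        if stats.f.sf(F, 1, df2) >= alpha:      # stop when the addition is not significant
            break
        selected.append(best)
        remaining.remove(best)
    return [names[c] for c in selected]

# Synthetic example: only the first column truly predicts y, so only it survives
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=100)
print(forward_stepwise(X, y, ["average score", "histology", "biochemistry"]))
```

The stopping rule mirrors the study's finding: once the strongest predictor (here, the first column; in the study, the average basic sciences course score) is in the model, the remaining candidates add no significant explanatory power and are excluded.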
It should be noted that there were no significant gender differences in clinical reasoning exam scores or in their correlations with basic sciences course scores.
The main findings of the present study are: (1) there is a positive correlation between basic sciences knowledge and clinical reasoning skills; (2) clinical reasoning skills are most closely correlated with the average basic sciences course score rather than with any individual course; and (3) the only predictive factor for clinical reasoning skills is the average basic sciences course score. We also found no significant difference between genders in clinical reasoning skills.
Our results contrast with the study of Patel and colleagues(30), who concluded that medical experts mainly use their clinical knowledge, rather than their basic sciences knowledge, in clinical diagnosis. They argued that basic sciences knowledge and clinical sciences knowledge are two "distinct worlds". However, Patel and colleagues evaluated only the direct relation between basic sciences knowledge and the clinical reasoning process; the two may still correlate indirectly through the organization and interpretation of clinical findings.
Our finding that basic sciences knowledge correlates with clinical reasoning skills confirms the earlier reports of Schmidt and colleagues(31) and their "knowledge encapsulation" theory. They suggested that basic sciences knowledge becomes encapsulated in clinical concepts during the medical practice of experts. However, Schmidt's encapsulation theory does not clearly describe a cause-and-effect relationship between basic sciences knowledge and clinical knowledge, so the value of this encapsulation remains unclear.
One criticism of most previous studies is that they did not examine the effect of the passage of time on the relationship between basic sciences knowledge and clinical reasoning. Our study accounts for this issue, because there is a gap of at least four years between our basic sciences course data and the clinical reasoning exam. Woods and colleagues(32) also considered this issue and concluded that physicians who understand basic sciences mechanisms are better at retaining and retrieving clinical diagnostic data over time (after one week in that study). However, they did not find a significant difference in performance on the immediate diagnostic test.
One of the key points of the current study is its examination of the relationship between clinical reasoning skills and each basic sciences discipline in addition to the average score. The correlations between clinical reasoning skills and the basic sciences scores, ordered from highest to lowest, were: the average basic sciences course score, histology, biochemistry, the basic sciences national comprehensive exam average, physiology, genetics, anatomy, microbiology, parasitology and mycology, pathology, immunology, and embryology. The highest correlation, with the average basic sciences course score, can be explained by the larger number of evaluations involved in forming the average; more evaluations lead to a higher correlation. The lowest correlation, with the embryology course, can be explained by its lesser application in clinical knowledge. However, none of the scores had a large correlation with clinical reasoning skills. This confirms the report of de Bruin and colleagues(1), who suggested that although basic sciences knowledge correlates with diagnostic reasoning, it does so through its relation with clinical knowledge; moreover, basic sciences knowledge plays a less prominent role in diagnostic reasoning.
That the average basic sciences course score is the only predictive factor for clinical reasoning skills can be explained by the cumulative effect of the scores, which provides more comprehensive information. In other words, the accumulation of basic sciences data predicts the relation more accurately.
The study has some potential limitations. The unfamiliarity of students and faculty members with clinical reasoning tests created many challenges in designing and implementing the tests; we tried to diminish this negative effect by preparing a proper and detailed test guide. Since the clinical reasoning exam was held as part of the final exam, we faced limits on the number of questions and on the time available, which may affect the validity and reliability of the tests. In addition, the students of the two internal medicine rotations had taken different basic sciences national comprehensive exams, which can affect the accuracy of the correlation between the comprehensive exam results and the clinical reasoning exam. Finally, the long duration of the internal medicine rotation limited enrollment, so the limited number of students may affect the reliability of the results.
The authors also suggest that other researchers apply this method to physicians (instead of students) to differentiate between experts and novices.
The results of this study suggest that, within its limitations, basic sciences knowledge is positively correlated with clinical reasoning skills, although the correlation is small to moderate depending on the course. Clinical reasoning skills were more closely correlated with the average basic sciences course score than with any individual course, and the average basic sciences course score was the only predictive factor for clinical reasoning skills among the basic sciences data. Our findings support the view that learning basic sciences knowledge contributes to the formation of clinical reasoning skills.
Acknowledgments
The authors gratefully acknowledge the ongoing support of the Internal Medicine department at the Isfahan University of Medical Sciences.
Ethics approval and consent to participate
Ethical approval has been granted by the ethics committee of Isfahan University of Medical Sciences for this study, IR.MUI.RESEARCH.REC.1399.114, on May 27, 2020. All methods were carried out under the relevant guidelines and regulations. Verbal and written informed consent were obtained from all participants. Participants were informed about the anonymized use of their data before the start of the study.
Consent for publication
Not applicable.
Availability of data and materials
The datasets generated and/or analyzed during the current study are not publicly available due to compromising the privacy of the participants but are available from the corresponding author on reasonable request.
Competing interests
The authors declare that no competing interests exist.
Funding
The study was supported by the Isfahan University of Medical Sciences (Grant ID: 398927).
Authors' contributions
AO and MRG were responsible for the conception and design of the research study, although all of the authors contributed to the study design. FM and KH were in charge of the expert panels for designing the clinical reasoning exam, and FM was responsible for conducting the exam. MRG contributed to data collection and was responsible for drafting the primary manuscript and its subsequent revisions. AN was responsible for data analysis and interpretation. AO coordinated and supervised the implementation of the study. All authors participated in a critical review of the primary manuscript and have read and approved the final manuscript.
Authors' information
1 Medical student, Isfahan University of Medical Sciences, Isfahan, Iran
2 Associate professor of nephrology, Isfahan Kidney diseases research center, Isfahan University of Medical Sciences, Isfahan, Iran
3 Assistant professor of cardiology, Heart Failure Research Center, Cardiovascular Research Institute, Isfahan University of Medical Sciences, Isfahan, Iran.
4 Assistant professor, Department of Medical Education, Medical Education Research Center, Education Development Center, Isfahan University of Medical Sciences, Isfahan, Iran.
5 Associate professor, Medical Education Research Center, Department of Medical Education, Isfahan University of Medical Sciences, Isfahan, Iran