A systematic review to investigate student performance and their learning experiences within integrated healthcare curricula

Abstract
Background Integrated curricula are being adopted within healthcare programmes, with a growing number of curriculum evaluations being undertaken and reported. A framework exists to guide educationalists in the planning, design and implementation of integrated curricula, but it is unknown how widely this framework has been utilised to inform curriculum development. This study presents a systematic appraisal of the evidence on how healthcare students experience and perform within integrated interventions, and aims to assess how the context of the curriculum has influenced these outcomes. Methods Six electronic databases (Medline, Embase, Scopus, PsycINFO, CINAHL and ProQuest) were systematically searched in September 2018. Studies reporting on undergraduate healthcare students providing feedback, or performing at terminal assessment, after experiencing integrated curricula were included. The assessments were categorised by the learning domain they aimed to test. Studies were assessed for methodological quality and risk of bias using a critical appraisal checklist, and were appraised against the implementation framework to facilitate contextual understanding of each intervention and its findings. Results Forty studies from programmes in medicine, pharmacy and dentistry met the inclusion criteria and were included. Interdisciplinary-level integration was most widely adopted, with a wide range of teaching and learning strategies employed in its delivery. Assessments testing higher learning domains, e.g. application and analysis of knowledge, were more commonly reported. Students appear to perform similarly or modestly better after experiencing integrated education; however, the adopted study designs preclude the deduction of a direct causal relationship. Students report generally positive feedback on their integrated experiences, claiming the development of a wide range of skills.
However, authors provide insufficient detail about the integrated educational developments to best inform future educationalists on the most suitable systems for curriculum integration. Conclusions There is momentum in research proposing best practices in curriculum integration; however, more standardised, evidence-informed design and reporting of interventions and outcomes are required to strengthen the evidence in this area.

Background
Educationalists involved in undergraduate healthcare programmes employ a range of pedagogical strategies to best prepare students for vocational practice, where they will be required to utilise metacognitive strategies and apply scientific and clinical knowledge to provide appropriate patient care. Curriculum integration (CI) is one strategy that has been suggested to support this [1].
Harden's integration ladder illustrates a theoretical framework for CI. This framework outlines the spatial, temporal and contextual arrangement of discipline-specific to non-discipline-specific teaching [2]. Theories of learning suggest that integrated education may benefit learning and retention by facilitating contextual and applied learning, and can promote the development of well-organised knowledge structures that underlie effective clinical reasoning [3,4].
Goldman and Schroth present a comprehensive framework to guide and inform the conceptualisation, design and development of integrated curricula. This methodology is recommended to ensure that the implementation of integrated curricula is recognised as a strategy for successful curricular development, rather than a goal in itself [5]. It is not clear how widespread the adoption of such an approach has been, nor whether CI, as a developmental strategy, has helped educators achieve their initial purpose for the outcomes of their learners. Similarly, there is little empirical evidence to demonstrate that integrated learning leads to more capable healthcare professionals. Harden's continuum of CI highlights the level of subjectivity and diversity involved [2]. This means that, without systematic analysis and synthesis of individual studies evaluating integrated interventions, studies are limited to individual reports of best practices, and a holistic approach focusing on best systems is forgone.
This systematic review aims to appraise the evidence of these individual integrated educational activities within healthcare programmes in relation to the impact on the students' learning experience and performance.

Methods
The work was planned, conducted and reported in adherence to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [6] quality standards for reporting systematic reviews. All steps were undertaken independently by two authors and adjudicated by a third where required.

Study identification
A modified version of the SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) framework proposed by Booth [7] was used to focus the literature search and ensure relevance and accuracy. The authors excluded the comparison element, since this could significantly limit the literature to be included. Search terms are detailed in Table 1.

Study eligibility
The PICO tool was used to frame the eligibility criteria. Studies written in English that investigated how undergraduate healthcare students (Population) experiencing teaching and learning activities integrated across disciplines (Intervention) learnt, perceived or performed (Outcomes) as a consequence were included. Studies did not have to include a control or comparator group in order to be included. The included studies focused on: curricula described as integrating horizontally or vertically; spiral curricula; curricula that aimed to integrate discipline-specific knowledge; curricula investigating how students become 'integrative thinkers'; and curricula that included problem-based or team-based learning.
Studies focusing on modular, discipline-specific or segregated curricula; on the development of curricula with no reference to student experience; or on graduate employment were excluded.

Data extraction and synthesis
Data extraction included: study details; study settings; study design; details of the student knowledge and/or skill assessment; and comments on the findings reported, categorised by student performance and student feedback. This information is included within Additional File 1.
The framework for rational application of integration, as conceptualised by Goldman and Schroth [5], was used to assess the curriculum development decision, i.e. programme (entire educational provision), course (discrete units of study forming a component part of a programme) or session (individual teaching and learning sessions) level integration; and integration decisions, which included: purpose for integration; dimensions of integration (horizontal/vertical); teaching environments, and the underlying principles of integration.
The level of integration was assessed from the curricular descriptions (where provided) and categorised according to Harden's ladder [2]. The level of assessment(s) (where available) was interpreted or abstracted from the educational descriptions and categorised according to Bloom's taxonomy [8]. These data are presented in Additional File 2.

Quality assessment
Included studies were appraised using the JBI (Joanna Briggs Institute) Critical Appraisal of Evidence Effectiveness Tool [9], and the results are included in Additional File 3. Specifically, the tools for case-control, quasi-experimental, cross-sectional, qualitative and randomised controlled trial studies were used.

Results
A total of 40,526 articles were identified from the initial searches. Through screening and application of the eligibility criteria, as shown in the PRISMA diagram (Figure 1), 40 studies were included in the review.
Determining the dimension of integration was not always clear from the descriptions provided, but in most cases it was discernible that many of the strategies were designed horizontally, across a semester or a single academic year. In twelve cases, the integrated curricula were described as combining horizontal and vertical integration [12,22,34-36,39,40,43-45,47,49].
A range of teaching strategies was adopted to deliver the integrated interventions.

Outcomes

Performance
Thirty-three studies [10,14-18,21,23-26,28-33,35,38,40-45] reported on aspects of student performance after experiencing the integrated educational activity. Nine studies compared the performance of students experiencing the integrated curriculum to students in previous cohorts experiencing a less integrated programme [10,17,21,28,29,31,35,43,45]. These comparisons report more A and B grades [18], a higher percentage of A grades [29,31], higher mean averages [10,30] and lower failure rates [21]. Three further studies report that no significant difference in performance was found between students experiencing the traditional, less integrated curriculum and those experiencing the newer, integrated activities [35,43,45]. Two studies were able to investigate the potential impact on performance for a group of students who experienced the integrated curriculum compared to a control group [18,33]. Surapeneni and Gaddam et al. demonstrated that interdisciplinary learning achieved significantly higher performance across five evaluation assessments [18] and in two examinations [33], respectively. Other studies evidence that, following an integrated curriculum, students demonstrate improved knowledge [23,40,41], competency [24] and better clinical reasoning [25]; however, there is no comparator group in these cases. Marshall et al. detail that students did not score significantly differently on multiple-choice questions pertaining to material covered in the integrated module compared to the same material taught in a non-integrated manner [27]. Three studies investigated the performance at national professional exams (USA) of graduates from an integrated curriculum compared to the national average.
Students in two of these studies demonstrated statistically significantly better performance than the national average [26,30], while one study showed that the performance of graduates from an integrated programme was similar to the national average [32]. Two studies conducted in the Netherlands found that students who had experienced an integrated programme were more likely to have made a definitive career choice before graduating medical school [48,49]. More of these students were also invited to fill specific residency positions in comparison to non-integrated graduates [48], and they required less time to achieve acceptance onto a residency programme, submitting fewer applications for such posts [49].
Studies also provide student opinion of improved preparation for practice [15,24,25,39,48,49]. Further benefits captured include improved self-directed learning [13,47], a deeper approach to learning [16] and improved motivation [17,36]. Three studies found that students receiving integrated curricula felt more prepared for university and external exams [20,25,26]; however, in other studies, students were undecided [28] or felt that the integrated approach did not help prepare them for those assessments [19]. Challenges reported by students included: the need for more careful oversight from faculty staff for consistency and curricular alignment [11]; support to introduce students to core knowledge before integrating the curriculum [11]; and clarity in the depth of knowledge outcomes expected of students [10,11,22]. Some students also articulated the key role of the academic in facilitating the consolidation and appropriate application of knowledge [11,18].
Few studies aimed to explore the in-depth experience of students learning within an integrated curriculum. Laksov et al. report that students perceived integration as the creation of wholeness, relating new knowledge to core concepts, and as a collaboration between teachers [12]. Medical students reported that applied exercises during the programme are required to facilitate knowledge retention [36]. Coleho et al. evidence that students require time to build an understanding of an integrated curriculum and can experience confusion before they gain clarity [22].

Study bias assessment
Across the cross-sectional studies [10,13-16,19,20,22-29,31,32,36-38,41-43,46], confounding factors were not taken into account and therefore their influence was not controlled when measuring outcomes. In the qualitative studies [11,12,36,37], the most common deficiency was that the researchers failed to report or account for their theoretical position in relation to the research conducted. Studies that adopted a quasi-experimental approach [17,21,30,34,35,39,40,44,45,47-49] were mixed with regards to risk of bias; the key problematic issues related to the lack of a control group, the lack of multiple outcome measures pre- and post-intervention, and little, if any, detail on completion and differences in follow-up. The one randomised controlled trial [18] had a high risk of bias due to the lack of information provided around sample randomisation, blinding and follow-up. Lastly, in the one case-control study [33], there was no matching of cases with controls, and no confounding factors were identified or managed.

Discussion
This review analyses the current literature about how students experience and perform within integrated curricula. The included studies demonstrate the diversified approach to the design and delivery of an integrated curriculum. Programme- and course-level integrated strategies are clearly popular, and the interdisciplinary approach [2] appears to be the most widely adopted. A wide range of active, student-centred teaching and learning approaches was described to facilitate the delivery of the integrated educational activities, highlighting the low dependence on didactic lectures as a lone teaching strategy. Mennin describes how teaching methods demanding higher levels of interactivity, based on constructivist and socio-constructivist learning theories, are more likely to disturb the cognitive status quo, stimulate self-organisation (integration) and promote the emergence of learning [50]. Kulasegaram et al. also recommend learner-centricity, rather than a programme-design focus, in developing cognitive integration of knowledge [51].
Unfortunately, there was insufficient reporting of the educational interventions to discern the main aims and objectives driving adoption of CI and its dimensionality (horizontal, vertical, etc.). This reduces the potential to determine whether the intended student outcomes are achieved with the redeveloped teaching and learning provision, and limits the usability of the findings to inform future educationalists with curriculum design and development interests. Where included, the educationalists describe their main motivation as being to augment a student-centred learning environment conducive to skill and knowledge development, specifically across disciplines. This relates more closely to the learning environment perspective of integration than to the cognitive perspective.
This former view of integration is described where the learning environment is the focus to facilitate social, collaborative and cognitive dimensions of student learning [52].
There was also an absence of specific knowledge-based outcomes articulated as goals for curriculum development. This can be facilitative of integrative learning, since more general, ability-based outcomes transcend disciplinary boundaries and emphasise the importance of abilities to synthesise, apply and communicate knowledge in complex, real-world circumstances [53]. Pearson et al. describe the differing conceptions of curriculum as the espoused (the intention of programme planners), the enacted (the learning and assessment activities implemented by instructors) and the experienced (how the educational activities are taken up by students) [53]. In the recording and reporting of educational interventions, potentially using the framework from Goldman and Schroth [5], details of these concepts would add significant contextual value, helping to understand the potential convergence and divergence between them and the subsequent impact of the intervention.
Similarly, not all studies included details of the assessments associated with the curricula.
It is well acknowledged that assessment drives student cognitive development and motivates learning [54]. Moreover, curriculum evaluation studies have demonstrated that content and assessment, rather than the teaching strategies employed, are the deciding factors for successful integration [55,57]. Malik and Malik recommend co-designing assessment alongside the integrated teaching and learning to ensure the most successful implementation and impact on student learning [57]. This emphasises the contextual significance of reporting the forms of assessment if any investigation of student performance and/or feedback is intended. Where this detail was included in the reviewed studies, assessments appeared to assess students' ability to apply knowledge, at the very least. Some studies reported assessments probing students' capacity for higher-order thinking; however, more ubiquitous inclusion is required across pedagogic research to further understand the correlation with student experience and performance. It is recommended that assessment should measure at the levels of synthesis and evaluation if higher levels of integrative thinking are the intended student outcome [57]. Kulasegaram et al. further detail this by stressing the importance of sophisticated outcome measures to assess students' ability to apply, explain and evaluate conceptual problems, rather than simply the recall of facts [58]. A recent guide offers some suggestions for appropriate forms of assessing integration, which are supported with empirical evidence [1].
The systematic assimilation of student outcomes herein demonstrates that there is some evidence of similar or improved student performance at university and nationally set assessments after experiencing integrated curricula, when compared to less integrated teaching and learning. However, there is generally a lack of controlled studies verifying causal relationships between the educational exposure and academic performance. Most studies compared performance to historical cohorts, or to performance in assessments associated with non-integrated aspects of the programme.
Otherwise, the integrated learning activities have been shown to facilitate the development of a range of cognitive skills and abilities. These findings concur with those reported by Brauer & Ferguson, where non-inferior, or some objective benefits have been found [1]. Husband et al describe that well-designed, longitudinal studies including graduates' early years in practice will allow assessment of changes in knowledge and competence, and potential impact on practice and patient outcomes [59].
The students across the reviewed studies are generally favourable towards the learning experiences and acknowledge their improvement in a further range of skills and competencies. Similarly, students report preparedness for practice and the development of the behaviours required of a life-long learner. The qualitative studies provide some insight into how students perceive and experience integrated curricula, which is supported by the well-acknowledged learning theories of Vygotsky and Piaget [3,60]. This review is strengthened by the systematic approach adopted in assessing the detail of the educational interventions and the nature of assessment and measurement of outcomes. It is, however, limited by the quality and depth of this information provided in many cases. For such pedagogical research to contribute empirically to the evidence base for integrated education, and to meaningfully inform future educationalists, interventions need to be designed, evaluated and reported with this holistic aim in mind.

Conclusions
This review interrogates the plethora of research investigating the impact of integrated curricula upon students within healthcare programmes. Student reception of integrated provision was generally positive, with a reported array of developed knowledge and skills.
However, evidence for better student performance at terminal assessment is limited by the nature of the adopted evaluation designs, which are vulnerable to bias, and by the lack of depth in reporting. More thorough, theoretically framed educational design, development and evaluation is necessary to move beyond the singular, study-by-study approaches widely portrayed within the scholarly literature.

Competing interests
None to report.

Funding
No funding was obtained for this study.

Availability of data and materials
Data sharing is not applicable to this article as no datasets were generated or analysed during the current study. All databases used in this study are open to the public.

Authors' contributions
All authors were involved in the conception of the research question. AK and HN undertook the systematic search and with ZN, undertook the data abstraction. Quality assessment was undertaken by HN and ZN. All authors contributed to the analysis of the results, and preparation of the written manuscript.
Ethics approval and consent to participate Not applicable.

Consent for publication
Not applicable.