We will follow the methodological framework for scoping reviews originally described by Arksey and O’Malley [32] and later advanced by Levac et al. [33] and Khalil et al. [34]. The framework consists of five stages: 1) identifying the research question by clarifying and linking the purpose and research question, 2) identifying relevant studies using a three-step literature search to balance breadth and comprehensiveness, 3) careful selection of studies using a team approach, 4) charting the data in tabular and narrative format, and 5) collating the results to identify the implications of the study findings for policy, practice or research [34].
We drafted the protocol using the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) checklist (Additional file 1) [35]. For the scoping review we plan to follow the newly developed reporting guideline for scoping reviews, the PRISMA Extension for Scoping Reviews (PRISMA-ScR) [36]. We have registered the protocol in the Open Science Framework. Due to the iterative nature of scoping review methodology, changes to the protocol may occur; we will report any such changes.
Eligibility and exclusion criteria
We will include research publications that involve radiography students, faculty in radiography education and/or clinicians, and publications that describe and/or evaluate any type of simulation intervention within any type of professional radiography specialization. All empirical and theoretical/conceptual peer-reviewed publications and grey literature that focus on simulation in radiography education will be considered for inclusion. We will exclude publications with non-research study designs (e.g. editorials, discussion/opinion papers, guidelines, letters and non-systematic reviews). All empirical and conceptual publications must have an abstract and a clearly stated aim. We will apply no restrictions on language, year or publication status. In line with the Joanna Briggs Institute Reviewer's Manual [37], the detailed inclusion criteria of this scoping review are specified in terms of Population, Concept, Context and Types of sources of evidence (Table 1).
Table 1. Study eligibility (inclusion criteria)

Population
· Radiography students, both undergraduate and postgraduate
· Faculty in radiography education
· Radiography clinicians and clinical supervisors/clinical educators/instructors

Concept
· All types of simulation (e.g. integrated simulators, simulated patients, simulated environments, virtual reality and haptic systems, computer-based systems, part-task trainers)
· All types of learning designs/pedagogical methods
· All types of learning outcomes (e.g. knowledge, skills, competence, generic skills, attitudes, self-efficacy)
· All types of patient outcomes

Context
· Institutions educating radiographers (higher education institutions/universities, simulation labs or centers, hospitals)
· Professional radiography specializations within the medical radiation sciences: digital/conventional radiography, interventional radiography, computed tomography, magnetic resonance imaging, radiation therapy, medical dosimetry, mammography, sonography/ultrasound, nuclear medicine

Types of sources of evidence
· All empirical and theoretical/conceptual peer-reviewed publications and grey literature that focus on simulation in radiography education
Search strategy and information sources
We will search MEDLINE, Embase, Epistemonikos, The Cochrane Library, ERIC and Scopus. To identify grey literature, we will search OpenGrey and Google Scholar. We will search the reference lists and citations of included studies to identify additional, relevant references. The searches will be re-run just before the final analyses to retrieve further studies for inclusion.
We developed a comprehensive search strategy for Ovid MEDLINE in collaboration with an experienced research librarian. A full electronic search using search terms for simulation in radiography education in Ovid MEDLINE yielded 1641 articles on January 9, 2020 (Additional file 2). We will adapt the Ovid MEDLINE search strategy to each of the other databases. As the search strategy for scoping reviews is considered an iterative process [32-34], we will evaluate the initial search results and identify needs for improvement during the review process. Records will be exported to EndNote X9 [38] to enable data management, removal of duplicates and retrieval of full texts.
Study selection
For the selection of eligible studies we will use the Rayyan screening tool [39]. Based on the inclusion criteria (Table 1), two review authors will independently screen titles and abstracts of the retrieved records. The same two review authors will then independently assess the acquired full-text publications for eligibility. Any disagreements regarding eligibility will be resolved by discussion between the two review authors, with a third reviewer resolving disagreement if needed. Reasons for excluding full-text articles will be reported in an appendix. To ensure consistency and reliability in the study selection process, we will pilot the study selection on a random sample of 50 titles and abstracts from the literature search before formal screening begins, and we will revise the procedure until we achieve a percentage agreement of >80% across reviewers. The selection process will be illustrated in a flow chart, as recommended in the PRISMA statement [40].
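The agreement threshold used in the screening pilot amounts to a simple proportion of matching include/exclude decisions. A minimal sketch of that calculation, assuming hypothetical reviewer decisions (only the 50-record pilot size and the >80% threshold come from the protocol):

```python
# Illustrative sketch: percentage agreement between two reviewers
# during pilot screening of 50 titles/abstracts (hypothetical data).

def percent_agreement(decisions_a, decisions_b):
    """Share of records on which both reviewers made the same
    include/exclude decision, expressed as a percentage."""
    assert len(decisions_a) == len(decisions_b)
    matches = sum(a == b for a, b in zip(decisions_a, decisions_b))
    return 100 * matches / len(decisions_a)

# Hypothetical pilot: 50 records, reviewers disagree on 7 of them.
reviewer_a = ["include"] * 20 + ["exclude"] * 30
reviewer_b = ["include"] * 27 + ["exclude"] * 23

agreement = percent_agreement(reviewer_a, reviewer_b)
print(f"{agreement:.0f}%")   # 86%
proceed = agreement > 80     # protocol threshold before formal screening
```

More sophisticated chance-corrected statistics (e.g. Cohen's kappa) exist, but the protocol specifies raw percentage agreement, which is what the sketch computes.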
Data charting process
Based on the population, concept, context and types of sources of evidence as outlined in Table 1, the research team will develop a data extraction form using spreadsheets. Prior to the full data extraction, we will pilot the data extraction form using a sample of 10 studies to determine agreement within the research team, and as such, this will be an iterative process. Two review authors will then independently read and extract data from each included study using this data extraction form. In line with the purpose of this scoping review we plan to extract the following data:
- Population: study population (student, clinician, faculty), age, sex, level of education (year of study, undergraduate, postgraduate), inclusion and exclusion criteria, needs assessment (e.g. equipment, human resources), number of participants in intervention group/control group, sample size, data about previous experience with simulation.
- Concept: type of intervention (scenario/task/activity, facilitative approach (pre-briefing and debriefing/feedback), manikin or standardized patient intervention, virtual reality), overall aim of the simulation (learning outcomes), type of skills, assessment after training (formative or summative evaluation), pedagogical rationales, integration in curriculum (yes/no), duration (hours), fidelity (equipment, environmental and psychological fidelity), settings (educational institution, healthcare institution or other), comparator, type of outcomes (educational, patient), cost measures used (yes/no) and type of cost measures.
- Context: type of institution performing the simulation, type of radiography specializations.
- Types of sources of evidence: title, year of publication, volume, author, country, study objective/purpose, type of study, research method (design, number of study participants/sample size, data collection), results, conclusions.
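The spreadsheet-based charting form described above could be sketched as one row per included study, with one column per extracted item. The column names below are a hypothetical condensation of the Population, Concept, Context and Source fields listed in the protocol, not the team's actual form:

```python
import csv

# Illustrative charting form: column names condense the extraction items
# listed in the protocol and are hypothetical, not the protocol's own form.
FIELDS = [
    # Types of sources of evidence
    "title", "year", "authors", "country", "study_objective", "study_design",
    # Population
    "population", "sample_size", "education_level", "prior_simulation_experience",
    # Concept
    "simulation_type", "learning_outcomes", "fidelity", "duration_hours",
    "integrated_in_curriculum", "comparator", "cost_measures",
    # Context
    "institution_type", "radiography_specialization",
]

def new_record(**values):
    """Return one charting-form row; fields not yet extracted stay empty."""
    unknown = set(values) - set(FIELDS)
    if unknown:
        raise ValueError(f"Unknown fields: {unknown}")
    return {field: values.get(field, "") for field in FIELDS}

# Hypothetical entry for one included study.
row = new_record(title="VR simulation in CT education", year="2019",
                 simulation_type="virtual reality", population="students")

with open("charting_form.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(row)
```

Fixing the field list up front, then rejecting unknown keys, mirrors the protocol's piloting step: disagreements over what a column means surface before full extraction begins.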
Analysis of the evidence
In this review, we aim to present an overview and a narrative account of all included studies. We will present our results in two ways. First, we will quantitatively summarize the data on the extent, nature and distribution of studies; this simple numerical analysis will provide an overview and help identify knowledge gaps. Second, we will use content analysis [41] to map the different simulation interventions and learning design elements reported (e.g. teaching and learning activities, curriculum, pedagogical theory, assessment strategies and learning outcomes). Reporting guidelines for interventions [e.g. 42] will be used to structure the presentation of the reported interventions, and we will use Kirkpatrick’s four-level model [43] as a framework for analysing the different learning outcomes reported.
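The numerical analysis described above reduces to frequency counts over the charted data. A minimal sketch, assuming hypothetical charted records with field names of our own invention (the counting categories echo Table 1):

```python
from collections import Counter

# Hypothetical charted records; only the fields needed for counting.
studies = [
    {"year": 2018, "simulation_type": "virtual reality",
     "specialization": "computed tomography"},
    {"year": 2019, "simulation_type": "simulated patients",
     "specialization": "radiation therapy"},
    {"year": 2019, "simulation_type": "virtual reality",
     "specialization": "sonography/ultrasound"},
]

# Extent and distribution: counts by year, intervention type and specialization.
by_year = Counter(s["year"] for s in studies)
by_type = Counter(s["simulation_type"] for s in studies)
by_specialization = Counter(s["specialization"] for s in studies)

print(by_type.most_common())  # [('virtual reality', 2), ('simulated patients', 1)]
```

Categories with zero or near-zero counts in such tallies are exactly the knowledge gaps the numerical overview is intended to surface.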