Using a Situated Simulation-Based Program for Improving Students’ Interaction and Observation Skills with Children in Occupational Therapy

Background Occupational therapy education programs need to produce students who can confidently and safely deliver services for children. This study incorporated a simulation of a real situation into the clinical skill observation component of a pediatric occupational therapy curriculum. The purpose of the study was to determine whether a situated simulation-based program could increase students' perceived knowledge and clinical skills to better prepare them for pediatric practice. Methods The authors introduced a situated simulation-based program, comprising video-based simulation training and a situated simulation-based exam, into a pediatric occupational therapy course for thirty-two students in their fourth year of occupational therapy study. The simulation program consisted of two video-based simulation training sessions that tasked students with observing, evaluating, and managing children's play. Debriefings were provided to connect students' observation and basic evaluation skills. A post-simulation performance evaluation, the situated simulation-based exam, was created by faculty. The exam, held in a situated simulation therapeutic room, assessed students' communication and interaction skills and basic evaluation skills. The scores of the video-based simulation training and the situated simulation-based exam were collected and examined. Results The video-based simulation scores explained 33.3% of the variance in performance on the situated simulation-based exam. The overall passing rates were as follows: situated simulation-based exam, 65.6%; communication and interaction station, 53.1%; basic evaluation station, 68.8%. Conclusions The video-based simulation training enhanced students' communication and interaction skills. More relationship-building skills were facilitated within the situated simulation-based exam with a real environment. The strategies that assist successful implementation of a situated simulation program to facilitate learning include course plans, clear scenario training goals, evaluation quality, and situated simulation contexts.
This study provides preliminary support for simulation-based programs as training for improving the clinical skills of interaction and observation before students' internships.

Background
Health professionals are increasingly turning to simulation-based education (SBE) to provide students with effective learning environments in authentic, context-specific clinical situations so that they can gain the professional knowledge, skills, and attitudes that contribute to competency and clinical reasoning 1,2 . Several elements have been highlighted in simulations to support the development of learning and facilitate situated cognition, including mannequins or standardized patients (SPs); situated simulations of authentic environments, such as wards and clinical settings; and the use of equipment, videos, or interactive computer packages to recreate the required reasoning and actions of specific practice situations 3-6 . These simulation elements can be arranged in a task-based learning program incorporating an environment with multiple changing scenarios according to the content and fidelity level of the teaching and learning activity 1 . A situated simulation-based program features an authentic environment that imitates real objects, patients, or clinical processes 7 , and the materials (objects, texts, technologies, settings) move during practice and learning to increase the fidelity of the experience 8 .
Increasing evidence has shown that practices related to SP simulation enhance student learning outcomes and clinical reasoning abilities by requiring real-time decisions 2 .
However, the literature on simulations in occupational therapy, and the evidence supporting their use and effectiveness in the classroom, is limited to a few published articles 9-14 .
Collectively, these articles report experiences of administering simulations in occupational therapy for patients with physical disabilities and psychiatric conditions. However, the literature provides limited direction on the use of, or best practices for, simulations in pediatric occupational therapy education, possibly due to problems with the reliability and consistency of using children as SPs to play patient roles 15 . Another reason could be that using children as SPs is more expensive and time-consuming than using adults 16 .
Although using children as SPs in simulations entails certain difficulties, the fast pace and high complexity of managing patients in pediatric occupational therapy often leave students in their fieldwork practice feeling challenged and unprepared to practice in such an environment. Because of the professional demands of occupational therapy, students at the educational stage need numerous opportunities to acquire skills in observation, communication, and interaction, and even in playing with children, during evaluation or therapeutic activities. Therefore, a repeatable and harmless form of practice is needed 17 .
However, such communication practice has not been provided by simulators so far.
Therefore, alternative methods of simulation-based education could provide another strategy to consider. For example, the inclusion of video in simulation-based training may be an attractive method for reducing the costs of simulation-based medical education without compromising learning 6,18,19 .
Based on the need of academic programs to train and assess students' interaction and evaluation skills to prepare them for clinical placement, a situated simulation-based (SSB) program with video-based simulation (VBS) training and an SSB exam were created 20 .

Research Design
The design of the study employed a quantitative method, which allowed interpretation of participants' clinical skill development in the SSB program. In the first part, the reliability and validity of the simulation program were examined. Then a linear regression model using VBS training scores and SSB exam scores was created to interpret the performance of the students in the SSB program. The basic characteristics of the SSB exam were presented to illustrate the performance of the participants. This study was approved by the Research Ethics Review Board of the institution (project no. C103107). Participants were informed at the initial introduction to the program that participation was voluntary, and oral consent was obtained from participating students and educators.

Program buildup
The SSB program was a 16-week program; its structure is summarized in Table 1.
The scenarios in the training videos were developed by two medical instructors to meet the training goals and skill requirements. Two training videos, on child play and social interaction, were created. Each video was approximately 10 minutes long.
A clinical observation form was designed for practice.

Instruments
Situated simulation exam
A situated simulation therapeutic room (approximately 7 × 10 meters) was prepared to resemble a pediatric occupational therapy clinical treatment room. The floor was covered with thick mats to prevent falls. Therapeutic equipment, such as a ball pool, a suspension system, wedges, rollers, and balls, was prepared. In the middle of the room was a small desk for the child SP, who was a 5-year-old kindergarten child with typical development.
Before the exam, the SP and his or her parents provided informed consent. Then an examiner introduced the exam and the clinical task to be carried out.
The exam included two connected stations, each staffed with a standardized patient, for evaluating the students' communication, observation, and interaction skills (CI, station 1) and basic evaluation skills (BE, station 2). A checklist with twelve items was developed.
Each item was scored on a three-point rating scale (0: totally not completed; 1: partially completed; 2: completed) with descriptive anchors to provide a scoring rubric for the scale. A five-point overall score (0: unqualified; 1: borderline; 2: pass; 3: good; 4: excellent) was also applied at each station to evaluate overall performance. The students were asked to imagine that they were newly qualified occupational therapists who had been asked to evaluate a child. Each student had 10 minutes to read the exam scenario outside the station and was then given 10 minutes at each station to interact with the standardized patient. Overall, each student spent 30 minutes completing the exam.
To establish the item-content validity, an expert group of 4 medical teachers was asked to review the content of the stations and the checklists with a form using a Likert-type scale.
The factors reviewed were intended to clarify whether the rubric included the most essential steps linked to the exam content and covered the essential skills for an occupational therapy student, whether the exam content was in line with the simulation-based curriculum, whether the settings and contexts of the stations were authentic, and whether the instructions to the examinees were clear and unambiguous. Based on the expert feedback, revisions were made to remove ambiguity and present a consistent scoring system to distinguish different levels of the students' clinical skills. Finally, the checklist and scoring rubric of the simulation exam could be considered a valid measure of student performance 21 .
To establish reliability and validity, the item-total correlation (Pearson's correlation between the score obtained by each student at each station and the same student's total examination score) was calculated. A station with a correlation coefficient of >0.7 was considered to have good reliability. The internal consistency of the examination was assessed using Cronbach's α across the different stations, calculated from the total scores on the examination; a value >0.7 was considered to indicate good reliability 22 . In addition, two raters worked together to establish inter-rater reliability. They discussed and checked the checklist in a pretest to reach evaluation consistency. In the SSB exam, the rater at each station graded the students' clinical skills and performance according to a given set of checklists and a scoring rubric. After the exam, another rater watched the video recording of the exam and completed a second rating to establish inter-rater reliability.
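The reliability indices described here can be computed directly from a score matrix. The following Python sketch is illustrative only (synthetic scores and the standard textbook formulas, not the study's actual data or SPSS output): it computes Cronbach's α and the Pearson correlation of each station score with the total examination score.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students, n_stations) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)          # variance of each station
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def station_total_correlation(scores: np.ndarray) -> np.ndarray:
    """Pearson correlation of each station score with the total exam score."""
    scores = np.asarray(scores, dtype=float)
    total = scores.sum(axis=1)
    return np.array([
        np.corrcoef(scores[:, j], total)[0, 1]
        for j in range(scores.shape[1])
    ])
```

A value of α > 0.7 and station-total correlations > 0.7 would, under the thresholds cited above, indicate acceptable internal consistency.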
Finally, the item-content validity index scores (i-CVIs) of the SSB exam were 0.98 and 0.85 at the item-level; the scale content validity index (S-CVI) of the overall checklist was 0.93 at the scale-level, which was acceptable ( Table 2). The raters at the stations were instructed to be passive evaluators and not to guide or prompt the students.
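The i-CVI and S-CVI values reported above are conventionally computed as proportions of expert agreement. The sketch below is a hedged illustration: the exact rating form is not described in the text, so a 4-point relevance scale with ratings of 3 or 4 counted as "relevant" (the common convention) is an assumption.

```python
import numpy as np

def content_validity(ratings: np.ndarray, relevant_min: int = 3):
    """Content validity indices from an (n_experts, n_items) rating matrix.

    Assumes a 4-point relevance scale where ratings >= relevant_min count
    as 'relevant'. Returns (i_cvi, s_cvi_ave): the per-item proportion of
    experts rating the item relevant, and the mean of those proportions.
    """
    relevant = np.asarray(ratings) >= relevant_min
    i_cvi = relevant.mean(axis=0)       # item-level CVI
    return i_cvi, float(i_cvi.mean())   # scale-level CVI (averaging method)
```

With four experts, as in this study, an i-CVI of 1.00 means all four judged the item relevant, and 0.75 means three of four did.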

Data collection and Statistical analysis
The scores of the VBS training and the SSB exam were collected. The marks were tabulated for data analysis in de-identified format and analyzed in SPSS in two parts. First, standard linear regressions were calculated to examine the correlation between the VBS and SSB scores and to determine the amount of variance in SSB exam scores that could be explained by the VBS score. Second, descriptive statistics were applied to examine the passing rate and performance in the SSB exam to explain the students' performance in the program.

The situated simulation exam
Thirty-two students attended the situated simulation program and completed the SSB exam. The characteristics of each station in the situated simulation exam are shown in Table 1. The average scores of the stations were CI (12.5 ± 1.3) and BE (8.2 ± 1.2). When the borderline group regression method was applied as the standard-setting method, 17 participants passed and 15 participants failed the exam. The passing rates of stations 1 and 2 were 53.1% and 68.8%, respectively, and the overall passing rate was 65.6%.
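The two analyses, the variance in SSB scores explained by VBS scores and the borderline regression standard-setting method, can be sketched as follows. This is an illustrative Python version with synthetic numbers (the study used SPSS); the borderline-regression cut score is the checklist score predicted at a global rating of 1 ("borderline"), which is the usual form of the method.

```python
import numpy as np

def r_squared(x, y):
    """Proportion of variance in y explained by simple linear regression on x."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.corrcoef(x, y)[0, 1] ** 2

def borderline_regression_cut(checklist, global_rating, borderline=1):
    """Borderline regression standard setting.

    Regress checklist scores on examiners' global ratings; the cut score
    is the checklist score predicted at the 'borderline' global rating.
    """
    slope, intercept = np.polyfit(global_rating, checklist, 1)
    return intercept + slope * borderline
```

Students whose checklist score meets or exceeds the cut score pass the station; an R² of 0.333 would correspond to VBS scores explaining 33.3% of the variance in SSB exam performance.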

Discussion
The goal of an SSB program is to create an authentic clinical situation to improve practice safety and develop clinical skills. The main interests of this study were to determine the correlation of the VBS scores to the SSB exam scores, and students' performance on the SSB exam. The results of this study may provide useful information for future studies on simulation-based education in occupational therapy.
The SSB program differentiated declarative knowledge from procedural skills
The present study indicated that the SSB program demonstrated a significant effect of the lectures in the rehabilitation domain. The lectures transmitted clinical knowledge from the instructors to the students, who then memorized and utilized the information to achieve high scores. However, this is a process of rote learning. In contrast, the performance outcome of clinical competence, which is emphasized by the medical profession, cannot be evaluated by a written test. Ideally, an outcome measure of simulation education should be assessed on real patients or standardized patients to account for the transfer of learning 23 . As competence is a result of experience 1 , the SSB program provided students with a chance to practice in a situated context and to develop competency. This is a process of mastery learning.
Interaction with a real social environment is essential for developing non-technical skills
In this study, the simulation course improved the students' relevant skills for evaluating children. However, the results showed that the current VBS training was insufficiently effective in training students' communication and interaction skills. It has been suggested that standardized patients could be used, for example in role-plays, to provide high-fidelity simulations with physical and social environments close to those of actual clinical situations, thereby raising the simulation level and providing an authentic interaction experience. It is noteworthy that a post-simulation debriefing 24 , a high-quality debriefing and reflection process following the training, is the most important element of the learning process for improving communication and interaction skills. A learning environment with high physical fidelity that provides students with practice, debriefing, and application can develop students' clinical skills, and cognitive fidelity, such as wearing uniforms and carrying identification, can also raise students' awareness of professional behavior and ultimately enhance the authenticity of the experience 1 .

Child SPs would benefit professional learning in occupational therapy
It is interesting to note that children's voices are gaining more attention and respect in clinical practice. In medical education, however, children's voices are little heard, so content on interacting with children should be included in the curriculum of pediatric occupational therapy. We suggest that simulators or videos be incorporated into training programs for students to acquire technical skills such as observation and evaluation, and that child SPs in the objective structured clinical examination would be beneficial to the learning of skills such as communication and relationship building 16 .

Conclusions
Occupational therapy educators using simulation-based education strive to develop students' clinical competence by strengthening their skills. In this study, it was found that not only evaluation skills but also communication skills may enhance students' clinical competency when interacting with clients. Therefore, at the educational stage, learning objectives should not include only knowledge acquisition and clinical practice; they should also provide situated simulation learning experiences to strengthen relationship skills.
For example, training in communicative and observational skills can provide cognitive fidelity for learning, and physical-fidelity materials such as videos can provide safe, repeatable practice to approximate clinical competence.
When child SPs are used in programs or exams, it should be noted that the presentation of the SPs could influence the students' performance. Therefore, case scenarios need to be well designed and carefully rechecked to ensure that the scenario content corresponds to the checklist and that the child SPs can sustain the required presentation, so that stable validity and reliability can be achieved.
Several limitations of this study should be noted before attempts to generalize the results to other occupational therapy programs. The current study design was a single-group experimental design, so no control group was available for comparison. The SSB exam was embedded in a specific program curriculum. All students enrolled in the program participated in the SSB program as part of a course requirement; therefore, it was not possible to compare their performances with those of students who participated in a different but complementary educational activity.

Consent for publication
Not applicable.

Availability of data and materials
The data sets analyzed in this study can be obtained from the first and second authors on reasonable request. All aggregated data are reported in the manuscript.

Competing interests
The authors declare that they have no conflicts of interest.

Authors' contributions
CHH took the lead in creating the study methodology, the conception and design of the study, and assessing all the data in the study, and also took responsibility for the integrity of the data and the accuracy of the data analysis, organizing the data, and drafting and revising the paper. THH supervised the simulation session processing, data collection, and preparation of the article. CYL guided the structure of the manuscript writing and revised the paper. All authors contributed to the revision of the paper and approved the final manuscript for publication.