Smartphone Use in the Management of Neurological Emergencies: A Simulation-based Study

Abstract


Introduction
Nearly all healthcare professionals use smartphones, and a growing number of smartphone applications are geared toward the medical field. Despite this, relatively little is known about how health professionals use smartphone applications in their daily practice, how these applications affect the delivery of patient care, how reliable the medical information within them is, and ultimately how they affect patient outcomes. Previous studies have assessed the use of smartphone applications in Accreditation Council for Graduate Medical Education (ACGME) training programs and found that up to 85% of participants (trainees through attending physicians) used a smartphone, with the most commonly used applications being drug guidelines followed by medical calculators and coding and billing apps. 1 A study of smartphone applications in neurology found a variety of applications that could be grouped into the following categories: references, academic, communication, classroom, localization/examination, documentation/administrative, monitoring/analytics, and advising/teaching. 2 Recently, a study assessing trends in medical application use in neurology found increasing use of smartphone applications as medical instruments, such as using a smartphone to identify seizure. 3,4 Other studies have shown that neurology trainees use their smartphones more frequently than attending physicians and often use them for patient-care related activities. 5 This suggests that providers and practitioners with less experience reach for their smartphones more often as a resource to aid clinical decision-making. It is vital that medical applications used for reference be evidence-based. However, to our knowledge there have been no studies validating the efficacy of smartphone applications in medical practice despite their nearly universal use.
We sought to identify and define smartphone use during acute neurological emergencies, using simulated cases to standardize the clinical events. We hypothesized that novice participants would use smartphones more often, and that smartphone use would be associated with better clinical performance because participants would have a clinical reference guide at hand.

Setting and Study Design
This is a retrospective review of a prospective, single-center simulation study performed between February 2018 and October 2023. It was conducted by three neurointensivists (MBP, WWC, and NAM). Each simulation was recorded for video review by the raters. Participants ranged from neurology sub-interns to attending physicians. They voluntarily participated in the study as individuals and did not receive any information about the cases prior to participating. Before the simulation cases, all participants completed a questionnaire covering demographics, level of training, primary work environment, prior experience, and self-rated proficiency regarding neurological emergencies. Each simulation lasted 20-30 minutes. In a pre-briefing, participants were instructed to approach the cases just as they would in real life and to use any resources that they would normally use to assist them, including a smartphone. After each scenario was completed, participants took part in a debriefing session with the attending neurointensivist who ran the simulation. The debriefing session followed the Debriefing with Good Judgment approach to help participants reflect on their behavior during the simulation, leading to behavioral change. 6 Simulation scenarios took place while trainees were on their neurocritical care rotation, during which they were excused from clinical duties. They did not receive financial compensation for their participation. Their performance in the simulation scenarios had no bearing on clinical performance grading or evaluations.

Clinical Simulation Case Scenarios
Clinical simulation cases and critical action checklists to assess performance during the simulations were developed using a modified Delphi method as previously described. 7 Cases included acute ischemic stroke complicated by hemorrhagic transformation, status epilepticus in the setting of viral encephalitis, traumatic brain injury followed by herniation syndrome, subarachnoid hemorrhage with early neuro-worsening, cardiac arrest followed by status epilepticus, and spinal cord compression. Critical action item checklists were based on the Emergency Neurological Life Support (ENLS) protocols and cross-referenced with relevant guidelines from the American Heart Association, the American Association of Neurological Surgeons / Congress of Neurological Surgeons, the Brain Trauma Foundation, the American Epilepsy Society, the Infectious Diseases Society of America, and the Neurocritical Care Society as previously described. 7

Simulator and Simulation Environment
We used the SimMan 3G manikin (Laerdal; Wappingers Falls, NY), which produces both neurological and physiological signs including pupillary constriction, respiratory patterns, and tonic/clonic movements representing seizure. The manikin speaks through an internal speaker, and an embedded nurse at the bedside can offer other findings of the neurological examination including eye movements and motor, sensory, and cerebellar functioning. A monitor displays pertinent data related to the case including laboratory data, electrocardiogram (ECG), and neurodiagnostics including various intracranial imaging studies and electroencephalogram (EEG). There is also a bedside monitor which displays vital sign data including telemetry, oxygen saturation, arterial blood pressure, and temperature.
The simulation room represented either an emergency department (ED) room or a room on a medical/surgical floor. All equipment and medications needed to perform the scenarios were available to participants, including rapid sequence intubation medications, anti-seizure medications, hyperosmolar therapy, and intravenous fluids, though not readily in view. Equipment for airway management included a bag valve mask, nasal cannula, non-rebreather mask, endotracheal tubes, bougie, laryngoscope, and a ventilator. An embedded participant was available to manage the airway for those participants not trained in airway management. Participants also had access to a lumbar puncture kit and continuous video EEG.
In a pre-briefing, participants were oriented to the simulator and environment by the simulation operators (MBP, WWC, NAM) using a pre-briefing script. Participants were asked to verbalize any orders, components of the physical or neurological examinations, and any diagnostics. We asked participants to perform all expected skills within their skillset and level of training.
Consultations could be placed from a telephone in the simulation room with the simulation operator acting as the consultant.The embedded nurse, who had constant communication with the simulation operator through an earpiece, was in the room with the participant through the entirety of the simulation scenario.
All smartphone use during the simulations was recorded, either in real time during the simulation or via video recording, by the raters (MBP and AAA).

Outcomes
The primary outcome was the percentage of participants using their smartphone during the simulations. Secondary outcomes included whether participants arrived at the correct answer or critical action when using their smartphone, based on the actions taken after smartphone use, and the purpose of smartphone use, divided into 4 categories: medication dosing, management guidelines, hospital/society protocols, and standardized examination scales including the National Institutes of Health Stroke Scale (NIHSS) and Glasgow Coma Scale (GCS). Additionally, we assessed whether participant level was correlated with smartphone use during the simulation. To do this, we stratified participants into 3 levels of training: novice, intermediate, and advanced. Novice participants included neurology sub-interns and neurosurgery interns, who have minimal experience with neurological emergencies at our institution. Intermediate participants included neurology residents and all critical care fellows besides neurocritical care fellows, as they have rotated through the neurointensive care unit at our institution and have experience ranging from initial work-up through consultation for a multitude of neuro-emergencies at our institution. Advanced participants included neurocritical care fellows, stroke fellows, and attending physicians in neurocritical care and vascular neurology, all of whom were board-certified in neurology, with all attendings having subspecialty training in their respective fields. We also assessed performance during the simulations using a checklist of critical action items as previously described. 7
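The three-level stratification described above can be sketched as a simple lookup. This is a minimal illustration only; the role strings below are hypothetical labels, not fields captured in the study.

```python
# Illustrative mapping of participant roles to the study's three training
# levels. Role strings are hypothetical labels for illustration only.
LEVEL_BY_ROLE = {
    "neurology sub-intern": "novice",
    "neurosurgery intern": "novice",
    "neurology resident": "intermediate",
    "critical care fellow": "intermediate",  # neurocritical care fellows are advanced
    "neurocritical care fellow": "advanced",
    "stroke fellow": "advanced",
    "attending physician": "advanced",
}

def training_level(role: str) -> str:
    """Return the training level for a participant role (case-insensitive)."""
    return LEVEL_BY_ROLE[role.strip().lower()]
```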

Statistical Analysis
We reported descriptive statistics as mean (standard deviation [SD]) for continuous variables and counts and frequencies for categorical variables. We performed a Chi-square test to assess differences in smartphone use based on level of training. We then used a t-test to determine whether there was a difference in performance between participants who used smartphones and those who did not.
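The two tests named above can be sketched in standard-library Python: a Pearson Chi-square statistic for a smartphone-use-by-training-level contingency table, and a Welch t statistic for performance scores. The counts and scores below are hypothetical; the study's actual data are not reproduced here.

```python
import math

def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

def welch_t_stat(a, b):
    """Welch two-sample t statistic for a difference in mean performance."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical counts: rows = used / did not use a smartphone,
# columns = novice, intermediate, advanced participants.
table = [[6, 14, 3],
         [10, 4, 12]]
chi2 = chi_square_stat(table)
# Compare against the chi-square critical value for df = 2, alpha = 0.05 (5.991).
print(chi2 > 5.991)
```

In practice the p-values would be obtained from a statistics package rather than hand-coded critical values.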

Standard Protocol Approvals, Registration, and Patient Consents
This study was approved by the University of Maryland, Baltimore, Institutional Review Board (IRB), which waived the need for informed consent.

Discussion
Novice participants did not use their smartphones as much as intermediate trainees. Perhaps they lacked the baseline knowledge to recognize what they needed to use their smartphones for during the clinical scenario. Expert participants used their smartphones the least. Thus, trainees who are most likely to see patients first, before discussing or seeing patients with more senior providers, were the ones most likely to use their smartphones and the most in need of reputable, evidence-based clinical management tools. Consequently, these trainees have the potential to benefit most. However, despite greater smartphone use in this group, our study showed that use of a smartphone during the simulation encounter did not confer any benefit in simulated clinical performance. We hypothesize two potential explanations for this. First, the resources accessed may have provided incorrect information from unverified sources. Though the specific smartphone resources that participants used in the simulation were not captured as part of data collection, we did query participants during the debriefing sessions. Participants reported a variety of resources, ranging from a simple query of search engines such as Google to more targeted resources such as Micromedex or guidelines and management protocols on trusted sites (Emcrit.org was often cited, particularly by critical care fellows). In our debriefing sessions, we also found that resources varied by participant level, with more senior trainees accessing evidence-based national guidelines and peer-reviewed literature more than junior trainees, who often utilized the first hit from a search engine without vetting.
Second, even when they received factually correct information from the smartphone query, intermediate-level providers struggled to match the correct information to the appropriate setting. For example, when searching for the appropriate dose of tenecteplase for acute ischemic stroke, many trainees gave the dose for acute myocardial infarction, as that is the first dosing that comes up when using a search engine. This could lead to either overdosing or underdosing and resultant complications. This is a common problem in hospitalized patients, where medication errors occur at a rate of up to 90%. 9 Smartphone use may, in this context, yield a false sense of confidence to non-expert learners.
Regardless of whether it was due to finding incorrect information or failing to correctly integrate the information received, our study demonstrated that smartphone queries resulted in an incorrect medication dose or treatment plan 27% of the time. Further studies should focus on the development of smartphone applications that are evidence-based, regularly updated, and easily accessible to users.
Additionally, given the reported use of smartphones by trainees in the literature 1 and in our study, further initiatives should be made to incorporate smartphone application training into medical education.
Our study has several limitations. It is a single-center study that may not represent trainees and attending physicians at large. We relied on simulated cases as opposed to real-life neurological emergencies; however, doing so allowed us to standardize the clinical scenarios among participants. 12,13 We did not gather a comprehensive list of all the resources used by participants, nor did we have a way of digitally tracking smartphone use; however, we did explore smartphone use during debriefings. Future studies should delve into the reliability of resources available to the medical community.

Conclusion
Smartphone use was frequent during a variety of simulated neurological emergencies, especially among intermediate participants. Our findings suggest a need for the development of an evidence-based smartphone application for neurological emergencies, and future research should assess clinical performance with and without the use of such an application.

Declarations
This manuscript complies with all instructions for authors. This manuscript has not been published elsewhere and is not under consideration by another journal.
There was adherence to ethical guidelines during the completion of this research, and this study was approved by the University of Maryland, Baltimore, Institutional Review Board (IRB), which waived the need for informed consent. The STROBE checklist was followed.
Shows the total number of smartphone uses by identified purpose of use, where this could be ascertained.
Percent of participants using smartphones by level of training.
Performance of intermediate participants, comparing those who used smartphones to those who did not.

Funding:
This was funded by the Faculty Innovation in Education Award from the American Board of Psychiatry and Neurology (Dr. Morris).

Disclosures: The authors have no relevant disclosures to report.

CRediT Author Statement and Contributions:
MBP: Conceptualization, Methodology, Investigation, Data Collection, Formal Analysis, Writing - Original Draft, Writing - Review and Editing
AAA: Data Collection, Writing - Original Draft, Writing - Review and Editing
WWC: Data Collection, Writing - Original Draft, Writing - Review and Editing
BN: Data Collection, Writing - Original Draft, Writing - Review and Editing
CA: Writing - Original Draft, Writing - Review and Editing
AA: Writing - Original Draft, Writing - Review and Editing
ST: Writing - Original Draft, Writing - Review and Editing