Customized Mobile App for Residents Rotating Through Pediatric Critical Care Unit

DOI: https://doi.org/10.21203/rs.3.rs-1206348/v1

Abstract

Introduction: Physicians use mobile apps for patient care, but few are dedicated to pediatric critical care medicine (PCCM). We developed an easy-to-use customized mobile app for PCCM residents. Our objectives were to evaluate whether this mobile app will improve residents' confidence in PCCM knowledge and comfort level in the pediatric intensive care unit (PICU).

Methods: We recruited 90 residents from March 2020 to April 2021 for this prospective survey-based, block-randomized, single-center study with a pre-post study design. Participants completed 20-question quizzes at the beginning and end of the rotation. T-test was used to compare the pre-post quiz score difference between the two groups of residents (those with and without the app). At the end of the rotation, subjects also completed a survey with 5-point Likert scale items to compare their comfort level in PICU and confidence in PCCM knowledge pre- and post-rotation.

Results: Improvement from the pre-block to the post-block quiz was significantly greater in the mobile app group than in the control group (an increase of 1.67 questions vs 0.23, p=0.045). There was a trend toward greater improvement in confidence in pediatric critical care knowledge for the App group; however, the difference did not reach statistical significance by Pearson's chi-square test (p=0.246). Similarly, there was no statistical difference between the two study groups in the change in comfort level in the PICU from baseline.

Conclusion: Implementation of a service-specific mobile app may enhance residents' clinical experience and improve self-efficacy. Further investigation is warranted.

Introduction

In this era of advanced technology in which smartphones are ubiquitous, increasing numbers of medical providers are utilizing mobile applications (apps) for patient care. Although the number of medical app platforms available for download is expanding rapidly, few are dedicated to pediatric critical care. All pediatric and most emergency medicine residents are required to rotate through the pediatric intensive care unit (PICU) during their training. Most of these residents have no pediatric critical care experience and find the rotation challenging, with demanding hours and high-acuity patients. In 2017, Wolfe and Unti reported that pediatric residents suffer from burnout during their PICU rotation more than during other pediatric rotations (65% vs 45%)(1). They also demonstrated that pediatric residents screened positive for depression more often at the end of the rotation than at the beginning (42% vs 4%).

In 2011, Franko and Tirrell surveyed providers at medical centers recognized by the Accreditation Council for Graduate Medical Education (ACGME) regarding smartphone app use(2). Half of the participants reported using mobile apps in their clinical practice. In addition, medical trainees responded that they desire apps tailored to medical training that encourage self-directed learning. Since 2011, more studies have been published reaffirming this keen interest in smartphone apps for medical education among medical trainees(3-5). There is also an overall positive perception that mobile medical apps improve clinical knowledge(6-8). However, few studies quantify the impact of mobile apps on resident education; furthermore, these studies are limited by inadequate control groups or the lack of one.

We developed a customized mobile app targeting medical residents rotating through the PICU at Children's Hospital of Michigan (CHM) in Detroit, Michigan. The app was created using an online app builder (https://thunkable.com/). This service-specific app includes orientation materials, divisional policies and guidelines, landmark articles on relevant PICU topics, on-call and lecture schedules, a phone directory linked to direct dialing, medication dosing quick references, pediatric vital signs reference charts, and medical calculators (Figure 1). We hypothesized that the use of this novel app would improve residents' medical knowledge, confidence, and comfort level as physicians in the PICU.

Methods

Study Design

This is an observational, prospective, survey-based, block-randomized, single-center, pre-post exploratory study conducted between March 2020 and April 2021. We recruited ninety pediatric and emergency medicine residents at the beginning of their PICU block. Each block was randomly assigned to either the control or the App group. Block randomization was used to avoid cross-contamination, and the randomization was done using an online randomizer website. The control group consisted of trainees without access to the mobile app, and the App group consisted of trainees with access to the smartphone app. The customized mobile app "CHM PICU" was developed using an online app builder called Thunkable (https://thunkable.com/). Residents in the App group were given a link to download the app and were free to use it at any time during their rotation. Neither the proportion of residents who downloaded the app nor the use of the app was monitored. Medical residents rotating through the PICU as an elective rotation were excluded from the study.
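The block-level assignment described above can be sketched as follows. This is an illustrative Python sketch only (block labels, block count, and seed are hypothetical), not the online randomizer actually used in the study.

```python
import random

def randomize_blocks(block_ids, seed=None):
    """Assign each rotation block (not individual residents) to an arm,
    so all residents in the same block share the same assignment."""
    rng = random.Random(seed)
    return {block: rng.choice(["control", "app"]) for block in block_ids}

# Hypothetical rotation blocks spanning the study period (Mar 2020 - Apr 2021)
blocks = [f"block_{i:02d}" for i in range(1, 15)]
assignments = randomize_blocks(blocks, seed=42)
```

Randomizing at the block level, rather than per resident, is what prevents cross-contamination: residents working the same rotation cannot end up in different arms and share the app.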

The Wayne State University Institutional Review Board approved this study under an expedited review process. All methods were carried out in accordance with relevant guidelines and regulations. Informed consent was obtained from all subjects.

Knowledge Assessments

Residents completed a 20-item pre-rotation quiz focusing on important pediatric critical care concepts when they enrolled in the study. Participation in the study was entirely voluntary. At the end of the rotation, the participants completed an anonymous survey and a post-rotation quiz. The two quizzes were similar in content and had 20 questions each. Before subject enrollment, we conducted focus groups with PICU residents to establish face validity of the knowledge assessment quizzes. The quizzes were labeled for each participant, which allowed us to calculate the change from baseline in the quiz score for each resident.

Surveys

We assessed face validity for the two surveys—one for each study group—by performing focus groups with PICU residents, fellows, and attendings prior to resident enrollment in the study. Participants completed the survey anonymously at the end of their PICU rotation. Both surveys included demographic questions on age, gender, discipline of residency program (pediatric or emergency medicine), level of training (2nd or 3rd postgraduate year), and whether it was the first time the resident worked in the PICU. The surveys also included questions to assess residents' comfort level in the PICU and confidence in their pediatric critical care knowledge pre- and post-rotation. The questions used a 5-point Likert response format on individuals' level of agreement with each statement. The survey for the App group had 14 additional questions relating to the CHM PICU app, including how often the residents used the app during the rotation, whether residents found the app helpful in completing everyday tasks, whether the app made transitioning into the rotation easier, and whether the app helped with overall communication among team members. Survey answers were compared between the two study groups to gauge whether the app made a difference in improving residents' comfort level in the PICU and their confidence in pediatric critical care knowledge (Supplements 1&2).

Statistical Analyses

To assess the study's primary objective of determining improvement in residents' pediatric critical care knowledge, we calculated each resident’s paired pre-post quiz score difference and compared the results between the two study groups. Secondary objectives were to test the effectiveness of the CHM PICU app in improving residents' self-efficacy and comfort level in PICU. 

We used the non-parametric Fisher's exact test to compare demographic variables between the two study groups. Knowledge assessment scores, treated as continuous variables ranging from 0 to 20, were compared using an independent-samples t-test. Categorical and ordinal data were compared with Pearson's chi-square test. Adjusted multiple linear and ordinal regression models were used for secondary outcomes and sensitivity analyses to account for confounding factors. All reported p-values were two-tailed, with statistical significance at p ≤ 0.05.
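As a rough sketch of how these comparisons map onto standard statistical routines (using made-up scores, not the study data), in Python with SciPy:

```python
import numpy as np
from scipy import stats

# Hypothetical paired pre-post quiz score differences (post minus pre);
# the numbers below are illustrative, not the study data
control_diff = np.array([1, 0, -2, 1, 0, 2, -1, 0, 1, 0])
app_diff = np.array([2, 1, 3, 0, 2, 1, 4, 2, -1, 3])

# Check homogeneity of variance (Levene's test), then run the primary
# comparison: an independent-samples t-test on the score differences
lev_stat, lev_p = stats.levene(control_diff, app_diff)
t_stat, t_p = stats.ttest_ind(control_diff, app_diff, equal_var=lev_p > 0.05)

# Demographic 2x2 comparison (e.g., gender by study group) with
# Fisher's exact test
odds_ratio, fisher_p = stats.fisher_exact([[25, 24], [25, 15]])
```

The actual analysis software used by the authors is not stated; this sketch only illustrates the named tests.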

Results

Demographics

Ninety residents enrolled in the study. The control group had 50 (55.6%) participants, and the App group had 40 (44.4%) participants. There was an imbalance in the number of participants who completed both knowledge assessments: we captured most of the App group residents (n=37, 92.5%), but only 68% (n=34) of the residents in the control group completed both quizzes (p = 0.005). Demographic variables, including gender, residency, level of training, and whether it was the resident's first time in the PICU, were reported using counts and percentages (Table 1). A non-parametric Fisher's exact test showed no statistically significant differences between the two study groups on demographic variables.

 
Table 1

Demographic Variables and Easy to Adapt Variable for the Two Study Groups

| Variable                 | Control (N=50) | Treatment (N=40) | P-value | Total |
|--------------------------|----------------|------------------|---------|-------|
| Gender: Female           | 25 (51.0%)     | 24 (49.0%)       | 0.293   | 49    |
| Gender: Male             | 25 (62.5%)     | 15 (37.5%)       |         | 40    |
| Residency: Peds Resident | 27 (51.9%)     | 25 (48.1%)       | 0.520   | 52    |
| Residency: EM Resident   | 23 (60.5%)     | 15 (39.5%)       |         | 38    |
| Level of Training: PGY2  | 24 (48.0%)     | 26 (52.0%)       | 0.162   | 50    |
| Level of Training: PGY3  | 26 (65.0%)     | 14 (35.0%)       |         | 40    |
| First Time in PICU: Yes  | 34 (52.3%)     | 31 (47.7%)       | 0.353   | 65    |
| First Time in PICU: No   | 16 (64.0%)     | 9 (36.0%)        |         | 25    |
| Easy to Adapt: SA/A      | 21 (45.7%)     | 25 (54.3%)       | 0.003*  | 46    |
| Easy to Adapt: Neutral   | 14 (50.0%)     | 14 (50.0%)       |         | 28    |
| Easy to Adapt: SD/D      | 15 (93.8%)     | 1 (6.3%)         |         | 16    |

SA/A: strongly agree or agree; SD/D: strongly disagree or disagree.

Demographic variables and the Easy to Adapt variable from survey data. Fisher's exact test was performed for 2 x 2 tables; Pearson's chi-square test was calculated for tables extending beyond 2 x 2. There were no statistically significant differences between study groups on demographic variables.

Knowledge Assessment

Change from baseline in quiz score (score range, 0 [lowest possible score] to 20 [highest possible score]) was examined for both study groups. Pre- and post-quiz scores, and the difference in each individual's pre-post test scores, were compared between groups using parametric independent-samples t-tests (Table 2). Homogeneity of variance was checked and verified using Levene's test. Baseline knowledge did not differ significantly between the control and App groups (quiz scores of 11.24 vs 11.14; p = 0.87). However, post-test scores differed significantly between the two study groups (11.47 vs 12.81; p = 0.05). Similar results were found when comparing the mean pre-post score difference between the control and App groups (p = 0.045). The App group's average number of questions answered correctly increased significantly by 1.67 from baseline (95% CI [0.91, 2.76]). In contrast, the control group's average improvement of 0.23 questions after the rotation was not statistically significant (95% CI [-0.66, 1.45]).

 
Table 2

Mean Quiz Score between Study Groups

|                                      | Control Group (N=34) | App Group (N=37) | P-value |
|--------------------------------------|----------------------|------------------|---------|
| Pre-Test (Baseline): Mean (SD)       | 11.24 (2.63)         | 11.14 (2.54)     | 0.871   |
| Pre-Test (Baseline): 95% CI          | 10.33 to 12.21       | 10.20 to 11.91   |         |
| Pre-Test (Baseline): Median          | 11.00                | 11.00            |         |
| Pre-Test (Baseline): Minimum         | 5.00                 | 5.00             |         |
| Pre-Test (Baseline): Maximum         | 7.00                 | 15.00            |         |
| Pre-Test (Baseline): IQR             | 4.50                 | 4.00             |         |
| Post-Test (Follow-up): Mean (SD)     | 11.47 (2.89)         | 12.81 (2.78)     | 0.05*   |
| Post-Test (Follow-up): 95% CI        | 10.70 to 12.62       | 11.95 to 13.83   |         |
| Post-Test (Follow-up): Median        | 12.00                | 14.00            |         |
| Post-Test (Follow-up): Minimum       | 6.00                 | 7.00             |         |
| Post-Test (Follow-up): Maximum       | 17.00                | 17.00            |         |
| Post-Test (Follow-up): IQR           | 3.50                 | 4.00             |         |
| Pre-Post Score Difference: Mean (SD) | 0.23 (3.07)          | 1.67 (2.87)      | 0.045*  |
| Pre-Post Score Difference: 95% CI    | -0.66 to 1.45        | 0.91 to 2.76     |         |
| Pre-Post Score Difference: Median    | 0.00                 | 2.00             |         |
| Pre-Post Score Difference: Minimum   | -5.00                | -6.00            |         |
| Pre-Post Score Difference: Maximum   | 8.00                 | 8.00             |         |
| Pre-Post Score Difference: IQR       | 4.00                 | 2.75             |         |

We performed sub-group comparisons on the mean difference in quiz scores for the App group. Those who had trained in the PICU before showed more improvement in quiz scores (3.83 ± 2.48 questions) than those training in the PICU for the first time (1.25 ± 2.78 questions). The mean difference of 2.58 ± 1.22 questions was significant (p = 0.04; 95% CI [0.09, 5.06]). Multiple linear regression was used to predict the change in quiz score, entering gender, residency, level of training, and first time in the PICU as potential predictor variables. The single best predictor of mean score difference was whether the current rotation was the resident's first in the PICU: on average, those with prior PICU training had a mean increase in quiz score of 2.41 questions over those without (p = 0.006). Study group assignment was the second-best predictor: on average, the App group had a mean increase in quiz score of 1.53 questions (p = 0.024).
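A minimal sketch of the regression step above, using ordinary least squares on hypothetical stand-in data (predictor coding, sample size, and simulated coefficients below are illustrative assumptions, not the study dataset):

```python
import numpy as np

# Hypothetical stand-in data: binary predictors coded 0/1
rng = np.random.default_rng(0)
n = 71  # residents who completed both quizzes
X = np.column_stack([
    np.ones(n),              # intercept
    rng.integers(0, 2, n),   # prior PICU experience (1 = yes)
    rng.integers(0, 2, n),   # study group (1 = App)
    rng.integers(0, 2, n),   # gender
    rng.integers(0, 2, n),   # residency (1 = EM)
    rng.integers(0, 2, n),   # level of training (1 = PGY3)
])
# Simulated outcome: pre-post change in quiz score
y = X @ np.array([0.2, 2.4, 1.5, 0.0, 0.0, 0.0]) + rng.normal(0, 2.9, n)

# Ordinary least-squares fit; a full analysis would also report p-values
# for each coefficient (e.g., via a statistics package's OLS summary)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The fitted coefficients in `beta` correspond to the per-predictor changes in quiz score reported in the text (e.g., the prior-PICU-experience and study-group terms).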

Survey Data

Pearson's chi-square test was performed to examine proportional differences in individual responses between the two groups on whether respondents agreed or disagreed regarding confidence, comfort, and ease of adapting to the PICU as a resident. Changes in confidence and comfort level on a 5-point Likert scale were reported descriptively, with percentages graphed at each point for the study groups (Table 3 & Figure 2). When asked "How confident did you feel in your pediatric critical care knowledge?" at the beginning of the rotation versus at the end, three subjects (6%) in the control group (n=50) had lower confidence at the end of the rotation, whereas no one in the App group had a negative change in confidence level. Although there was a trend toward greater improvement in confidence in pediatric critical care knowledge for the App group, the difference did not reach statistical significance by Pearson's chi-square test (p=0.246). Similarly, there was no statistical difference between the two study groups in the change in comfort level in the PICU from baseline.
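For illustration, the chi-square comparison of the two groups' change-in-confidence distributions can be reproduced from the Table 3 counts; the SciPy call below is a sketch, not necessarily the software the authors used:

```python
from scipy.stats import chi2_contingency

# Counts of residents at each change-in-confidence level (-2 .. +3),
# taken from Table 3: control group vs App group
confidence_change = [
    [1, 2, 6, 25, 16, 0],  # control (n=50)
    [0, 0, 5, 15, 18, 2],  # App (n=40)
]
chi2, p, dof, expected = chi2_contingency(confidence_change)
```

With two groups and six response levels, the test has (2-1) x (6-1) = 5 degrees of freedom; several expected cell counts here are small, which is a known limitation of the chi-square approximation for sparse Likert tables.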

 
Table 3

Change in Confidence Level in PCCM Knowledge and Change in Comfort Level after the Rotation

Change in Confidence Level in PCCM Knowledge after the Rotation

| Change in Likert Scale | Control Grp (N=50) | Treatment Grp (N=40) | p-value | Total (N=90) |
|------------------------|--------------------|----------------------|---------|--------------|
| -2                     | 1 (100.0%)         | 0 (0.0%)             | 0.246   | 1            |
| -1                     | 2 (100.0%)         | 0 (0.0%)             |         | 2            |
| 0                      | 6 (54.5%)          | 5 (45.5%)            |         | 11           |
| +1                     | 25 (62.5%)         | 15 (37.5%)           |         | 40           |
| +2                     | 16 (47.1%)         | 18 (52.9%)           |         | 34           |
| +3                     | 0 (0.0%)           | 2 (100.0%)           |         | 2            |

Change in Comfort Level after the Rotation

| Change in Likert Scale | Control Grp (N=50) | Treatment Grp (N=40) | p-value | Total (N=90) |
|------------------------|--------------------|----------------------|---------|--------------|
| 0                      | 11 (61.1%)         | 7 (38.9%)            | 0.776   | 18           |
| +1                     | 19 (52.8%)         | 17 (47.2%)           |         | 36           |
| +2                     | 16 (59.3%)         | 11 (40.7%)           |         | 27           |
| +3                     | 4 (50.0%)          | 4 (50.0%)            |         | 8            |
| +4                     | 0 (0.0%)           | 1 (100.0%)           |         | 1            |

For the survey item "It was easy to adapt to the PICU service as an incoming resident" on a 5-point Likert scale, the App group found it much easier to adapt to the PICU than the control group (p=0.003). Only 1/40 (2.5%) residents in the App group found it challenging to adapt to the PICU service, compared to 15/50 (30%) residents in the control group.

Adjusted ordinal regression models were fitted for the changes from baseline in confidence in PCCM knowledge and in comfort level after the rotation, on a 5-point Likert scale. Level of training was highly significant for change in comfort level after the rotation: compared to the PGY3 group, the log odds for change in comfort score increased by 1.93 for the PGY2 group. However, no significant difference in the pre-post change in comfort score was noted between study groups.

The survey for the App group also included questions relating to the CHM PICU app. Overall, residents used the app frequently throughout the rotation and gave it favorable feedback, with an 82.5% satisfaction rate. Among the App group residents, 90% thought the app was easy to navigate, 87% thought the app improved the onboarding process for the PICU rotation, 85.5% acknowledged that the app was helpful, and 63% thought the app improved intradepartmental communication. Seventy percent of the residents said they learned PCCM topics from the app. When asked to estimate how much time per day was saved by using the CHM PICU app, 31 users responded: 4 (12.9%) thought only a few minutes, 6 (19.4%) approximated 5-15 minutes, 16 (51.6%) responded around 30 minutes, and 5 (16.1%) answered that an hour was saved per day. Users commented that the most useful feature of the app was the protocol/guideline section (25, 62.5%), followed by the phone directory (11, 27.5%).

Participants were asked "How confident did you feel in your pediatric critical care knowledge?" and "How comfortable were you with completing everyday tasks and patient care in the PICU?" at the beginning versus at the end of the rotation. Changes in confidence and comfort level on a 5-point Likert scale (from 1, "very confident/comfortable," to 5, "not confident at all" or "very uncomfortable") were reported descriptively, with percentages graphed at each point for the study groups.

Discussion

Smartphones are now ubiquitous, and increasing numbers of physicians are using mobile medical apps to support medical education and professional development, especially among the current generation of learners(9-15). Implementing a smartphone app for a specific clinical rotation allows residents to access medical information at critical times. Although information is widely available on the internet, it is easy to get bogged down by information overload, and filtering out inaccurate or outdated information is time-consuming. Despite the tremendous resources hospitals devote to creating protocols and guidelines, these are inconsistently followed, as many clinicians are unaware of the materials or have difficulty accessing them(16, 17).

A prospective pre-post study found that a smartphone app containing the hospital's antibiogram and treatment guidelines was associated with greater improvement in medical trainees' knowledge of antibiotic prescribing(6). However, this study was limited by its lack of randomization: trainees enrolled earlier in the study were allocated to the control group, and those enrolled later to the App group. Medical trainees accumulate experience and knowledge throughout the academic year, so recruiting the control group at a different time than the App group could confound the knowledge assessment scores. Chreiman et al. created a mobile platform to assist with the onboarding process for medical trainees starting a trauma rotation(18). Around half of the participants felt the app helped in completing everyday clinical tasks (53%) and thought the app made the transition to the trauma service easier (50%). The study also tracked usage frequency and found that the app was accessed an average of seven times per day. However, the study lacked a control group, and only 30 trainees completed the anonymous survey at the end of the rotation. Among mobile app studies from different subspecialties, many examined patterns of usage but not the app's effectiveness for medical education. Of the studies that do examine the impact of a medical app on learners, most lack a control group and are limited by small sample sizes(5, 19, 20). Additionally, no similar study existed for pediatric critical care.

One of our initial concerns was that trainees might become too reliant on the app, since information is easily accessible, and thus fail to recall important medical concepts during knowledge assessment tests. However, our study showed the opposite: the App group improved more on the post-rotation quiz than the group without access to the app. This could be explained by the frequent exposure to relevant material facilitated by the CHM PICU app, reinforcing recall of essential concepts. Although the increase in self-perceived PCCM knowledge was not significantly different between the two study groups, there was a trend toward greater improvement for residents in the App group. The app effectively improved pediatric critical care knowledge among residents; however, the residents may not have felt its impact.

Indirectly, we hoped to use the CHM PICU app to improve residents' PICU experience and reduce resident burnout and depression. However, no significant difference was found in the increase in comfort level in the PICU between the two study groups; the app was likely not effective in preventing resident burnout and depression. Nevertheless, the app was favorably received by its users. Most agreed that the app facilitated transitioning to a new service and helped them perform daily clinical duties in the PICU. Compared to the control group, the App group found it much easier to adapt to the PICU service as incoming residents (p=0.003; Table 1). Around half of the users estimated that the app saved them 30 minutes per day.

A strength of this study is its prospective design with block randomization, which was chosen to prevent residents within the same rotation from sharing the app. This study is also the first to investigate whether a customized mobile app effectively improves resident medical knowledge and experience in a pediatric intensive care unit.

Several limitations exist for this project. Since the post-block quiz was conducted at the end of the rotation, it is unknown whether the App group's improvement in knowledge is maintained long-term. The second limitation is the imbalance in the percentage of residents who completed both quizzes in each study group: only 34/50 (68%) in the control group completed both quizzes, in contrast to 37/40 (92.5%) of residents in the App group. The reason for this discrepancy is unclear, but selection bias is possible as a result. An adjusted multiple linear regression model and subgroup analyses of test score improvement were performed to account for this confounding variable. Another limitation is the validity testing of the quizzes and questionnaires, both of which were created specifically for this study. We performed only face validation, through informal focus group sessions with our PICU residents, fellows, and attendings before launching the study; other forms of validation were not performed.

Conclusion

Although there was no significant difference in the improvement in confidence level between the two groups at the end of the PICU block, the App group outperformed the control group on the knowledge assessment test. This could be explained by the frequent exposure to relevant material facilitated by the app, reinforcing recall of essential concepts. Residents also found it easier to adapt to the PICU service as incoming residents when using this customized app. Implementing a service-specific mobile app is a novel and innovative way to deliver medical education. Considering residents' busy schedules and fast-paced work environment, introducing an intuitive platform may improve their self-efficacy and experience while on clinical rotations. Further investigation is warranted.

Abbreviations

PICU- pediatric intensive care unit

PCCM- pediatric critical care medicine

CHM- Children’s Hospital of Michigan

PGY- postgraduate year

App- Application

Declarations

1. Ethics approval and consent to participate

The Wayne State University Institutional Review Board approved this study under an expedited review process. All methods were carried out in accordance with relevant guidelines and regulations. Informed consent was obtained from all subjects.

2. Consent for publication

The subjects involved are aware of the planned publication, and informed consent was obtained from participants. 

3. Availability of data and materials

Data generated or analyzed during this study are included in this published article and its supplementary information files.

4. Competing interests

The authors have no conflicts of interest to disclose.

5. Funding:  

The authors were awarded funding by the Sarnaik Endowment Grant.

6. Authors' contributions:

  1. Yu-shan Tseng, MD, contributed to the conception and design of the study, contributed to the acquisition and interpretation of the data, drafted the manuscript and prepared the figures, and critically revised the manuscript.
  2. Ronald Thomas, PhD, contributed to the analysis of the data and critically revised the manuscript.
  3. Ajit Sarnaik, MD, contributed to the conception and design of the study and critically revised the manuscript.
  4. All authors reviewed the manuscript and gave final approval, and agreed to be accountable for all aspects of work ensuring integrity and accuracy.

7. Acknowledgements

Not applicable

References

  1. Wolfe KK, Unti SM. Critical care rotation impact on pediatric resident mental health and burnout. BMC Med Educ. 2017 Oct 5;17:181.
  2. Franko OI, Tirrell TF. Smartphone app use among medical providers in ACGME training programs. J Med Syst. 2012 Oct;36(5):3135–9.
  3. Johansson PE, Petersson GI, Nilsson GC. Personal digital assistant with a barcode reader--a medical decision support system for nurses in home care. Int J Med Inf. 2010 Apr;79(4):232–42.
  4. Christensen S. Evaluation of a nurse-designed mobile health education application to enhance knowledge of Pap testing. Creat Nurs. 2014;20(2):137–43.
  5. Man C, Nguyen C, Lin S. Effectiveness of a smartphone app for guiding antidepressant drug selection. Fam Med. 2014 Sep;46(8):626–30.
  6. Fralick M, Haj R, Hirpara D, Wong K, Muller M, Matukas L, et al. Can a smartphone app improve medical trainees’ knowledge of antibiotics? Int J Med Educ. 2017 Nov 30;8:416–20.
  7. Monroe KS, Evans MA, Mukkamala SG, Williamson JL, Jabaley CS, Mariano ER, et al. Moving anesthesiology educational resources to the point of care: experience with a pediatric anesthesia mobile app. Korean J Anesthesiol. 2018 Jun;71(3):192–200.
  8. Curran V, Matthews L, Fleet L, Simmons K, Gustafson DL, Wetsch L. A Review of Digital, Social, and Mobile Technologies in Health Professional Education. J Contin Educ Health Prof. 2017;37(3):195–206.
  9. Collins F. How to fulfill the true promise of “mHealth”: Mobile devices have the potential to become powerful medical tools. Sci Am. 2012 Jul;307(1):16.
  10. Ellaway RH, Fink P, Graves L, Campbell A. Left to their own devices: medical learners’ use of mobile technologies. Med Teach. 2014 Feb;36(2):130–8.
  11. Wallace S, Clark M, White J. “It’s on my iPhone”: attitudes to the use of mobile computing devices in medical education, a mixed-methods study. BMJ Open. 2012;2(4):e001099.
  12. Baumgart DC. Smartphones in clinical practice, medical education, and research. Arch Intern Med. 2011 Jul 25;171(14):1294–6.
  13. Boulos MNK, Wheeler S, Tavares C, Jones R. How smartphones are changing the face of mobile and participatory healthcare: an overview, with example from eCAALYX. Biomed Eng Online. 2011 Apr 5;10:24.
  14. Carter-Templeton HD, Wu L. Using Mobile Technologies to Access Evidence-Based Resources: A Rural Health Clinic Experience. Nurs Clin North Am. 2015 Sep;50(3):595–603.
  15. BinDhim NF, Trevena L. There's an App for That: A Guide for Healthcare Practitioners and Researchers on Smartphone Technology. Online J Public Health Inform. 2015;7(2):e218.
  16. Evans CT, Rogers TJ, Burns SP, Lopansri B, Weaver FM. Knowledge and use of antimicrobial stewardship resources by spinal cord injury providers. PM R. 2011 Jul;3(7):619–23.
  17. Mermel LA, Jefferson J, Devolve J. Knowledge and use of cumulative antimicrobial susceptibility data at a university teaching hospital. Clin Infect Dis Off Publ Infect Dis Soc Am. 2008 Jun 1;46(11):1789.
  18. Chreiman KM, Prakash PS, Martin ND, Kim PK, Mehta S, McGinnis K, et al. Staying connected: Service-specific orientation can be successfully achieved using a mobile application for onboarding care providers. Trauma Surg Acute Care Open. 2017;2(1):e000085.
  19. Barnes J, Duffy A, Hamnett N, McPhail J, Seaton C, Shokrollahi K, et al. The Mersey Burns App: evolving a model of validation. Emerg Med J EMJ. 2015 Aug;32(8):637–41.
  20. Dimond R, Bullock A, Lovatt J, Stacey M. Mobile learning devices in the workplace: "as much a part of the junior doctors' kit as a stethoscope"? BMC Med Educ. 2016 Aug 17;16(1):207.