All procedures performed in this study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. The study was approved by the Institutional Review Board at the University of California, Davis (No. 1865834).
The iBehavior mobile application. iBehavior is a smartphone-based (iPhone or Android) eEMA app designed for use by caregivers to assess maladaptive and prosocial behaviors of people with IDDs, with a focus on its future application as an outcome measure for clinical trials. iBehavior content and function development was informed by a panel of stakeholders (parents, teachers, clinicians, researchers, and an FDA representative; see Acknowledgments) invested in the assessment and treatment of children with IDDs, and by a Delphi study focused on Down syndrome, fragile X syndrome, and autism. The Delphi panel agreed on six domains to include in iBehavior (see Table 1).
Table 1
Means and Standard Deviations of iBehavior Domains
iBehavior Domain | Mean | SD |
Aggression and Irritability Frequency | 0.42 | 0.54 |
Aggression and Irritability Intensity | 0.43 | 0.27 |
Restricted and Repetitive Behaviors Frequency | 0.63 | 0.50 |
Restricted and Repetitive Behaviors Intensity | 0.58 | 0.42 |
Avoidant and Fearful Frequency | 0.16 | 0.16 |
Avoidant and Fearful Intensity | 0.17 | 0.17 |
Social Initiation Frequency | 1.97 | 0.77 |
Social Initiation Intensity | 2.07 | 0.88 |
Hyperactivity | Domain not included in the present study |
Inattention | Domain not included in the present study |
The iBehavior app was built using NativeScript, a framework for creating native mobile applications that targets multiple platforms (i.e., Android and iOS). Data collected through the iBehavior app are encrypted and securely transmitted in real time to REDCap (Research Electronic Data Capture; Harris et al., 2009, 2019), which for this study is hosted and managed by the UC Davis Clinical and Translational Science Center (CTSC). Database linkage is established via a study code that links each user's device to a REDCap database through REDCap's application programming interface (API).
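As a rough illustration of the linkage described above, the sketch below packages a day's ratings for REDCap's record-import API (`content=record`). The study code, token value, and rating field names are illustrative assumptions rather than the actual iBehavior schema; only the top-level API parameters follow REDCap's documented request format.

```python
import json

def build_redcap_payload(api_token: str, study_code: str, ratings: dict) -> dict:
    # One record per observation day, keyed by the study code (no PHI).
    record = {"record_id": study_code, **ratings}
    return {
        "token": api_token,            # project-specific API token
        "content": "record",           # REDCap record-import action
        "format": "json",
        "type": "flat",                # one row per record
        "data": json.dumps([record]),  # REDCap expects a JSON array of records
    }

payload = build_redcap_payload(
    "EXAMPLE_TOKEN",
    "STUDY-042",
    {"aggression_freq": 2, "aggression_intensity": 1},
)
# The payload would then be POSTed over HTTPS to the project's /api/ endpoint.
```

Because the record is keyed only by the study code, no identifying information leaves the device, consistent with the design described above.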
No protected health information (PHI) is recorded within the app or transmitted from the user's device to REDCap. iBehavior content (domain items, language, scaling) can be individually added and/or modified in real time and pushed to the app for users. This feature enables the study team to field test content and respond to feedback in a timely manner. iBehavior also includes automatic, configurable notifications, sent via SMS text messaging, that remind users when to begin their observations and when to complete ratings.
Currently, each iBehavior domain includes 6–8 items representing discrete types of behavior. For each behavior, the user is prompted to answer a yes/no question about whether the behavior occurred during the observation period. If "yes," the app prompts the user to record the frequency ("rarely," "sometimes," "often," or "very often") and intensity (i.e., the degree of interference with daily functioning; "minimal," "mild," "moderate," or "severe") of the behavior. A key strength is that anchoring text, specific to each behavioral item, accompanies every item and provides concrete guidance for a reliable rating (e.g., number of instances of the behavior; duration; degree of interference or distress). See Fig. 1.
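The branching rating flow above can be sketched as follows: a yes/no gate, then ordinal frequency and intensity follow-ups. The label sets come from the text; the 0–3 numeric coding is an illustrative assumption, not the app's actual internal representation.

```python
from typing import Optional

FREQUENCY = ["rarely", "sometimes", "often", "very often"]
INTENSITY = ["minimal", "mild", "moderate", "severe"]  # degree of interference

def rate_behavior(occurred: bool,
                  frequency: Optional[str] = None,
                  intensity: Optional[str] = None) -> dict:
    """Return one behavioral item's rating; follow-ups apply only if it occurred."""
    if not occurred:
        return {"occurred": False}
    return {
        "occurred": True,
        "frequency": FREQUENCY.index(frequency),  # 0 = rarely ... 3 = very often
        "intensity": INTENSITY.index(intensity),  # 0 = minimal ... 3 = severe
    }

rating = rate_behavior(True, frequency="often", intensity="mild")
# -> {"occurred": True, "frequency": 2, "intensity": 1}
```

Under this coding, per-domain means such as those in Table 1 would be averages of these ordinal scores across items and days.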
In addition to specific behavior domains, iBehavior also includes situational questions, which are always presented before ratings are made. The situational questions include the total time (in hours) the user observed their child, the primary location of observation, whether the child was physically sick, and two questions regarding the child’s quality of sleep the night prior.
Participants. Participants included 10 parents of children with IDD. Informed consent was obtained from all individual participants included in the study. All users identified as biological mothers, and nine were married. All parents graduated from high school, and seven earned a bachelor's degree or higher. The children (2 females, 8 males) were between the ages of 8 and 17 years and were diagnosed with Down syndrome (DS; n = 3) or fragile X syndrome (FXS; n = 7). All children had intellectual disability, with nine previously documented by our study team using the Stanford-Binet, 5th Edition (SB5), with full-scale IQ deviation scores ranging from 31 to 68 (M = 57.1, SD = 14.1) and significant delays in adaptive behavior.
Procedure. To begin, caregivers received one-on-one iBehavior training with a trained clinician via videoconference. Training lasted approximately one hour and consisted of three components. First, parents were guided to install the app and set up their child as a participant. Next, parents were trained on the structure of the app, how eEMAs are conducted, and how to approach frequency and intensity ratings of behaviors. Parent users also specified two times at which they preferred to receive text message reminders to begin and end daily observations (generally shortly after waking up and late in the evening after all contact with their child). Finally, parents participated in a calibration interview in which behaviors across each domain were explored. Parents provided examples of their child's behaviors that they thought aligned with behaviors presented in the app and were encouraged to ask questions and elaborate on their responses. If parents identified behaviors that did not align with those in the domain, the trainer redirected their interpretation of their child's behavior and discussed specific areas of misalignment and scoring. See Fig. 2.
After training, parents completed 14 days of iBehavior observational ratings in the following domains: (1) Irritable Mood and Aggression, (2) Avoidant, Fearful and Nervous Behavior, (3) Stereotyped and Repetitive Behaviors and Interests, and (4) Social Initiation and Responsiveness. The Inattentive Behavior and Hyperactivity and Impulsivity domains were not rated, as newly funded projects have incorporated these two domains into a larger executive function-related behavior domain, with studies forthcoming.
During participants' observation period, a study team member inspected the database at pre-selected time points (days 1, 5, 7, 10, and 14) and monitored data logging. If a data point was missing, the day was assumed to have been skipped, and the participant was contacted via e-mail or phone to encourage them to resume observation. If ratings were skipped, study personnel asked participants to complete additional days so that 14 complete ratings were obtained from each participant. At the conclusion of the observation period, participants were asked to complete a set of validation measures and a user feedback questionnaire.
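The missing-day check underlying this monitoring step can be sketched as below: given the dates on which a participant logged ratings, identify which days of the 14-day window are missing so the participant can be contacted. Function and variable names are illustrative, not drawn from the study's actual tooling.

```python
from datetime import date, timedelta

def missing_days(start: date, logged: set, n_days: int = 14) -> list:
    """Return the dates in an n_days observation window with no logged rating."""
    window = [start + timedelta(days=i) for i in range(n_days)]
    return [day for day in window if day not in logged]

start = date(2023, 5, 1)  # hypothetical observation start date
# Participant logged every day except day index 4 (May 5).
logged = {start + timedelta(days=i) for i in range(14) if i != 4}
missing = missing_days(start, logged)
# -> [datetime.date(2023, 5, 5)]
```

In practice such a check would run against the REDCap database at the pre-selected monitoring days, with any non-empty result triggering follow-up contact.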
Measures. The Aberrant Behavior Checklist-Community (ABC-C) [6] is a global behavior checklist used to measure maladaptive behaviors among individuals with IDDs. The ABC-C consists of 58 items that target five behavioral dimensions (irritability, hyperactivity, lethargy/withdrawal, stereotypy, and inappropriate speech). Participants were instructed to complete the ABC-C based on behavior observed during the 2-week iBehavior observation period.
The Behavior Rating Inventory of Executive Function 2 (BRIEF-2) [27] is a 63-item proxy report of a child's executive function and includes inhibition, shifting, emotional control, working memory, and planning and organization. Participants were instructed to complete the BRIEF-2 based on behavior observed during the 2-week iBehavior observation period.
The Conners 3 Short Form [10] is an assessment of ADHD for children and adolescents ages 6 to 18 years. This 45-item assessment consists of the following subscales: inattention, hyperactivity/impulsivity, learning problems, executive functioning, aggression, and peer relations. Participants were instructed to complete the Conners 3 based on behavior observed during the 2-week iBehavior observation period.
User Feedback. All participants completed a 15-question user feedback survey that included questions about ease of use, technological problems, and preference for iBehavior over traditional questionnaires, as well as other aspects of usability, relevance, and satisfaction.