The modified Delphi process(20) was conducted over a nine-month period beginning in May 2020 and ending in January 2021 (Figure 2).
Participants
Study team members employed purposive sampling (21) to identify an expert panel whose members represented early childhood home visiting researchers, allied experts from home visiting models, and leaders in THV development and implementation. Participants came from diverse geographic areas of the US and represented several sectors of the home visiting field, including model developers, implementers, and researchers. At the end of March 2020, the study team emailed an invitation to participate to 22 home visiting experts. Three potential panel members were unable to participate due to other commitments, and three model developer representatives never responded to the invitation. All invited leaders in THV agreed to participate in the study. Ultimately, the expert panel comprised 16 members, each participating in some capacity during all three rounds of the modified Delphi process. Overall, 63% of the panel members were researchers (n=10), 44% were model representatives (n=7), and 44% were tribal stakeholders (n=7); these roles were not mutually exclusive.
Procedures
Our approach was guided by the Delphi method, developed by the RAND Corporation in the mid-20th century as a way for researchers to reliably build consensus among a group of experts (22,23). Because this study took place during the COVID-19 pandemic, we modified the Delphi approach: all three rounds of questionnaires and discussion were carried out via Qualtrics surveys (24) and Zoom video conferences (25), coupled with email communication as necessary. Since the expert panel was asked only about home visiting in general and no personal information was solicited, this process was exempt from Institutional Review Board oversight (26).
At the start of the expert elicitation process, participants were divided into two panels: one for researchers and model developers, and one for leaders in THV. Accordingly, two project launch meetings were conducted via Zoom video conference in May 2020 to give expert panel members an overview of the project before Round 1 of the modified Delphi process began.
Round 1
Both expert panels received the same survey, in which free listing was used to elicit SPEs in EBHV. Free listing was selected as a feasible approach that preserved open-ended inquiry. Participants were given a definition of SPEs (“the techniques and strategies used in early childhood home visiting as part of a larger intervention”) and were asked to list all elements they considered important in early childhood home visiting. There was room to enter up to 20 individual SPEs, plus an open-text field for additional entries. Participants were then asked to identify which free-listed SPEs were critical to home visiting programs that serve tribal communities. Lastly, participants were asked to list any additional SPEs specific to THV that had not already been mentioned, with space for up to five individual tribal SPEs plus an open-text field for additional entries. Data collection lasted 3 weeks, concluding when the target sample size (n=16) was reached. To view the full first survey, refer to Additional File 1.
Analysis of Round 1
Responses to the first survey were exported and combined into one Excel file with two tabs labeled “general elements” and “tribal elements.” Each tab had four columns: element (i.e., standard practice element label), frequency (i.e., count of respondents who free-listed the same or a similar element), respondent initials (so we could follow up if needed), and other similar descriptors (so we could see how each element was coded). Study team members read through each respondent’s free-listed SPEs and decided whether each was distinct or could be combined with an element already in the Excel file. For example, the element “linkage to services” was listed by 12 respondents using different descriptors such as “connecting families to resources,” “making a referral to appropriate community service,” and “resource connection,” among others. If it was unclear whether listed elements referred to the same concept, the study team kept them separate.
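The tallying step described above can be expressed as a short Python sketch. This is illustrative only: the study team performed this step manually in Excel, and the synonym map, respondent initials, and free-listed descriptors below are hypothetical stand-ins, not study data.

```python
# Sketch of the Round 1 tally: collapse free-listed descriptors into
# canonical SPE labels, counting each respondent once per element.
from collections import defaultdict

# Hypothetical mapping from raw descriptors to a canonical element label
synonyms = {
    "connecting families to resources": "linkage to services",
    "making a referral to appropriate community service": "linkage to services",
    "resource connection": "linkage to services",
}

# Hypothetical respondent -> free-listed descriptors
responses = {
    "AB": ["connecting families to resources", "goal setting"],
    "CD": ["resource connection"],
    "EF": ["goal setting"],
}

tally = defaultdict(lambda: {"frequency": 0, "respondents": [], "descriptors": set()})
for initials, items in responses.items():
    seen = set()
    for raw in items:
        label = synonyms.get(raw, raw)  # unmapped descriptors stay distinct
        tally[label]["descriptors"].add(raw)
        if label not in seen:  # count each respondent once per element
            tally[label]["frequency"] += 1
            tally[label]["respondents"].append(initials)
            seen.add(label)

print(tally["linkage to services"]["frequency"])  # 2
```

Keeping unmapped descriptors as their own labels mirrors the study team's rule of leaving ambiguous elements separate.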
Two additional video conferences were held with each of the expert panels in June 2020 to further clarify and refine the list generated through the initial survey. Study team members guided these group discussions to identify commonalities, clarify meaning, and solicit additional thoughts. During these meetings, each panel independently concluded that a broader level of categorization of SPEs was needed to capture the multi-level hierarchy of home visiting (e.g., which elements are delivered during the visit, which pertain to home visitors’ training and background). As a result, panel members were asked, via email feedback, to build consensus on broad category names and definitions to be used in future Qualtrics surveys.
Round 2
For Round 2, the expert panels were combined, and they remained combined through Round 3. Thus, all email communication, subsequent surveys, and video conferences involved the entire group of experts. In September 2020, expert panel members were asked to complete a second Qualtrics survey. In this survey, all previously elicited SPEs were presented to participants, who were asked to group them into the broad categories defined in Round 1. Because each SPE was presented as a multi-select question, it could be grouped into multiple broad categories. At the end of the survey, participants had the option to add SPEs not already included in the master list. Data collection lasted 3 weeks, and responses were collected from n=14 expert panel members. The full survey is available in Additional File 2.
The second phase of Round 2 was a second video conference held with all expert panel members at the beginning of October 2020. The study team presented the results of the second survey and asked follow-up questions to further collapse or clarify SPEs. This included re-wording some elements and adding elements to better capture specific practice activities; these changes were made through a consensus process across panel members. Because several elements still required clarification, expert panel members were asked to complete a third, brief Qualtrics survey to finalize the list of SPEs. Participants were also asked for further input (i.e., “Is this definitely a standard practice element in home visiting?”) on free-listed elements that had been provided by only one panel member in the first Qualtrics survey. Data collection lasted a little over 1 week, concluding when the target sample size (n=16) was reached. To view the full survey, refer to Additional File 3.
Data analysis for Round 2
Responses to the first survey in Round 2 were exported, and a study team member tallied the number of responses for each SPE under each broad category. The full study team then met to make decisions about final SPE-broad category matching. A decision rule was established: any SPE placed in a broad category by a majority of respondents (greater than 50%) was assigned to that category. For example, nine expert panel members placed the SPE “teaching goal setting skills to parents” in the broad category “home visiting content”; because nine is more than half of the 14 respondents to that survey, the SPE was coded in that category in the final results. Ties were resolved through study team discussion. Responses to the second survey in Round 2 were similarly exported and analyzed, with SPEs included in the Round 2 list only if greater than 50% of respondents endorsed their inclusion.
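The majority rule can be sketched as a small Python function. Only the 9-of-14 example is taken from the text; the alternative category and its vote count are hypothetical.

```python
# Sketch of the Round 2 decision rule: an SPE is assigned to a broad
# category when a strict majority (>50%) of respondents placed it there.

N_RESPONDENTS = 14  # respondents to the first Round 2 survey

# Hypothetical tallies: SPE -> {broad category: votes}
# (9 votes for "home visiting content" is the example from the text;
# the "home visitor training" count is invented for illustration.)
votes = {
    "teaching goal setting skills to parents": {
        "home visiting content": 9,
        "home visitor training": 3,
    },
}

def assigned_categories(spe, votes, n=N_RESPONDENTS):
    """Return the categories endorsed by more than half of respondents."""
    return [cat for cat, v in votes[spe].items() if v > n / 2]

print(assigned_categories("teaching goal setting skills to parents", votes))
# ['home visiting content']  (9 > 14/2, but 3 is not)
```

Because the question was multi-select, an SPE clearing the threshold in more than one category would simply appear in both lists; ties at exactly 50% fall below the strict-majority cutoff and, per the text, were resolved by discussion.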
Round 3
To align with other efforts in the PHV field (16,17), and to remain consistent with several home visiting program approaches aimed at changing parental behavior, at the beginning of this final round of the modified Delphi process, study team members engaged in an internal process to match expert-generated SPEs to the Behavior Change Technique Taxonomy (27). The Behavior Change Technique Taxonomy was developed by researchers in the United Kingdom through a Delphi process to provide structure for reporting on behavior change interventions. Behavior change techniques (BCTs) are strategies that help individuals change their behavior to achieve better health, and identifying and understanding the mechanisms of BCTs is a central focus of current PHV research. Our approach to defining SPEs was intentionally broader, focusing not just on individual change but on the techniques and strategies used broadly in early childhood home visiting programs to effect change. First, a full list of SPEs and BCTs was created, along with definitions and examples for each item. The Behavior Change Technique Taxonomy had published definitions and examples; the study team created definitions and examples for the panel-generated SPEs based on panel discussion notes and experience in the home visiting field.
The list of SPEs and BCTs was entered into a Qualtrics survey using the “Pick, Group, and Rank” question type to facilitate independent coding by two coders from the study team. Each coder independently completed the survey by dragging and dropping panel-generated SPEs into matching BCTs. The two coders and the Principal Investigator then discussed the results and reached consensus on any coding disagreements, after which a final list of all SPEs and BCTs was created.
For the final Round 3 of Delphi surveys, panel members completed a Qualtrics survey matching SPEs and BCTs to outcome domains in evidence-based home visiting (Additional File 4). Outcome domains selected for this project were adapted from the Home Visiting Evidence of Effectiveness review and the Pew Home Visiting Data for Performance Initiative (28,29), with two additional tribal outcome domains of interest to the study team. Domains included: 1) promotion of healthy physical child development (e.g., healthy eating, breastfeeding); 2) promotion of social-emotional learning; 3) improving cognitive development (e.g., language development); 4) linkages and coordination of referrals for other community resources and supports; 5) reductions in maternal distress (e.g., depression, anxiety, stress); 6) reductions in substance use; 7) promotion of positive parenting practices; and 8) reductions in child maltreatment. Domains specific to THV included: 9) reductions in tribally related health disparities (e.g., type 2 diabetes, mental health); and 10) promotion of connection to culture. Because the combined list of SPEs and BCTs was long, each expert panel member was randomly assigned to complete the matching survey for only four outcome domains. Ratings were based on methods used in a previous study by McLeod et al. (30): 0 = “Not necessary,” 1 = “Useful, but not essential,” and 2 = “Essential” to achieve the outcome. Expert panel members rated each SPE and BCT according to its importance in achieving the outcome domain. Data collection lasted 7 weeks, concluding when the target sample size (n=16) was reached. To view the full final survey, refer to Additional File 5.
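The random assignment of panel members to four of the ten outcome domains could be implemented as below. This is a hedged sketch under stated assumptions: the member labels and shortened domain names are placeholders, and simple unbalanced random sampling via Python's `random.sample` is an assumption, since the text does not describe the assignment mechanism.

```python
# Sketch of the Round 3 survey assignment: each of 16 panel members is
# randomly assigned 4 of the 10 outcome domains. Labels are illustrative.
import random

DOMAINS = [
    "physical child development", "social-emotional learning",
    "cognitive development", "linkages and referrals",
    "maternal distress", "substance use",
    "positive parenting", "child maltreatment",
    "tribal health disparities", "connection to culture",
]

rng = random.Random(0)  # fixed seed so this sketch is reproducible
panel = [f"member_{i}" for i in range(1, 17)]
assignments = {m: rng.sample(DOMAINS, 4) for m in panel}

print(len(assignments["member_1"]))  # 4
```

Note that with 16 members and 4 domains each, every domain is rated by about 6 to 7 members on average, so per-domain denominators vary; a balanced design would need an additional constraint.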
Data analysis for Round 3
Data were exported and combined in Excel for each outcome domain. For each element, the frequency of each rating (i.e., Not necessary; Useful, but not essential; Essential) and the average rating were tallied for each outcome domain. Average ratings served as an indicator of the strength of the relationship between an element and an outcome and were visualized as heatmaps.
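The tallying above can be sketched as follows. The element names and ratings are fabricated for illustration (only the 0-2 scale comes from the text), and the heatmap rendering (e.g., via matplotlib) is omitted.

```python
# Sketch of the Round 3 analysis: per-element rating frequencies and
# average ratings within one outcome domain. Ratings are illustrative.
from collections import Counter

SCALE = {0: "Not necessary", 1: "Useful, but not essential", 2: "Essential"}

# Hypothetical ratings for one outcome domain: element -> list of ratings
domain_ratings = {
    "linkage to services": [2, 2, 1, 2, 0],
    "goal setting": [1, 1, 2, 0, 1],
}

for element, ratings in domain_ratings.items():
    freq = Counter(SCALE[r] for r in ratings)  # frequency of each rating label
    avg = sum(ratings) / len(ratings)          # average rating for the heatmap cell
    print(element, dict(freq), round(avg, 2))
```

In the study's visualization, each element-by-domain average would fill one heatmap cell, with higher averages indicating a stronger perceived link to the outcome.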
Final taxonomy
Final taxonomies of SPEs and BCTs were developed for each outcome domain. This was done by retaining any SPE or BCT that was rated as “Essential” to that outcome domain by 50% or more of respondents.