Setting
Philadelphia, a city of over 1.5 million residents, is the poorest of the United States’ 10 largest cities (26% of residents live below the poverty level) (21, 22). The city’s population is 41% African-American, 35% Non-Hispanic White, 15% Hispanic, 8% Asian, and 2% other race (22, 23). Public behavioral health services (i.e., mental health and substance use treatment) in Philadelphia are financially supported by Medicaid and managed by Community Behavioral Health (CBH), a non-profit managed care organization (i.e., “carve-out”) established by the City that functions as a component of the Department of Behavioral Health and Intellectual disAbility Services (DBHIDS). In 2018, DBHIDS and CBH included 175 in-network provider organizations serving 118,011 unique members (24).
Since 2007, DBHIDS has supported EBP delivery in Philadelphia through a series of “EBP initiatives” that include training, expert consultation, and implementation supports (e.g., booster trainings, implementation meetings) for participating clinicians (25). These initiatives supported implementation of several cognitive behavioral therapy models, including cognitive therapy, prolonged exposure, trauma-focused cognitive behavioral therapy, dialectical behavior therapy, and parent-child interaction therapy, for a range of psychiatric disorders. In 2013, DBHIDS created a centralized infrastructure, the Evidence-based Practice and Innovation Center (EPIC), to oversee EBP implementation efforts. EPIC supports EBP implementation by coordinating EBP efforts across clinics within the CBH network, contracting with treatment experts to deliver EBP training, contracting with treatment providers to deliver EBPs, providing EBP consultation and implementation support, hosting events to publicize EBP delivery, maintaining web-based resources (e.g., webinars), designating EBP programs within provider agencies, and providing financial incentives (e.g., enhanced rates) for delivery of EBPs.
Participants
The target population for this study comprised clinicians, supervisors, and administrators employed within clinics that deliver publicly funded behavioral health services in the City of Philadelphia. The sample did not include members of EPIC (i.e., no treatment experts or consultants). Because DBHIDS does not maintain a roster of email addresses for directly contacting active clinicians, we used a two-pronged recruitment and sampling approach. We sent invitation emails to leaders of behavioral health organizations (n = 210), clinicians (n = 527), and other community stakeholders (e.g., directors of a clinician training organization; n = 6) in Philadelphia. We also emailed the invitation to four local electronic mailing lists known to reach large segments of the CBH network (e.g., the managed care organization’s listserv) and asked organization and network leaders to forward it. From these contacts, the survey link was opened 654 times and 343 respondents completed the BWS choice experiment.
Study Design and Procedure
The BWS choice experiment was designed to quantify stakeholders’ preferences for 14 implementation strategies developed through iterative elicitation, piloting, and pre-testing with members of each stakeholder group in the target population (17, 26). Strategies were elicited via a system-wide innovation tournament, described elsewhere (27), through which clinicians submitted ideas for strategies to support EBP implementation in Philadelphia. Following the tournament, the submitted ideas (N = 65) were analyzed and refined by a team of implementation scientists, behavioral scientists, and clinicians to develop a set of distinct, clearly operationalized implementation strategies with ecological validity for the target system. This process involved combining similar submissions, crafting a definition for each strategy, and ensuring that the final set adequately captured all submitted ideas. The result was a set of 14 implementation strategies (see Table 1), which were subsequently evaluated in pre-testing interviews with clinicians, supervisors, and administrators (n = 9) within the system to ensure that the strategies were clearly described and spanned the full range of approaches stakeholders viewed as relevant. The 14 strategies fell into six categories: (1) financial incentives, (2) clinical consultation, (3) clinical support tools, (4) clinician social support and networking, (5) clinician performance feedback/social comparison, and (6) client supports (27). Notably, the strategies developed through this process addressed 8 of the 9 categories of implementation strategies identified in the Expert Recommendations for Implementing Change (ERIC) project (18): use evaluative and iterative strategies, provide interactive assistance, develop stakeholder interrelationships, train and educate stakeholders, support clinicians, engage consumers, utilize financial strategies, and change infrastructure. Supplemental Table 1A in Additional File 2 shows how the strategies from the present study align with the discrete implementation strategies identified by the ERIC project.
Table 1. List of Implementation Strategies Included in the BWS Choice Experiment
| Category | Strategy Name | Definition |
| --- | --- | --- |
| Financial Incentives | EBP certification bonus | Receipt of a 1-time bonus for verified completion of a certification process over a 1-year period, in which clinicians: attend four 1-day booster training sessions; pass a multiple-choice knowledge test; and submit one tape of a session with a client where they use the EBP. |
| Financial Incentives | Compensation for use of EBP per session | Receipt of additional compensation (in addition to regular paycheck) upon verification of using the EBP in sessions with clients for whom it is appropriate (i.e., per session), up to a specified amount per year. |
| Financial Incentives | Compensated time for preparation | Ability to bill for any verified time clinicians spend preparing to use the EBP (e.g., reviewing the protocol, preparing materials for session, reviewing client homework), up to a specified amount per year. |
| Clinical Consultation | Expert-led EBP consultation | 1-hour, monthly, web- or phone-based consultation, with up to 5 other clinicians, for one year, led by an expert EBP trainer. |
| Clinical Consultation | Peer-led EBP consultation | 1-hour, monthly, web- or phone-based conference, with up to 5 other clinicians, for one year, led by a clinician with experience implementing the EBP in Philadelphia. |
| Clinical Consultation | Expert in your back pocket (on call) | Network of EBP trainers on call via phone or web chat for same-day, 15-minute consultations to problem-solve issues with implementing the EBP. |
| Clinical Support Tools | Web-based resource center/mobile app | Includes: (a) video examples of how to use specific techniques for the EBP, (b) “session checklists” with steps outlined for using the EBP techniques in session, and (c) downloadable worksheets and measures needed to use the EBP. |
| Clinical Support Tools | Electronic evidence-based screening instrument inventory | Evidence-based screening instruments included in an electronic medical record, completed electronically by clients in the waiting room (e.g., tablet); results are automatically scored and immediately available so clinicians can assess treatment needs and track client progress. |
| Clinician Social Support and Networking | EBP-focused online forum | Confidential site available only to registered clinicians who use the EBP, where clinicians can log in and post questions and answers about using the EBP, share tips, and identify resources for using the EBP. |
| Clinician Social Support and Networking | Community-based EBP mentoring program | One-on-one mentoring program, where clinicians are matched with a local peer clinician who works with the same population to support each other in implementing the EBP. |
| Performance Feedback/Social Comparison | EBP performance benchmark leaderboard | Posted where only agency staff can view it, recognizing clinicians in the agency who met a benchmark for EBP implementation each month (based on 3 randomly selected sessions). |
| Performance Feedback/Social Comparison | EBP performance benchmark email | Available only to the clinician and his/her supervisor, reporting whether s/he met a benchmark for EBP implementation each month (based on 3 randomly selected sessions). |
| Client Supports | Client mobile app/texting service | Provides clients with reminders to attend sessions, prompts to complete homework assignments, and clinician-tailored messages about practicing EBP skills. |
| Client Supports | Improved waiting room | Create a relaxing waiting room (e.g., physical appearance, sensory experience) that helps prepare the client to enter the session ready to work on EBP content. |
Because each of the 14 strategies represented a qualitatively distinct strategy, we used object case BWS (as opposed to profile case or multi-profile case BWS) (28). The BWS experimental design was generated using the Sawtooth Discover algorithm, which produces randomized choice sets with optimal frequency balance, orthogonality, positional balance, and connectivity for a given sample size (29-33). Within the design, each participant was shown 11 sets of four randomly selected and randomly ordered strategies and, within each set, was asked to choose which strategy was “Most useful” (i.e., best) for supporting clinicians’ implementation of psychosocial EBPs and which strategy was “Least useful” (i.e., worst). The Discover algorithm optimizes 1-way, 2-way, and positional balance within the randomization sequence such that (a) each strategy is presented an equal number of times, (b) each pair of strategies appears together in a set an equal number of times, and (c) each strategy is shown in each position an equal number of times. For this study, each strategy was included in at least three sets. Participants were instructed to imagine that their organization had decided to adopt a new psychosocial EBP that exhibited excellent outcomes for their specific client population, and that this treatment was new to the respondent (or to clinicians working in the respondent’s setting; see Additional File 1 for the BWS prompt and an example set of strategies). The prompt explained that initial training in the EBP would be provided and would include active learning approaches, and that respondents’ input was sought regarding the implementation strategies that would best support clinicians’ use of the new practice following training.
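To make these balance properties concrete, the following minimal Python sketch illustrates the general logic: it draws random candidate designs and retains the one with the best 1-way, 2-way, and positional frequency balance. This is an illustration only, not Sawtooth’s proprietary Discover algorithm; all function names and the candidate-search approach are assumptions.

```python
import random
from collections import Counter
from itertools import combinations

N_STRATEGIES, N_SETS, SET_SIZE = 14, 11, 4  # design parameters from this study

def draw_design(rng: random.Random) -> list[list[int]]:
    """One respondent's design: 11 sets of 4 distinct, randomly ordered strategies."""
    return [rng.sample(range(N_STRATEGIES), SET_SIZE) for _ in range(N_SETS)]

def imbalance(design: list[list[int]]) -> tuple[int, int, int]:
    """Frequency spreads (max - min) for 1-way, 2-way, and positional counts;
    (0, 0, 0) would be perfect balance."""
    one = Counter({i: 0 for i in range(N_STRATEGIES)})
    two = Counter({p: 0 for p in combinations(range(N_STRATEGIES), 2)})
    pos = Counter({(i, k): 0 for i in range(N_STRATEGIES) for k in range(SET_SIZE)})
    for s in design:
        one.update(s)                              # how often each strategy appears
        two.update(combinations(sorted(s), 2))     # how often each pair co-occurs
        pos.update((i, k) for k, i in enumerate(s))  # how often each strategy sits in each slot
    spread = lambda c: max(c.values()) - min(c.values())
    return spread(one), spread(two), spread(pos)

rng = random.Random(2019)
# Keep the most balanced of many random candidates; the tuple ordering gives
# priority to 1-way balance, so each strategy typically ends up shown 3 or 4
# times (44 display slots / 14 strategies), meeting the "at least three sets"
# constraint described above.
best = min((draw_design(rng) for _ in range(5000)), key=imbalance)
print("imbalance (1-way, 2-way, positional):", imbalance(best))
```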
Sample size calculations assumed an alpha level of .05, a margin of error of 0.1, and 14 implementation strategies to be rated, with each strategy appearing in a minimum of three sets. Based on these assumptions, the required sample size was N = 244 participants, each rating 11 sets of 4 strategies (28, 34).
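The exact calculation follows the cited references (28, 34). As context only, the sketch below shows two widely used sample-size heuristics for choice experiments; these are reference points under stated assumptions, not a reconstruction of the authors’ method, and both yield minimums below the study’s required N = 244.

```python
import math

n_items, n_sets, set_size = 14, 11, 4  # design parameters from this study

# (a) Orme's rule of thumb for choice designs: N * t * a / c >= 500, where
# t = tasks per respondent, a = alternatives per task, c = number of items.
# (An assumption for illustration; not necessarily the cited method.)
orme_min_n = math.ceil(500 * n_items / (n_sets * set_size))  # -> 160

# (b) Normal-approximation bound for estimating a choice proportion to within
# margin of error E at alpha = .05, using the worst case p = 0.5.
z, margin = 1.96, 0.10
prop_min_n = math.ceil(z**2 * 0.5 * 0.5 / margin**2)         # -> 97

print(orme_min_n, prop_min_n)  # both fall below the study's N = 244
```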
The BWS experiment was administered via a web-based computerized survey emailed to clinicians, supervisors, and administrators from March 2019 to April 2019. Consistent with best practices in survey administration (35), participants received a pre-survey priming email, a survey invitation email, and three follow-up reminders, delivered approximately one week apart. The survey took approximately 30 minutes to complete, and participants received a $25 gift card.
Measures
In addition to completing the BWS questions, respondents reported on professional and workplace characteristics: primary role (administrator [executive-level administrators within the clinics], supervisor [those who supervise clinicians in clinical work], or clinician [those who primarily provide direct services to clients]), education level, type of clinic in which they were employed (mental health, substance use, or dual diagnosis), salaried versus fee-for-service employment, tenure in their current agency, years of experience as a clinician, extent to which their graduate training emphasized EBP (rated from 1 = Never to 7 = Always), average hours worked per week, number of City-sponsored EBP training initiatives in which they had participated (range 0-6), number of BWS strategies currently in use by their employing agency (range 0-14), age, sex, race, and ethnicity. Because of heterogeneity across roles, administrators and supervisors did not report on salaried versus fee-for-service employment, hours worked per week, graduate training emphasis on EBP, years of experience as a clinician, or number of City-sponsored EBP training initiatives.
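As an illustration of how these measures might be organized for analysis, a hypothetical record schema is sketched below. All field names are assumptions for illustration; role-dependent items are optional because administrators and supervisors did not report them.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Respondent:
    """Hypothetical schema for one survey respondent (field names assumed)."""
    role: str                 # "administrator" | "supervisor" | "clinician"
    education_level: str
    clinic_type: str          # "mental health" | "substance use" | "dual diagnosis"
    agency_tenure_years: float
    n_strategies_in_use: int  # 0-14: BWS strategies in use by employing agency
    age: int
    sex: str
    race: str
    ethnicity: str
    # Reported by clinicians only (None for administrators and supervisors):
    fee_for_service: Optional[bool] = None
    hours_per_week: Optional[float] = None
    grad_training_ebp_emphasis: Optional[int] = None  # 1 = Never ... 7 = Always
    years_experience: Optional[float] = None
    n_city_ebp_initiatives: Optional[int] = None      # 0-6
```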
Data Analysis
Best and worst choice frequencies for each strategy were summarized at the sample level using count analysis, which computes the proportion of times a strategy was chosen as most or least useful relative to the number of times it was displayed (17). Preference weights for each strategy were calculated at the individual level using hierarchical Bayes estimation with a multinomial logit model, implemented in Sawtooth’s CBC/HB software (version 5) (36-40). Latent class analysis (LCA) (19, 20, 41) was used to identify segments of the population with different preferences and to estimate preference weights (i.e., part-worth utilities) for each segment using Sawtooth Software’s LCA program (version 4.7), which implements the estimation procedure described by DeSarbo and colleagues (19). We estimated LCA models with 1 through 5 classes. Consistent with best practices, we selected the best-fitting model on the basis of the Bayesian information criterion (42), probabilities of correct classification (43), sufficiently populated classes, and interpretability of classes in light of previous research and theoretical considerations (44). Differences across segments in professional characteristics were tested using analyses of variance and chi-square tests (SPSS, version 25). There were no missing data on participants’ preferences. Because very few participants (<5%) had missing data on professional and sociodemographic variables, missing values were handled with pairwise deletion.
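A minimal sketch of the count analysis is shown below, assuming a long-format table of choices with hypothetical column names; the hierarchical Bayes and latent class estimations were performed in Sawtooth software and are not reproduced here. The best-minus-worst score in the final line is a common additional summary, not a quantity described in the text above.

```python
import pandas as pd

# Each row = one strategy displayed in one choice set for one respondent,
# flagged for whether it was picked as most useful ("best") or least
# useful ("worst"). Toy data with four strategies for illustration.
choices = pd.DataFrame({
    "strategy": ["A", "B", "C", "D", "A", "B", "C", "D"],
    "best":     [1, 0, 0, 0, 0, 1, 0, 0],
    "worst":    [0, 0, 0, 1, 0, 0, 1, 0],
})

counts = choices.groupby("strategy").agg(
    shown=("best", "size"),   # number of times the strategy was displayed
    best=("best", "sum"),     # times chosen as most useful
    worst=("worst", "sum"),   # times chosen as least useful
)
counts["p_best"] = counts["best"] / counts["shown"]    # proportion "most useful"
counts["p_worst"] = counts["worst"] / counts["shown"]  # proportion "least useful"
counts["bw_score"] = counts["p_best"] - counts["p_worst"]  # best-worst score
print(counts.sort_values("bw_score", ascending=False))
```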