The Next Challenge in Implementation Science: Transferring Implementation Strategy Knowledge and Skills

Background: Complex evidence-based clinical innovations are challenging to implement, with many settings requiring implementation assistance to increase the likelihood of innovation uptake with fidelity. Implementation science researchers are identifying evidence-based implementation strategies that support the uptake of evidence-based practices and other clinical innovations. However, there is limited information regarding how to develop methods to educate implementation practitioners on the use of implementation strategies and how to sustain these competencies over time. We developed, initiated, and evaluated an implementation training program for both implementation researchers and practitioners. Methods: Participants who attended 7 trainings were asked to complete an electronic survey 2 weeks before training, immediately post-training, and 6 months post-training. We assessed participant knowledge and confidence in applying implementation facilitation skills using a 4-point Likert scale. Baseline scores were compared to those provided immediately post-training and at 6 months, and post-training scores were compared to those at 6 months post-training, using nonparametric Wilcoxon Signed Rank tests. Results: One hundred and two participants (76 in-person, 26 virtual) completed the pre- and post-training evaluation. Participants reported a significant increase in perceived knowledge and confidence across all implementation facilitation domains from pre- to post-training and from pre-training to 6-month follow-up. There was no significant difference between virtual and in-person participants. When comparing post-training to 6 months, increases in perceptions of knowledge remained unchanged, though half of the domains showed significantly lower perceived confidence to apply facilitation skills.
Conclusions: Our findings indicate that we have developed and conducted an effective Implementation Facilitation Training Program, with participants reporting significantly increased knowledge and confidence in applying implementation facilitation skills. These results were sustained among trainees at 6 months post-training, as participants reported perceived improvements in knowledge and confidence in applying implementation facilitation skills in all domains. Additionally, knowledge and confidence change scores did not differ based on in-person versus virtual attendance, suggesting that implementation training can successfully be provided to a remote site without loss of knowledge/skills transfer. The decrease in perceived confidence in applying skills from post-training to 6 months post-training in some areas indicates a need for ongoing support of facilitation practitioners.

Our findings call for new facilitators to receive timely training and ongoing mentoring for continued learning and professional development in the core competencies required to serve as an implementation facilitator.

Background
The need for training programs to transfer implementation knowledge and skills

Complex evidence-based clinical innovations are challenging to implement, with many settings requiring implementation assistance to increase the likelihood of innovation uptake with fidelity. With implementation science emerging as an established field of scientific inquiry, researchers are identifying evidence-based implementation strategies that support the uptake of evidence-based practices (EBPs) and other clinical innovations [1,2]. Yet, little is known about how best to transfer the requisite knowledge and skills to effectively apply evidence-based implementation strategies to those who can use them in non-research initiatives.
There have been many calls to develop infrastructure to ensure that gains made in advancing implementation science knowledge are effectively transferred to clinical and operational healthcare leaders [3][4][5]. Proctor and colleagues [6] engaged healthcare leaders, funders, and researchers to identify areas needed to support sustainability of evidence-based healthcare innovations. One of the eleven domains identified for increased attention was capacity development, or training the healthcare workforce. These leaders called upon foundations, universities, and professional associations to better train practice leaders and frontline providers in evidence-based strategies that introduce, implement, and sustain advances in healthcare. Thus, there is a need to develop rigorous and tested methods to educate implementation practitioners on the use of evidence-based implementation strategies, thereby ensuring more rapid transfer in putting implementation science knowledge into the hands of those who can apply it more broadly to enhance clinical and public health impact [7]. In addition, once skills are obtained by implementation practitioners, even less is known about effective approaches for helping them sustain these competencies over time.

Introduction to implementation facilitation
Evidence supporting implementation facilitation (IF) as an effective implementation strategy is expanding, and an increasing number of projects are using IF strategies to implement clinical innovations [8][9][10][11][12][13][14][15][16][17][18][19]. IF is a multifaceted strategy involving a process of interactive problem-solving and support that occurs in the context of a recognized need for improvement and supportive interpersonal relationships [20]. IF typically bundles an integrated set of activities and strategies to support uptake of effective practices, including but not limited to: engaging stakeholders, identifying local change agents, goal-setting, action planning, staff training, problem-solving, providing technical support, audit/feedback, and marketing [21][22][23][24]. The activities that facilitators apply, and when they apply them, depend upon facilities' needs over the course of the implementation process [23,25,26]. To apply all of these activities in a way that is responsive to a setting's needs over time, facilitators require a high level of flexibility as well as strong communication and interpersonal skills [23,27].
Facilitation as an evidence-based strategy

While our application of IF is guided by the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework, multiple conceptual frameworks and planned action models accommodate the use of IF as a primary implementation strategy [10,28]. The i-PARIHS framework proposes that successful implementation of EBPs and other clinical innovations is the result of facilitation of an innovation in partnership with recipients, considering characteristics of the innovation itself and the recipients' inner and outer contexts. Successful implementation is defined as achievement of agreed-upon goals, uptake and institutionalization of the innovation, engaged stakeholders who 'own' the innovation, and minimization of variation related to context across implementation settings [29].
Among i-PARIHS constructs, facilitation is the active ingredient, with facilitators supporting implementation by assessing and responding to characteristics of the recipients of the clinical innovation within the context of their own settings. An increasing volume of literature supports IF as an evidence-based strategy for implementing EBPs and other clinical innovations [8-10,14,19]. Systematic reviews have found that primary care practices were almost three times more likely to adopt evidence-based guidelines through the use of facilitation [18], and multiple studies conducted across diverse clinical settings have shown IF to be an effective strategy for implementing new clinical programs and practices [8, 10-15, 17, 19].
The need for training programs and their evaluation

Training programs and workshops are common ways to convey knowledge to participants about new professional competencies [30]. When developing these training programs and workshops, it is important to evaluate both their overall impact and the specific components that are effective. Therefore, a robust evaluation plan needs to be created in conjunction with the creation of new training programs. Multiple methods exist for assessing classroom learning, including written examinations, direct observation, supervisor assessments, clinical simulations, and multisource assessments [31]. These classroom-based assessments can also be used to assess learning outside of typical student training programs, such as in professional development workshops [31]. Historically, training and workshop evaluations have focused on understanding changes in participant behavior as a result of attendance and typically show small effects [32]. However, educational researchers are increasingly evaluating other variables, including how trainings impact participants, why they impact participants, and other contextual factors that may contribute to the changes [32]. Further, evaluation of core skills, with follow-up across time, to assess not only acquisition but also sustainment can substantially enhance training program evaluation. Ideally, training program evaluation should be multifactorial, include both a pre- and post-assessment, and be followed by additional evaluation at a later point in time.
An increase in empirical support for IF has resulted in a demand to ensure there is an available, highly skilled workforce with the core competencies required to successfully serve as an implementation facilitator. Unfortunately, there are few individuals who have the necessary training, skills, and knowledge to effectively apply IF to provide evidence-based implementation support. Thus, there is a need to ensure new facilitators receive timely training and ongoing mentoring for continued learning and professional development.
To address this gap, we developed, initiated, and evaluated an IF training program for both implementation researchers and practitioners. Below, we describe the development of the training program, its components, and outcomes from an independent evaluation of a subset of those who have participated in the training.

Initial development of the Implementation Facilitation Training Program
In May 2011, based on early success in a Type III Hybrid implementation facilitation study [8] and before data collection was completed, national clinical and operational leadership in the Department of Veterans Affairs (VA) expressed interest in using IF to support implementation of VA's Uniform Mental Health Services Handbook, which established clinical standards for VA mental health treatment settings [33]. At the time, these leaders wanted to build capacity at the national level for facilitating implementation of this Handbook by training and supporting the work of their technical assistance staff. Leaders understood the intricacy of the IF strategy, the value of the activities incorporated within it, as well as its flexibility to leverage complementary strategies/tools to support EBP implementation (particularly for complex clinical innovations).
In response to this request, we began collaborative discussions with them about ways to meet their needs, resources they could contribute to the effort, and future partnership activities. The literature and our own experiences suggested that our partners needed information on IF that was relevant, timely, and written in nonacademic language [34,35]. Subsequently, we developed an IF Training Manual to serve as source material for a training workshop on IF and as a reference for those receiving ongoing mentoring in IF [36]. Because the IF strategy was developed, applied, and tested within the context of a clinical initiative (i.e., integrating mental health services into primary care settings), we were careful to draft the manual in language that was nontechnical, straightforward, and appropriate for clinical practitioners rather than in more technical research terminology. To ensure its usefulness, we asked other IF experts to review drafts of the manual and provide feedback to inform revisions. Finally, we elicited and incorporated feedback from a review by our national partners (end-users of the Training Manual). Thus, we developed an IF Training Manual informed by research, theory, application, and stakeholders at all levels of the healthcare system. To address growing demand for IF training, we developed an ongoing IF training program to address, in particular but not exclusively, IF training needs for specific high-priority VA clinical initiatives. In addition to these dedicated trainings focused on high-priority initiatives, we conducted more 'general' IF trainings comprised of researchers and clinical practitioners from a mixture of clinical and research initiatives. In 2018, we trained our first group of participants using a virtual platform. This was limited to up to five participants associated with a single implementation initiative.
The single virtual site was paired with another group of in-person trainees, with in-person attendance limited to 15 participants. Subsequent trainings allowed an increase of up to 10 virtual participants while ensuring the total number in a training cohort did not exceed 25 (virtual and in-person combined).
Also in 2018, realizing the critical need for comprehensive transfer of implementation science knowledge, the national VA QUERI Office released a competitive Call for Proposals to provide pilot funds to develop and evaluate Implementation Strategy Training Hubs. These pilot funds allowed our IF Training Program to engage an education consultant to provide input on adult learning best practices and to conduct and incorporate findings from an independent evaluation of the Program (described below). In late 2018, we applied for and received 3-year funding to become a formal QUERI Training Hub focusing on IF.
Since 2019, our IF Training Program has been approved for 16 hours of Continuing Education Credit through VA's Employee Education System for the following disciplines: Accreditation Council for Continuing Medical Education, American Nurses Credentialing Center, American College of Healthcare Executives, Accreditation Council for Pharmacy Education, Association of Social Work Boards, American Psychological Association, and National Board for Certified Counselors. See Fig. 1 for a timeline of key events associated with the development of the IF Training Program.

Training format
Our IF Training Program is a 2-day training that targets implementation practitioners (clinical and operational managers who are conducting implementation efforts), though it is also often attended by implementation science researchers and clinical/operational leaders overseeing implementation of key initiatives. Three implementation facilitation experts, each with over 15 years of experience in the field, serve as trainers based on their extensive skills and knowledge and their significant involvement in the creation of the training materials and course curriculum. The curriculum and training approach are multifaceted, comprising preparatory work, didactic sessions, interactive role-play exercises, and opportunities for ongoing mentoring.
Preparatory work for participants includes a review of the IF Training Manual and development of a draft Implementation Planning Guide [37] that is tailored to the clinical innovation the participant is implementing. In addition, we developed a 35-minute video that participants watch prior to attending the training. This video provides an overview of the i-PARIHS framework with a description of its domains (innovation, context and recipients), a description of facilitation, and the literature that supports facilitation as an evidence-based implementation strategy. The video also stresses the broader application of facilitation to conceptual models and change strategies beyond i-PARIHS [10,20,28,38]. During the 2-day IF training, we review the information provided in the video through a reverse classroom experience in which attendees describe potential barriers and facilitators in each of the i-PARIHS domains that may occur in their own projects.
Didactic training without hands-on skill practice has minimal effects on behavior or patient outcomes [39,40]. Training must include learner demonstration of new skills to maximize the likelihood that post-training practice will change [41]. Based on feedback following early training sessions, and consistent with prior research, we realized that interactive role plays and group exercises in applying IF skills were a critical process through which participants could practice what was taught during the didactic sessions [42][43][44]. Role plays are regularly used to teach participants new skills [45] and are increasingly being studied to understand their impact on learning. A Cochrane literature review found that teaching efforts that included both didactic and interactive teaching activities, such as role plays, were more effective at increasing knowledge than either strategy alone [30]. Further, students find role plays helpful in their learning [45]. We provide a series of interactive role play exercises throughout the IF Training Program, giving trainees opportunities to apply facilitation skills through scenarios based on common situations that program faculty have experienced when serving as facilitators. In these exercises, a trainee volunteers to interact with a member of the program faculty who acts out an experience they have had with a stakeholder in the past (playing the role of the stakeholder), and the other program faculty engage with the volunteer and other trainees to brainstorm ideas/suggestions for how to apply facilitation skills to effectively address the scenario that is being presented.
While the formal training provides a baseline experience, the full acquisition of skills is accomplished through applying the knowledge learned in the training as a facilitator. These experiences can present challenging situations even for expert facilitators. Thus, ongoing opportunities to present and discuss complex facilitation issues in real time are critical. We have two mechanisms to ensure that participants have an opportunity for feedback during the conduct of their facilitation activities. First, IF trainers established monthly virtual office hours during which at least two of the trainers are available for consultation. Email reminders of this opportunity are sent out two days prior to the monthly call, along with dial-in information, to all who have received the 2-day training. These calls have been well received by participants, with 3-10 participating each month. In fact, participants sometimes attend without a specific question, noting instead that they are there to capture "pearls" from trainers' responses to others' questions. Second, an additional resource for participants is our Implementation Facilitation Learning Collaborative, established in 2014 as part of the overarching Behavioral Health QUERI Program. This Learning Collaborative meets monthly and is currently composed of over 150 operational, programmatic, and clinical leaders as well as VA and non-VA researchers interested in applying and studying IF. The goal of the IF Learning Collaborative is to provide a platform for participants to share best practices in IF and work collaboratively to advance the science and practice of IF. It also provides opportunities for ongoing consultation provided by a broad collection of implementation scientists and practitioners.

Training content
Training topics include: an overview of the i-PARIHS framework and evidence for IF (first presented through a video and discussed in a reverse classroom experience as described above); core facilitation competencies; distinctions between external/internal facilitators and clinical champions; facilitation roles and activities in the pre-implementation, implementation, and sustainability phases (described in Table 1); best practices in applying virtual facilitation; and tools and processes for documenting IF activities. An overview of the didactic sessions provided during the 2-day training is presented in Table 2.

Pre-Implementation Phase: This phase typically involves a facilitator's initial engagement with site stakeholders over a period of weeks or months to gain a better understanding of their resources, priorities, and context, with the primary objective of assisting in the development of an implementation plan.

Implementation Phase: Once the implementation plan has been developed, the Implementation Phase typically starts with a kick-off meeting, after which the plan is carried out with ongoing, relatively frequent contact by the facilitator to monitor and support the plan's execution and refinement.

Sustainment Phase: During this phase, contact between the facilitator and site stakeholders is less frequent and eventually non-existent, with the objective being to turn a mature (hopefully successful) implementation effort over to the site with minimal further interaction with the facilitator.

Methods
From May 2017 to July 2019, participants who attended seven trainings completed an electronic survey approximately 2 weeks before and 2 weeks after each training. Pre-training surveys collected participant demographic information (e.g., education, discipline) and assessed the following domains using items rated on a 4-point Likert scale (0 = Not At All, 1 = Somewhat, 2 = Moderately, 3 = Extremely): (a) familiarity with implementation science frameworks and strategies, and (b) knowledge of and confidence in applying IF skills in the following domains: facilitating adoption of clinical innovations; roles and activities of external and internal facilitators as well as site champions; activities for facilitators during pre-implementation, implementation, and sustainability phases; applying methods to overcome barriers; facilitating stakeholder engagement; developing local implementation plans; using virtual facilitation; and evaluating the IF strategy. The post-training survey re-evaluated the above domains and assessed participants' perceptions of the training experience, including delivery of the material (e.g., level of interaction between participants and trainers) and logistics (e.g., meeting space and visuals). Participant knowledge of and confidence in applying IF skills were evaluated again via a survey 6 months post-training. Self-reported knowledge and confidence scores at baseline were compared (using nonparametric Wilcoxon Signed Rank tests) with values provided immediately post-training and at 6 months. Using the same methods, we also compared self-reported knowledge and confidence scores immediately post-training with those obtained at 6 months post-training.
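Paired comparisons of ordinal Likert ratings such as these are commonly analyzed with the Wilcoxon signed-rank test. The following sketch (not the authors' actual analysis code; the ratings shown are hypothetical) illustrates how one such pre/post comparison might be computed in Python using SciPy:

```python
from scipy.stats import wilcoxon

# Hypothetical paired self-ratings for one IF domain on the 4-point scale
# (0 = Not At All, 1 = Somewhat, 2 = Moderately, 3 = Extremely)
pre = [0, 1, 1, 0, 2, 1, 0, 1, 2, 1]
post = [2, 2, 3, 1, 3, 2, 2, 3, 3, 2]

# Wilcoxon signed-rank test on the paired differences;
# pairs with a zero difference are discarded by default
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.4f}")
```

Because every hypothetical participant improved, the smaller sum of signed ranks is zero and the test rejects the null hypothesis of no pre/post change; a nonparametric test is appropriate here because 4-point Likert responses are ordinal rather than interval data.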

Results
One hundred and two participants completed the pre-training evaluation and either the post-training evaluation (n = 97) or the 6-month follow-up evaluation (n = 36). Thirty-one participants completed both the post-training and six-month evaluations. Seventy-six of the participants were trained in person and 26 were trained virtually.
At baseline, the majority of participants (64%) reported having doctoral degrees. When asked about current job duties (non-exclusive categories), the majority (79%) reported research functions, while many also reported program evaluation (42%), education (42%), clinical responsibilities (37%), and administrative roles (32%). Prior to the training, 24% reported being not at all familiar with implementation science frameworks, while 46% reported being somewhat familiar. Similarly, 27% reported being not at all familiar with the IF strategy, while 52% reported being somewhat familiar, 17% were moderately familiar, and only 5% reported being extremely familiar.
Post-training, 92% reported that the training met their expectations, and 97% reported they would be able to apply the knowledge gained. Indicating high satisfaction, over 98% reported that the materials were pertinent and useful, trainers were knowledgeable, participation was encouraged, and adequate time was allotted for content. Wilcoxon Signed-Rank tests revealed a significant increase in participants' perceived confidence and knowledge across all IF domains from pre- to post-training (Wilcoxon values and changes in each domain provided in Additional Files 1 and 2) and from pre-training to 6 months (see Tables 3 and 4). However, when comparing post-training perceptions to those at 6 months, we found that while increases in perceptions of knowledge remained unchanged, perceived confidence to apply facilitation skills in half (6/12) of the IF domains was significantly lower (p < .05) at 6 months than at post-training (Wilcoxon values and changes in each domain provided in Additional Files 3 and 4). These domains included facilitation activities in each of the three implementation phases, virtual facilitation, and overcoming barriers to change. When comparing pre-to-post changes in knowledge and confidence among the 26 trainees who participated virtually and those who participated in person, there were no significant differences at post-training or at 6-month follow-up (p > .05).

Number and percentage of responses regarding confidence per category for each Implementation Facilitation domain at pre- and 6 months post-training. The Wilcoxon Signed-Rank test statistic (S) corresponds to the difference between the expected and the observed sum of signed ranks; the p-value (p) is associated with the Wilcoxon Signed-Rank test.

Discussion
To answer the call for formal research training in implementation science, several university [46] and funder-initiated [47][48][49][50] training programs have been developed. Yet we now face the challenge of educating our healthcare leaders, frontline providers, and implementation practitioners in implementation skills as well. As noted by Proctor and Chambers in a report of findings from a meeting on dissemination and training needs sponsored by the National Institutes of Health, few US-based training programs focus on implementation practitioners or policy makers [51]. This manuscript describes the development, implementation, and evaluation of a highly partnered IF Training Program that addresses this gap. The IF Training Program, which was initiated to address healthcare leaders' needs within a single large healthcare system (VHA) [34], has developed into a robust program that continues to train growing numbers of implementation practitioners and researchers inside and outside of the healthcare system for which it was originally developed.
Findings from independent evaluation of the IF Training Program in surveys administered immediately before and after the training show that participants report significantly increased knowledge of and confidence in: factors facilitating adoption of clinical innovations; roles and activities of external and internal facilitators as well as site champions; facilitator activities during pre-implementation, implementation, and sustainability phases; and virtual facilitation. Participants also reported significantly increased knowledge of and confidence in applying methods to overcome barriers, facilitate stakeholder engagement, develop local implementation plans, and evaluate the IF strategy. It is notable that these findings were sustained among trainees at 6 months. Comparisons of knowledge and confidence change scores among trainees participating virtually versus in-person show no significant differences, suggesting that IF training can successfully be provided without loss of knowledge/skills transfer. While this is encouraging, virtual participation was limited to one site; more work is needed to expand our capability to deliver virtual IF training. For example, future work should seek to increase the number of virtual sites/attendees and develop processes to enhance role play scenarios and other interactive elements of the training through virtual platforms. In so doing, it will be important to continue independent evaluation to ensure that changes made to accommodate more virtual trainees and enhance their experience do not adversely impact training outcomes.
Interestingly, when comparing perceived facilitation knowledge immediately following the IF training to the 6-month surveys, we again saw no significant difference in perceived knowledge in any of the topic areas covered in the IF training. Yet, we did see a significant decrease in confidence to apply half of these skills; specifically, facilitation activities in each of the three implementation phases, virtual facilitation, and overcoming barriers to change. While we had enhanced participants' access to consultation through office hours and the IF Learning Collaborative, these changes occurred during the period of data collection.
Thus, some of the 6-month survey respondents may not have had the opportunity to utilize these resources. Further, we did not collect data from trainees on whether they had actually applied an IF strategy during the 6 months following the training, so it is possible that some trainees could have lost confidence in their IF skills if they had not yet had an opportunity to actually apply what they had learned in the training. In addition, for those 6-month respondents who may have applied an IF strategy during the months following training, the early phases of implementing a new clinical initiative can be some of the most challenging, particularly for new facilitators. More studies that address the transfer of knowledge and skills during the active implementation of a clinical initiative are needed.
The spread of quality improvement and implementation strategies is not without challenges. Pandhi and colleagues faced several challenges when spreading a program to engage patients in quality improvement activities [52]. Primary among these was an inability to recruit health system participants. Competing priorities, leadership changes, and overburdened providers and staff were identified as reasons for lack of participation [52]. The IF Training Program has, to date, not encountered similar challenges, possibly because it was developed as a highly partnered program within a single healthcare system, VHA, and therefore met the needs of that system [35]. Thus, the IF Training Program was able to engage a relatively large, established market base for the training from its inception.
Subsequently, rather than initiating formal marketing or dissemination campaigns, we have been able to rely on unsolicited demand for the training through informal channels (e.g., referrals by past trainees), though we have also at times made subtle adaptations to the content and interactive exercises within the training to enhance its relevance to healthcare settings other than the initial healthcare system partner.
Pandhi and colleagues also identified a need for ongoing coaching to ensure uptake of the QI process [52]. This is consistent with findings from the FIRE Study [53,54], in which there were no significant differences in clinical guideline compliance between sites receiving one of two types of facilitation support and those receiving no facilitation. In a subsequent exploration of factors that contributed to the difficulty of implementing the clinical guideline, Harvey and colleagues [53] also noted the need for ongoing strong support and mentoring for inexperienced facilitators. While our IF Training Program does not address a specific guideline implementation effort, the need for ongoing support to ensure transfer of IF knowledge and skills was identified in feedback from our trainees. In response to this need and to findings such as those from Pandhi and colleagues and the FIRE Study, we established monthly office hours calls available to all past trainees and also encouraged them to engage with our IF Learning Collaborative. The degree to which these added elements of mentoring and support are sufficient, or whether, as suggested by the FIRE investigators, role modeling at the local level may be necessary for uptake of facilitation skills, is unknown, since our evaluation did not incorporate assessment of results from individual implementation projects as an outcome. To further enhance transfer of successful IF strategies from research to practice, there is also a clear need to develop tools to monitor the fidelity with which the IF strategy is applied following training. Through a recent scoping literature review and a rigorous expert panel consensus development process, we identified a set of core IF activities that may serve as the focus of IF fidelity monitoring [55]. Prototype IF fidelity tools based on those core activities have been developed and are currently being piloted.
Once tested and re ned, these IF delity tools will serve as a) another assessment to incorporate into our training program evaluation, and b) an additional resource that trainees may use to monitor and improve their own delity to the IF strategy.

Limitations
While findings from the evaluation of the IF Training Program are promising, several limitations should be considered. First, the 6-month survey response rate was low (37% of baseline respondents), raising the potential for selection bias if those most engaged in facilitation and in applying the knowledge/skills gained were those who differentially responded at six months. Though this possibility may explain higher self-reported ratings of perceived IF knowledge/skills at six months, it would not explain the significant pre/post improvements in these scores immediately before and after the training at baseline. Further, if response bias at 6 months was indeed present, the 6-month results may simply be an accurate representation of outcomes among the most appropriate training candidates (i.e., those with a strong interest in IF and plans for applying the strategy). Unfortunately, because the analysis compared aggregate means rather than using longitudinal repeated measures, we were unable to identify those who most benefited from the training and, perhaps more importantly, those who did not. In addition, while the broad number of roles identified by attendees indicated the breadth of participants' jobs, the lack of a designated primary role associated with training attendance further limited our ability to fully characterize the cohort at baseline or at six-month follow up. Finally, the current evaluation only allows us to assess trainees' perceived knowledge of and confidence in applying IF strategies, rather than incorporating an IF fidelity metric that could assess how well trainees actually apply the strategy following training. This presents an opportunity to enhance our evaluation methodology once the developmental IF fidelity tools [55] have been tested and refined.
Regardless, these limitations are balanced by the fact that this was an independent evaluation, which a) minimized social desirability bias and b) included a 6-month follow up to document the degree to which perceived IF knowledge/skills and the ability to apply them were sustained beyond the immediate training period.
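The participant-level re-analysis that the aggregate-mean approach precluded can be sketched in a few lines. The data, participant linkage, and use of an exact sign test (a standard-library stand-in for the Wilcoxon signed-rank test used in the published analysis) are all illustrative assumptions, not the study's actual data or code:

```python
# Sketch of a participant-level (paired) analysis -- hypothetical data and
# linkage; the actual evaluation compared aggregate means, so responses were
# not linkable by participant. An exact sign test stands in here for the
# Wilcoxon signed-rank test reported in the study.
from math import comb

def paired_sign_test(pre, post):
    """Exact two-sided sign test on paired ratings (ties dropped)."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    k = sum(d > 0 for d in diffs)  # number of trainees who improved
    tail = min(k, n - k)
    # Two-sided exact binomial p-value under H0: P(improvement) = 0.5
    return min(1.0, 2 * sum(comb(n, i) for i in range(tail + 1)) / 2 ** n)

# Hypothetical 4-point Likert knowledge ratings for 8 linked trainees
pre  = [2, 1, 2, 3, 2, 2, 2, 2]
post = [3, 3, 4, 4, 3, 2, 4, 3]

p = paired_sign_test(pre, post)
# Linked data also identify the individuals who did not improve -- the
# question an aggregate-mean comparison cannot answer
non_improvers = [i for i, (a, b) in enumerate(zip(pre, post)) if b <= a]
print(round(p, 4), non_improvers)  # → 0.0156 [5]
```

Collecting a (de-identified) participant ID at each survey wave would be sufficient to enable this kind of within-person analysis in future evaluations.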

Conclusions
Taken together, our findings indicate that we have successfully developed and implemented a highly effective Implementation Facilitation Training Program. Trainees not only reported high overall levels of satisfaction with the program but also reported significant pre/post and pre/6-month perceived improvements in both knowledge of and confidence in applying IF skills across multiple domains, including facilitator activities in the three phases of implementation. It is encouraging that perceived knowledge was sustained when comparing post-training reports to those at 6 months. The loss of perceived confidence to apply implementation skills in half of the domains supports the need to develop new, and/or expand upon existing, mechanisms and resources for keeping trainees engaged with trainers for ongoing consultation and mentoring following the formal 2-day training. Accordingly, this may require more robust funding for these training programs to support trainers' additional time commitment. As noted above, the ability to successfully apply implementation strategies is critical to developing a cadre of implementation practitioners poised to enhance implementation of EBPs and other clinical innovations. It is only through a well-trained workforce that we can enhance the quality of care provided.
Additional investigation through qualitative reports of the training experience and of long-term outcomes is critical, as is the continued use of iterative formative evaluation as the training is adapted, refined, and improved based on participant feedback. Our training program is one mechanism through which the call for enhanced training in implementation strategies has been answered [51], ultimately enhancing the uptake of diverse EBPs.

Availability of data and material
The datasets supporting the conclusions of this article are included within the article and its additional files.

Authors' contributions
findings. JAP and TLF analyzed the data in consultation with JEK. JEK, KMD, and JLS reviewed and confirmed findings. TLF and DRT provided consultation and feedback throughout the project and provided content for the manuscript. NDC performed literature reviews and reported findings, which contributed to the basis and creation of the manuscript, drafted the abstract, and made substantive revisions throughout the manuscript. All authors contributed to, read, and approved the final manuscript.