The need for training programs to transfer implementation knowledge and skills
Complex evidence-based clinical innovations are challenging to implement, and many settings require implementation assistance to increase the likelihood that innovations will be taken up with fidelity. With implementation science emerging as an established field of scientific inquiry, researchers are identifying evidence-based implementation strategies that support the uptake of evidence-based practices (EBPs) and other clinical innovations [1, 2]. Yet little is known about how best to transfer, to those working in non-research initiatives, the knowledge and skills needed to apply evidence-based implementation strategies effectively.
There have been many calls to develop infrastructure to ensure that gains made in advancing implementation science knowledge are effectively transferred to clinical and operational healthcare leaders [3–5]. Proctor and colleagues [6] engaged healthcare leaders, funders, and researchers to identify areas needing attention to support the sustainability of evidence-based healthcare innovations. One of the eleven domains identified for increased attention was capacity development, or training the healthcare workforce. These leaders called upon foundations, universities, and professional associations to better train practice leaders and frontline providers in evidence-based strategies for introducing, implementing, and sustaining advances in healthcare. Thus, there is a need for rigorous, tested methods to educate implementation practitioners in the use of evidence-based implementation strategies, thereby putting implementation science knowledge more rapidly into the hands of those who can apply it broadly to enhance clinical and public health impact [7]. Moreover, even less is known about effective approaches for helping implementation practitioners sustain these competencies over time once they have acquired them.
Introduction to implementation facilitation
Evidence supporting implementation facilitation (IF) as an effective implementation strategy is expanding, and an increasing number of projects are using IF strategies to implement clinical innovations [8–19]. IF is a multifaceted strategy involving a process of interactive problem-solving and support that occurs in a context of a recognized need for improvement and supportive interpersonal relationships [20]. IF typically bundles an integrated set of activities and strategies to support uptake of effective practices, including but not limited to engaging stakeholders, identifying local change agents, goal-setting, action planning, staff training, problem-solving, providing technical support, audit and feedback, and marketing [21–24]. The activities that facilitators apply, and the timing of their application, depend upon facilities’ needs over the course of the implementation process [23, 25, 26]. To apply all of these activities in a way that is responsive to a setting’s needs over time, facilitators require a high level of flexibility and strong communication and interpersonal skills [23, 27].
Facilitation as an evidence-based strategy
While our application of IF is guided by the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework, multiple conceptual frameworks and planned action models accommodate the use of IF as a primary implementation strategy [10, 28]. The i-PARIHS framework proposes that successful implementation of EBPs and other clinical innovations results from facilitation of an innovation in partnership with recipients, taking into account the characteristics of the innovation itself and the recipients’ inner and outer contexts. Successful implementation is defined as achievement of agreed-upon goals, uptake and institutionalization of the innovation, engaged stakeholders who ‘own’ the innovation, and minimization of context-related variation across implementation settings [29]. Among i-PARIHS constructs, facilitation is the active ingredient, with facilitators supporting implementation by assessing and responding to the characteristics of the recipients of the clinical innovation within the context of their own settings. An increasing volume of literature supports IF as an evidence-based strategy for implementing EBPs and other clinical innovations [8–10, 14, 19]. Systematic review evidence indicates that primary care practices were almost three times more likely to adopt evidence-based guidelines when facilitation was used [18], and multiple studies conducted across diverse clinical settings have shown IF to be an effective strategy for implementing new clinical programs and practices [8, 10–15, 17, 19].
Need for training programs and evaluation of them
Training programs and workshops are common ways to convey knowledge about new professional competencies to participants [30]. When developing such programs and workshops, it is important to evaluate both their overall impact and which specific components are effective; a robust evaluation plan should therefore be created alongside any new training program. Multiple methods exist for assessing classroom learning, including written examinations, direct observation, supervisor assessments, clinical simulations, and multisource assessments [31]. These classroom-based assessments can also be used to assess learning outside of typical student training programs, such as in professional development workshops [31]. Historically, evaluations of trainings and workshops have focused on changes in participant behavior resulting from attendance and have typically shown small effects [32]. However, educational researchers are increasingly examining other variables, including how and why trainings affect participants and what contextual factors may contribute to observed changes [32]. Further, assessing core skills at follow-up points over time, to capture not only acquisition but also sustainment, can substantially strengthen training program evaluation. Ideally, training program evaluation should be multifactorial, include both pre- and post-assessments, and be followed by additional evaluation at a later point in time.
Growing empirical support for IF has created demand for an available, highly skilled workforce with the core competencies required to serve successfully as implementation facilitators. Unfortunately, few individuals have the training, skills, and knowledge needed to apply IF effectively and provide evidence-based implementation support. Thus, there is a need to ensure that new facilitators receive timely training and ongoing mentoring for continued learning and professional development.
To address this gap, we developed, initiated, and evaluated an IF training program for both implementation researchers and practitioners. Below, we describe the development of the training program, its components, and outcomes from an independent evaluation of a subset of training participants.