Working in 190 countries and territories, UNICEF advocates for the protection of children's rights, works to meet children's basic needs, and seeks to expand children's opportunities to reach their full potential. UNICEF's comparative advantage lies in working across sectors and across the life-cycle to protect these rights, focusing particularly on the most disadvantaged and vulnerable children. UNICEF's Strategic Plan 2018-2021 describes research as a 'how' strategy to achieve targets.9 UNICEF notes that, in some situations, while substantial evidence exists on what needs to be done, there are evidence gaps when it comes to identifying viable approaches for sustainable implementation and scale-up of responses to improve programs for children.
Peters, Tran and Adams10 note that "The basic intent of implementation research is to understand not only what is and isn't working, but how and why implementation is going right or wrong, and testing approaches to improve it." This form of research addresses implementation bottlenecks, identifies optimal approaches for a particular setting, and promotes the uptake of research findings. Ghaffar and colleagues11 argued that IR should be 'embedded' in programming in partnership with policymakers and implementers, integrated in different settings, and take into account context-specific factors to ensure relevance in policy priority-setting and decision-making. This view is further supported by Langlois and colleagues.8
UNICEF further defines embedded IR as "the integration of research within existing program implementation and policymaking to improve outcomes and overcome implementation bottlenecks." UNICEF is adapting this as an innovative approach for health systems strengthening in which research is made integral to decision-making. It includes: a) positioning research within existing programs and systems, building a new evidence ecosystem, and drawing siloed sectors together; b) meaningful engagement and leadership roles for decision-makers and implementers within the research team; and c) when possible, aligning research activities with program implementation cycles. Embedding increases the likelihood of evidence-informed policies and programs. Embedding IR into the policymaking and systems strengthening process amplifies ownership of evidence, recognizes that decisions are not made on evidence alone, and takes societal context and values into account.
Varying definitions of operational, implementation and systems research often cause confusion for both researchers and program managers. For example, the distinction between operations research and IR has been debated,12 as the two types of research are often similar in intent and scope. UNICEF uses the definitions outlined by Remme et al.,13 in which operational research is a subset of IR, which is in turn a subset of broader health systems research. These definitions are progressive without clear delineations, so operational, implementation and health systems research often overlap. Similar issues exist across other terms, such as formative research, process evaluation, and translational research. UNICEF has primarily adapted the taxonomy described by Peters et al. and others.10,11,13
Research on innovations encompasses basic science through translation to sustainable implementation of new or existing evidence-based programs.10 IR in UNICEF focuses on issues ranging from how a program works in real-world settings to systems integration and scale-up, including decision-making, policy development, and creating an enabling environment for implementation.
IR to support UNICEF programs is generally located within monitoring, evaluation and learning, as a basis for program innovation, systems strengthening and implementation. IR does not take the place of routine data collection, monitoring and evaluation, but complements them. Figure 1 provides a conceptual model for how IR fits into the overall evidence cycle for program management. As shown, if a problem is identified and the solution to the problem, or an effective implementation strategy to address it, is known, it is a management issue which can be resolved without the need for IR. However, if a problem is identified and the solution is unknown, or the reason for the problem is unknown, IR can be used to inform program adjustments.
IR questions address a wide range of issues pertinent to implementation including acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, coverage and sustainability.10 These questions generally flow along the continuum of the implementation process: exploration, installation or planning, initial implementation, and full or sustained implementation.14
Embedded Implementation Research at UNICEF: Experiences 2015-2019
UNICEF, in partnership with the Alliance for Health Policy and Systems Research (The Alliance) and the Special Program for Research and Training in Tropical Diseases (TDR), has adapted an embedded IR approach for systems strengthening for country-based programs for children. As part of the UNICEF Health Systems Strengthening Approach,15 the research seeks to catalyze a shift in the way evidence is generated and used within countries to inform policy and decision-making. By bringing together a) in-country decision makers at national, sub-national and local level, b) country-based researchers, and c) global development partners, it puts local decision-makers and implementers in the driving seat in the research process, while identifying clear roles for different stakeholders.
The UNICEF approach aims to enhance ownership of the research among local implementers. Although it may influence local or higher-level policy, it is primarily designed to prioritize research on questions of local relevance, to build capacity to conduct local IR that generates feasible recommendations in 'real time', and to underwrite policy and systems strengthening. IR can also be considered during program initiation to answer questions on the acceptability, appropriateness and feasibility of alternative delivery strategies, as well as blended with evaluation, using "effectiveness-implementation hybrid designs",16 to address program effectiveness.
UNICEF works with both global and local partners to identify priority implementation barriers needing resolution through systematic and inclusive IR processes. Through IR, programs can learn why implementation barriers and contextual differences mean that interventions work well in one context but not in another. In addition, IR can be used to test new approaches or innovations from pilot through scale-up across different contexts (e.g. geographic, political, physical environment, social, economic). Implementation research can also document failures, where an intervention's success could not be replicated given the local context; this is equally valuable, as it prevents wastage of funds before investing in scale-up.
UNICEF’s embedded IR approach generally starts with sensitization of national stakeholders, including the Ministry of Health and policymakers, to what IR is and what its potential benefits are. Implementation barriers, often previously identified through national, sub-national or local program reviews, or through monitoring and evaluation, are reviewed, summarized and prioritized. In collaboration with national stakeholders and policymakers, implementation barriers are then transformed into priority research questions, and potential related IR studies are identified. A research team is convened, comprising national policy-makers, local decision-makers and implementers (e.g. program managers, district managers, front-line health workers), and in-country researchers. Through this process, UNICEF staff, in partnership with implementation researchers or research institutions (global or local), provide technical support and training to develop protocols, ensure ethical research standards are maintained, conduct studies, and support communication of results, recommendations, and use of the findings for policy and program changes. Figure 2 provides an example of the UNICEF embedded IR process.
Figure 3 shows countries where UNICEF has worked with partners to embed IR within existing programming activities. In collaboration with in-country researchers, policy-makers and program implementers, UNICEF has been supporting IR projects globally since 2015. The research has varied from formative early stages of programming, through initial implementation, to full implementation. Methodological approaches varied by study and included quantitative, qualitative and mixed-methods designs. Study questions have addressed a wide variety of multi-sectoral topics (e.g. immunization, child health days, birth registration, newborn and child health in humanitarian settings, prevention of mother-to-child transmission of HIV) and health system challenges (e.g. information systems, human resources, supply chain, demand for services, community engagement, integration).
Measuring Embedded Implementation Research Success
UNICEF’s selected measure of success for embedded IR is that the research findings are used for policy and/or program changes. This near-real-time use of findings is key. Dissemination and publication of the results alone do not count as ‘use’; policy or program changes should be documented, no matter how small or local. Selected examples include:
- Nigeria: Integration of participatory action research approach into existing social mobilization structures in areas with pockets of unimmunized children. The research team developed policy and supporting documents to guide policy makers in this effort.17
- Ethiopia: Improvement of data quality and use within the existing health information system through community engagement. The team used their research findings to develop guidance on the use of community health data for shared accountability.17
- India: Examined how to counter anti-vaccine propaganda, leading to a revised communication strategy, including a mobile social media app to target the primary platform for anti-vaccine messaging.17
- Malawi: Examined the needs of HIV-positive adolescents within a mentoring support program. Based on the study results, the mentor program hired and trained Adolescent Champions (same-age peers) within three months of study completion to provide special services for adolescents in the program.18
In addition to monitoring program results, we have also documented government implementers’ and local research partners’19 experiences in the program. Results suggest positive responses to participation in the process. Text box 2 presents two quotes from a project in Pakistan, one from before the IR and one from after, which exemplify the common responses seen from participants.
Funding Embedded Implementation Research at UNICEF
To date, UNICEF-supported IR studies have been funded almost exclusively through projects, as part of the program monitoring or learning agenda. Studies are typically short-term and require limited funding. For example, five HIV-related IR studies in 2017 cost US$15,000-35,000 each for data collection and analysis, and were completed within five months. Projects to date have ranged from $10,000 to $70,000 (usually $20,000-$40,000) and from 5 to 18 months (usually 12 months). This near-real-time aspect of embedded IR is recognized as one of its advantages, as research results can be available within planning cycles for decision-making and program adaptation. While the research projects themselves generally require limited funding, building local capacity to run the studies, both for implementers and local researchers, often requires additional resources beyond the actual research costs, for training and for technical assistance during the studies.
Key challenges and considerations in developing UNICEF’s embedded IR approach
IR has been recognized as critical to strengthening health systems.11,20,21 However, the concept of embedding research into real-world policy, practice and implementation is somewhat new in the field of global children’s programming, and uptake of the approach has challenges. For example, we have found that in-country partners, including local and national-level government counterparts, and some donors, need to be convinced of the value of IR. By engaging stakeholders in this approach, we have seen recognition of the use of IR for program planning and enthusiasm for continuing the implementer-researcher partnerships after completion of the initial IR project. Also, for IR to be truly country-driven, donors and partners have to trust that countries can identify the most relevant implementation barriers and transform them into questions to investigate. This will require adapting how research proposals are reviewed, since objectives and methods may initially be less well defined, given that defining them is a first step of the research process. Greater emphasis on domestic resource mobilization for embedding IR into the decision-making process and into routine program funding is needed. This problem can be addressed by advocacy, showcasing the value IR brings, building partners’ capacity in IR, and engaging partners early and throughout the research activities so as to address their stated priorities and gaps in knowledge, and to improve policy and ownership. Weighing the opportunity costs of investing in service delivery versus implementation research requires continued focus, as turnover of leadership and staff can reverse gains made in many contexts.
Another challenge is that, in some cases, our partner implementers and researchers have been overly ambitious or have wanted to pursue larger-scale research. However, our experience shows that time-limited, small-scale and relatively inexpensive IR studies can lead to important learnings that translate into changes in policies or approaches. We saw that IR brought these two communities closer together, with the benefits of greater relevance of research to programming, introduction of new methods, and faster implementation of specific solutions. To build incentives for researchers to do IR, universities could value evidence that research led to policy and practice change, and not just traditional peer-reviewed publications, as a component of promotion or research funding decisions.
Assessment of research quality, and of how the results contribute to the existing evidence base, also needs to be addressed. Many IR studies, while used locally for in-country program improvements, may not be published in the peer-reviewed literature, but could nevertheless contribute to expanding the evidence base on implementation strategies. Therefore, publishing on online platforms, such as the TDR Gateway (https://www.who.int/tdr/publications/tdr-gateway/en/) or similar sites, will allow for quality assurance and rapid, wider dissemination.