Our research team included clinicians and researchers from 4 health systems specializing in pulmonary and critical care medicine, implementation science, learning health systems, and organizational behavior. The larger project was a planning grant from the US National Heart, Lung, and Blood Institute of the National Institutes of Health (U01HL143453, Sales and Gong co-PIs), called Digital Implementation Trials in Acute Lung Care (DIGITAL-C), with the ultimate goal of planning a multi-site hybrid type 2 implementation-effectiveness trial of digital implementation strategies. We report abridged methods here; more detailed background and methods are available in Additional File 1.
We engaged in a 4-step multi-method process to evaluate and prioritize EBPs to assist in clinical decision-making and concentrate future implementation efforts. Throughout this process, we focused on EBPs most relevant to and strongly associated with improved clinical outcomes (i.e., shorter duration of mechanical ventilation and/or lower mortality), as identified in previously established guidelines, among patients receiving invasive mechanical ventilation for acute respiratory failure or acute respiratory distress syndrome. In Steps 2–4, we also considered the feasibility of assessing EBP performance using digital data extracted from electronic health records, rather than relying on human data abstraction.
<Table 1 about here>
Table 1
The 4-step prioritization process overview
Step | Definition | Participants | Measures | Data and Output |
1: Identify potentially relevant EBPs | Compile a list of EBPs that are most relevant to ARF/ARDS and supported by the most robust data | DIGITAL-C research team | Mortality benefit; shorter duration of MV; importance | Initial list of EBPs related to ARF/ARDS |
2: Assign ratings to EBPs | Distill the list of EBPs using surveys | DIGITAL-C clinician researchers | GLIA 2.0 instrument | Qualitative and quantitative assessments of EBPs; a distilled list of EBPs based on survey results |
3: Frontline clinician panel | Evaluate the distilled list of EBPs from Step 2 using surveys | Frontline clinicians | Abbreviated GLIA instrument: measurability; resource intensiveness; source credibility | Quantitative and qualitative evaluation of the distilled list of EBPs |
4: Final synthesis | Identify final list of EBPs that will be the focus of implementation efforts | DIGITAL-C research team | Measurability; variability in practice | Highest rated EBPs based on Steps 3 & 4 |
ARDS = acute respiratory distress syndrome; ARF = acute respiratory failure; EBPs = evidence-based practices; GLIA = guideline implementability appraisal; MV = invasive mechanical ventilation |
An overview of our 4-step prioritization process is shown in Table 1. In Step 1 (see Ervin et al.[8]), clinician experts from our research team identified key guidelines that included several EBPs, and we searched the literature for related reviews.
Step 1 is reported in our previous paper, in which we describe the 20 EBPs that we collated from the literature review.[8] Step 1 was conducted from July-December 2018. The focus of the current report is on Steps 2 and 3.
In Step 2, clinician researchers on our team (MWS, MNG, TJI, CLH) rated a list of 26 EBPs generated from Step 1 using two Qualtrics surveys. The first survey contained the full set of criteria from the Guideline Implementability Appraisal 2.0 (GLIA) tool, which we show in Table 2.[9]
Table 2
Guideline Implementability Appraisal 2.0 (GLIA) variables and definitions
Variable | Definition |
Measurability | Endpoints or markers are well identified in this EBP to make it easily measurable and computable in the EMR |
Clarity of execution | EBP is clear on how the recommendation should be executed, with "what" and "how" defined, including step-by-step instructions |
Decidability | EBP has high clarity as to under what conditions to perform the EBP (e.g., age, gender, clinical findings, laboratory results) |
Validity | Recommendation highly reflects the intent of the developer and the quality of evidence |
Flexibility | The recommendation permits interpretation and allows for alternatives in its execution |
Effect on process of care | The recommendation can be carried out without substantial disruption of current workflow or significant increased need for resources |
Novelty/innovation | The recommendation proposes behaviors considered new and unconventional by clinicians (or patients) |
Resource intensiveness | The degree to which performing the EBP requires substantial resources |
Clarity of target population | The guideline clearly defines the target patient population |
Source credibility | The organizations and authors who developed the guideline have credibility with the intended audience of the guideline |
Consistency | The recommendation is consistent among other authors in the literature and your understanding of evidence based practice |
EBP = evidence-based practice; EMR = electronic health record |
<Table 2 about here>
In this and subsequent steps, we used radar graphs (Figs. 1–3) to assess responses across all of the GLIA dimensions concurrently. To construct these graphs, we averaged the responses and plotted them on the 11 axes of the GLIA dimensions using Microsoft Excel. Following team discussion of the radar graphs, we removed a total of 15 EBPs from the list, leaving 11. Our primary removal criteria were measurability, resource intensiveness, and source credibility. We conducted Step 2 in February-May 2019.
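The averaging behind these radar graphs can be sketched as follows (a minimal illustration only; the study used Microsoft Excel, and the ratings shown here are hypothetical, not study data):

```python
# Sketch: average survey ratings per GLIA dimension and map each
# dimension to the angle of its axis on an 11-axis radar graph.
# Dimension names follow Table 2; the example ratings are invented.
import math

GLIA_DIMENSIONS = [
    "Measurability", "Clarity of execution", "Decidability", "Validity",
    "Flexibility", "Effect on process of care", "Novelty/innovation",
    "Resource intensiveness", "Clarity of target population",
    "Source credibility", "Consistency",
]

def radar_points(ratings_by_dimension):
    """Return (dimension, mean rating, axis angle in radians) for each
    of the 11 GLIA dimensions, ready to plot on a radar graph."""
    n = len(GLIA_DIMENSIONS)
    points = []
    for i, dim in enumerate(GLIA_DIMENSIONS):
        scores = ratings_by_dimension[dim]
        mean = sum(scores) / len(scores)
        angle = 2 * math.pi * i / n  # one evenly spaced axis per dimension
        points.append((dim, round(mean, 2), angle))
    return points

# Hypothetical ratings from 4 raters for one EBP
example = {dim: [3, 4, 4, 5] for dim in GLIA_DIMENSIONS}
for dim, mean, angle in radar_points(example):
    print(f"{dim}: mean={mean}, axis angle={angle:.2f} rad")
```

Plotting the 11 means at these angles, and connecting them, yields one polygon per EBP, which lets raters compare EBP profiles at a glance.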
In Step 3, frontline clinicians from 8 participating hospital systems evaluated the distilled list of 11 EBPs produced from Step 2 using a substantially shortened survey instrument. We reduced the survey to 3 GLIA elements (measurability, resource intensiveness, and source credibility) because the team clinicians who completed the first 2 rounds of surveys found the full GLIA overly burdensome. We conducted these surveys in June-September 2019.
We surveyed clinicians directly involved in caring for patients receiving invasive mechanical ventilation, including attending physicians, house staff, nurse managers, registered nurses, and respiratory therapists. We used Qualtrics as the survey platform. Clinicians were asked whether we should include each EBP in the final list for implementation (Yes; No; Maybe), and then rated the EBPs on 3 GLIA criteria: measurability, resource intensiveness, and source credibility. Descriptive statistics were calculated in Microsoft Excel; missing responses were handled by pairwise deletion, so each statistic was computed over the responses available for that item.
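The pairwise-deletion approach can be illustrated with a short sketch (the item names and ratings below are hypothetical, not study data; the study itself used Microsoft Excel):

```python
# Sketch: descriptive statistics with pairwise deletion. Each item's
# mean is computed over only the non-missing responses to that item,
# so a clinician who skipped one question still contributes to the
# statistics for every question they did answer.

def item_means(responses):
    """responses: list of dicts mapping item name -> rating or None."""
    items = {k for r in responses for k in r}
    means = {}
    for item in sorted(items):
        vals = [r[item] for r in responses if r.get(item) is not None]
        means[item] = sum(vals) / len(vals) if vals else None
    return means

# Hypothetical ratings from 3 clinicians for one EBP
survey = [
    {"measurability": 4, "resource_intensiveness": 2, "source_credibility": 5},
    {"measurability": 3, "resource_intensiveness": None, "source_credibility": 4},
    {"measurability": 5, "resource_intensiveness": 3, "source_credibility": None},
]
print(item_means(survey))
# measurability averages all 3 responses; the other two items average
# only the 2 non-missing responses each
```

By contrast, listwise deletion would have discarded the second and third clinicians entirely, shrinking the sample for every item.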
The 4th and final step, not addressed in this paper, was to use all of the available data to generate the final list of EBPs on which to focus future implementation efforts. This step required developing metrics for each of the included EBPs using digitally extracted data from electronic health records, which we will report in subsequent papers. We placed heavy emphasis on measurability and on variability in practice within and across ICUs and health systems.
This study was deemed exempt from human subjects oversight by IRBMED at the University of Michigan. Data were gathered throughout 2019.