Program Status
This program evaluation was approved by the VA Central Institutional Review Board (C-IRB E19-05) on September 23, 2019 and by the VA Tennessee Valley Healthcare System Research & Development Committee on November 21, 2019. The VA Organizational Assessment Subcommittee approved the study on October 1, 2019, and the VA Office of Labor and Management Relations (national union approval) approved the study on January 27, 2020. This report was approved by the VA Office of Rural Health on March 17, 2020.
Aim
The aim of this program evaluation is to develop and report lung cancer screening workflow and clinical outcome measures, as well as contextual factors that influence implementation, at sites participating in VA-PALS.
Implementation Strategies
Each VAMC participating in VA-PALS has a designated site champion and will receive the following: (1) a full-time lung cancer screening navigator (NP or PA) to support program development and coordinate screening-related care, (2) a comprehensive, open-source lung cancer screening management software system,33, 35 (3) training for navigators and radiologists, (4) expert guidance to standardize screening protocol adherence and LDCT imaging quality, and (5) a support network led by experienced screening leaders.
Decisions regarding each site’s lung cancer screening program structure and workflow will be made at the local level depending on the environment, available resources, and culture. As such, sites will be able to adapt their processes over time for different reasons (e.g., contextual pressures, leadership support, trial and error). These changes will be measured to enhance our understanding of how naturalistic implementation influences processes and the impact these changes may have on overcoming barriers.
Logic Model (Figure 2)
Inputs - People, Technology
People. VA-PALS will be overseen by an inter-professional leadership team of radiologists, a pulmonologist, a radiation oncologist, a screening navigator, and technology specialists at the Atlanta VA Medical Center, Phoenix VA Health Care System, Early Lung and Cardiac Action Program at Mount Sinai Health System (MS-ELCAP), and Paraxial LLC with expertise in lung cancer screening, treatment, research, and healthcare computer science.35 The software team comprises software developers with experience in VA networks and informatics experts at MS-ELCAP. The VA-PALS program evaluation team includes health services researchers at the VA Tennessee Valley Healthcare System Geriatric Research, Education and Clinical Center and Vanderbilt University Medical Center working together with the VA-PALS leadership team and participating VAMCs. Participating VAMCs have or will have local teams that include a site champion, program director(s), screening program navigator, clinical providers, and software engineers.
Technology. The ELCAP Management System was designed to manage and organize lung cancer screening activities and has been in use since 1992.35 This software system has been used at 82 institutions in 10 countries and is a tool for navigators and radiologists to document, organize, and manage screening-related activities from initial contact through screening follow-up, evaluation of screening findings, and treatment of those diagnosed with lung cancer.2, 4, 35 This software also serves as a database for improvement initiatives.35 Other technological inputs include the CT scanners for screening examinations and small-nodule phantoms for quality assurance of the CT scanning protocols.35
A phantom is a specially designed object that can be scanned by a CT scanner and provides information about the scanner’s performance and scanning parameters, which can be used to optimize images for CT screening.
Activities
Support Network Activities. The leadership team will provide guidance and support for each VA-PALS program through planned activities and site monitoring. It will provide standardized training for all newly hired navigators and radiologists at each VA-PALS site, along with a support network for the navigators, site champions, and program directors through regular conference calls, national workshops, and an operational guide to lung cancer screening. The leadership team will also manage a website (www.va-pals.org) to disseminate information and share resources.
Quality Assurance Activities. The software team will translate the ELCAP Management System into the open-source VAPALS-ELCAP Management System with guidance from the MS-ELCAP and leadership teams. The software documents enrollment, screenings, follow-up, interventions, and treatment for those diagnosed with lung cancer. It also provides management and quality assurance reports of screening protocol adherence and follow-up. The VAPALS-ELCAP Management System will then be integrated with the Veterans Health Information Systems and Technology Architecture (VistA) and VHA’s electronic medical record, the Computerized Patient Record System (CPRS).35 It will be tested and piloted at the Phoenix VAMC prior to dissemination to the other participating VA-PALS VAMCs. The leadership team will provide each VAMC with an imaging phantom, and the software team will provide feedback on each scanner’s imaging quality.35 Phantoms will measure imaging sensitivity, dynamic range, contrast/detail detectability, and spatial resolution. This calibration tool will ensure proper parameters are used for LDCT imaging to assure high-quality assessment of lung nodules.35 Local teams at each VAMC will use the VAPALS-ELCAP Management System, its quality assurance reports, and the CT phantom assurance tools for program activities, adherence to the screening protocol, standardization of imaging parameters, and improvement initiatives.
Training Activities. Navigators will be trained by the MS-ELCAP and leadership teams on screening eligibility, shared decision-making, use of the management system, and coordination of screening-related activities. Radiologists will also be trained in the workflow and use of the VAPALS-ELCAP Management System.
Local Lung Cancer Screening Activities. Each site director will be tasked with hiring a lung cancer screening navigator (NP or PA), engaging their local stakeholders and administrators, developing a screening workflow, and educating local providers and staff about the program. Implementation of the lung cancer screening program will include the following activities: identify high-risk Veterans eligible for screening, perform shared decision-making, offer resources and/or treatment for smoking cessation, perform the LDCT scan, ensure structured radiology reporting, facilitate any indicated additional workup, and schedule either a repeat annual LDCT or workup of findings suspicious for lung cancer, as appropriate.
Data Collection Activities. The primary clinical and implementation outcome measures were developed by the VA-PALS leadership team together with the program evaluation team, as part of the leadership team’s continued overall responsibility. The combined teams will work together to collect data from the management system, VHA’s Corporate Data Warehouse (CDW), surveys, and interviews, will analyze results, and will share the data with the larger VA-PALS network. The program evaluation team will also participate in leadership and program meetings.
Outcomes
Short-term Outcomes: The short-term outcomes consist of VA-PALS’s reach, adoption, effectiveness, and implementation measures to be used within the first 24 months of the program, described in detail below and in Table 1.
Long-term Outcomes: The long-term outcomes are the reach, implementation, effectiveness, and maintenance measures to be used beyond 24 months after the start of the program, described in detail below and in Table 1.
Target Population. The target population will include Veterans eligible for lung cancer screening, providers and staff involved in screening activities, and leadership at each VA-PALS site. Veterans eligible for lung cancer screening generally include those meeting the USPSTF criteria12; at least one site is using a broader eligibility recommendation.37 Providers involved in screening activities will include primary care providers (physicians, advanced practice providers, and physicians-in-training), radiology providers, and navigators for each program. Staff involved will include primary care nurses, schedulers, clinical application coordinators (CACs), and radiology technicians/technologists. Leadership will include the local program leadership as well as service leaders within primary care, radiology, and specialty services, and the executive leadership team.
Theoretical Frameworks
a. Implementation Framework
Implementation of an evidence-based practice into healthcare delivery depends upon multiple internal and external contextual factors that ultimately influence the processes and outcomes of healthcare delivery. We selected the Consolidated Framework for Implementation Research (CFIR) to guide this program evaluation based on this framework’s ability to provide structure for approaching the implementation of a complex, inter-professional program in a pragmatic manner.38 The CFIR explores the characteristics of an evidence-based practice, the inner and outer settings where the evidence-based practice is deployed, the characteristics of the individuals interacting with the evidence-based practice, and the process of implementing the evidence-based practice.38 This mixed-methods program evaluation measures key elements of the five major CFIR domains: (1) the characteristics of the evidence-based practice (image acquisition and CT standardization), (2) the inner setting of each VA-PALS site (organizational readiness, resources), (3) the outer setting of VHA as a whole (policies), (4) the characteristics of individuals involved in the screening process (knowledge and beliefs, self-efficacy, and motivation), and (5) the process of implementation at each VA-PALS site (site process maps for implementation initially and adaptations over time) (Table 1). Thoroughly exploring each of these domains will inform future implementation strategies (Figure 3).38, 39
b. Evaluation Frameworks
The Reach, Effectiveness, Adoption, Implementation, Maintenance (RE-AIM) framework was used to develop the program evaluation measures.40 This evaluation framework has been used in the cancer screening setting41 and provides a comprehensive structure to assess real-world clinical programs. VA-PALS evaluation measures reflect the activities at each VAMC and the clinical and process outcomes of each lung cancer screening program. For each site, the date of hire of the navigator will be considered the program start date. Evaluation of all measures of reach, effectiveness, and maintenance will begin at the program start date and continue for the next 36 months. The VA-PALS leadership and evaluation teams will compare measures (e.g., number of Veterans reached, screening utilization) over time in VA-PALS and non-VA-PALS VAMCs in an interrupted time series analysis. Further, we will compare outcomes in months 1-24 to months 25-36 to evaluate sustainability in VA-PALS and non-VA-PALS VAMCs (Table 1).
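To illustrate the planned comparison, the following is a minimal sketch of a segmented-regression interrupted time series model; the file name, column names, and use of ordinary least squares are illustrative assumptions only, since the protocol does not specify the modeling software or covariates.

```python
# Minimal segmented-regression sketch for the interrupted time series analysis.
# Assumptions (illustrative only): a CSV with one row per site-month containing
#   month       - months relative to the navigator hire date (negative = pre-implementation)
#   va_pals     - 1 for VA-PALS VAMCs, 0 for comparison VAMCs
#   ldct_count  - number of initial LDCT screening scans that month
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("monthly_ldct_counts.csv")  # hypothetical data extract

df["post"] = (df["month"] > 0).astype(int)       # post-implementation indicator
df["months_since"] = df["month"].clip(lower=0)   # slope change after implementation

# Baseline trend, level change, and slope change, each interacted with the
# VA-PALS indicator to compare VA-PALS and non-VA-PALS facilities.
model = smf.ols(
    "ldct_count ~ month + post + months_since"
    " + va_pals + va_pals:post + va_pals:months_since",
    data=df,
).fit()
print(model.summary())
```

In practice, a count model (e.g., Poisson or negative binomial) with standard errors clustered by facility would likely be preferred for monthly scan counts; the ordinary least squares sketch above is intended only to show the segmented-regression structure.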
Reach
The management system will capture descriptive information (e.g., age, smoking history, rural status) and method of contact for screening (referral from primary care, specialty care, or smoking cessation; outreach; or self-referral). In order to have a consistent denominator of the eligible population across all sites, we will estimate the number of Veterans potentially eligible for lung cancer screening according to the USPSTF criteria. As done previously,32 we will obtain counts of Veterans aged 55 through 80 seen at each VAMC each year. To estimate the number of age-appropriate Veterans with an eligible smoking history, we will multiply the age-appropriate population by 32% for each VAMC in each year; thirty-two percent was the percentage of age-appropriate Veterans who met the smoking eligibility criteria in the VHA’s Clinical Lung Cancer Screening Demonstration Project.26 The reach measures will capture the number of eligible Veterans at each stage of the process, including Veterans approached for screening, eligibility, agreement and refusal both before and after the shared decision-making process, and unique Veterans completing initial screens (Figure 4).
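As a concrete illustration of this denominator estimate, the sketch below applies the 32% multiplier to yearly counts of age-appropriate Veterans; the file and column names are hypothetical, since the protocol does not specify how the counts will be extracted.

```python
# Sketch of the estimated screening-eligible denominator for each VAMC and year.
# Assumptions (illustrative only): a CSV with columns
#   vamc, year, veterans_55_80  - unique Veterans aged 55-80 seen at that VAMC that year
import pandas as pd

SMOKING_ELIGIBLE_FRACTION = 0.32  # share meeting smoking criteria in the VHA demonstration project

counts = pd.read_csv("vamc_age_eligible_counts.csv")  # hypothetical data extract
counts["estimated_eligible"] = (counts["veterans_55_80"] * SMOKING_ELIGIBLE_FRACTION).round()

# Reach at a given stage = Veterans captured at that stage (from the VAPALS-ELCAP
# Management System) divided by counts["estimated_eligible"] for the same VAMC and year.
```

For example, a VAMC that sees 20,000 Veterans aged 55-80 in a year would have an estimated screening-eligible population of roughly 6,400.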
Effectiveness
The effectiveness of the program will be evaluated using data acquired from VHA’s Corporate Data Warehouse (CDW) and the VAPALS-ELCAP Management System. The measures chosen reflect the results of screening, the frequency and type of follow-up procedures performed, the number of cancers detected, the type and stage of cancers, treatment received, other abnormal findings not suspicious for lung cancer and their follow-up (e.g., cardiovascular disease, COPD),42 survival of screened Veterans, the number of smoking cessation quit attempts,14, 16 the type of smoking cessation services provided, wait times for services provided, and the overall experiences of those involved in the screening process (see Qualitative Section) (Table 1). We will compare different measures of effectiveness (number of LDCT scans, cancers diagnosed, stage at time of diagnosis, etc.) before and after implementation of the navigator and VAPALS-ELCAP Management System using an interrupted time series analysis. Other measures of effectiveness (wait times for services provided, workflow process maps, and overall experiences of those involved in the screening process) will also be tracked over time using the CDW, process mapping, and surveys.
Adoption
Adoption will be assessed by describing the characteristics of the VAMC facilities participating in VA-PALS. Organizational readiness for change will be measured using a validated survey tool to assess change valence, change commitment, and change efficacy34, 43 to understand each VAMC’s inner setting (Table 1). Organizational readiness for change is defined as the “extent to which organizational members are psychologically and behaviorally prepared to implement organizational change.”44 Change valence is defined as organizational members’ belief that pursuing change is beneficial and valuable to the organization.45 To describe the program implementation process, we will obtain the following milestones through interviews with site and program leadership: (1) the first date the navigator position was posted for hire by human resources, (2) the navigator’s first day of clinical work, (3) the date of navigator training, (4) the date of installation of the VAPALS-ELCAP Management System, and (5) the dates of imaging phantom use and CT scanner standardization. The number of unique providers and the types of providers (specialty; physician, advanced practice provider) referring to the program within the first 24 months after hiring of the navigator will be captured in the CDW and the VAPALS-ELCAP Management System. Finally, to describe adoption at the site level, we will capture alignment with VHA priorities through in-depth interviews of program leadership at each site.
Implementation
Implementation of lung cancer screening programs at the 10 VA-PALS sites will be captured via local program engagement to assess fidelity of the program (Table 1). Specifically, fidelity refers to the degree to which navigators, the management system, trainings, screening protocol adherence, and imaging phantoms are utilized at each site. Measures of fidelity will include the retention time of navigators, navigator workload, the number of radiologists who use the management system, the degree to which navigators use the management system for screening-related activities (approaching/enrolling Veterans, writing notes, following up screening results, managing downstream follow-up, etc.), and the dates of phantom use and image standardization (Table 1). Measurement of each VAMC’s inner setting will assess team characteristics, the processes involved in screening, and barriers and facilitators. We will assess how smoking cessation is incorporated into each local program (who performs it, the type of services performed, and when it is performed within screening), how shared decision-making discussions are performed, and how and when results are disseminated to patients and referring providers (see section on Process Maps below). These measures will be obtained primarily through interviews with sites and supplemented with data from the VAPALS-ELCAP Management System (Table 1).
Maintenance
We will measure the extent to which VA-PALS is sustainable over time using a combination of data collected from the VAPALS-ELCAP Management System and interviews with local program leadership. Maintenance measurements will consist of: the number of subsequent screenings, screening workflow adaptations during months 25-36, the number of unique Veterans enrolled in each program during months 25-36, the number of unique providers who refer Veterans to each program during months 25-36, navigator retention, and sustained leadership buy-in (Table 1). We will also continue to track radiologists’ and lung cancer screening navigators’ continued use of the VAPALS-ELCAP Management System per the fidelity measures described in the Implementation section above.
Process Maps and Adaptations
Each VA-PALS site will describe its initial clinical workflow processes, which will be used to generate a process map at the start of implementation. These process maps will be generated via telephone interviews with each site’s lung cancer screening navigator. Each map documents the site’s method of conducting patient identification, eligibility confirmation, smoking cessation counseling, shared decision-making, appointment scheduling, the LDCT appointment, communication of results, and follow-up scans. Each site will be contacted every 6 months to review its process map and delineate workflow adaptations as implementation matures. Using the Framework for Reporting Adaptations and Modifications to Evidence-Based Interventions (FRAME),46 we seek to understand how and why processes are adapted as well as how these adaptations influence implementation outcomes over time. Adaptations will be noted across the following FRAME domains: 1) when and how modifications were made over the course of implementation, 2) whether the changes were planned and proactive (e.g., intentional adaptation) or unplanned and reactive (e.g., in response to other forces), 3) who decided the changes were necessary, 4) what was modified, 5) at what level of the delivery system the changes were made, 6) what types of changes were made in content or context, and 7) the reasons modifications were made and the contextual drivers at play. Once process maps are developed or revised, they will be reviewed with each site for accuracy. Adaptations will be correlated with implementation outcomes (workflow, barriers, and facilitators).
Qualitative Evaluation
Qualitative data will supplement the quantitative measures to identify elements that were successfully implemented and elements that posed challenges at each site (Table 2). In-depth interviews with providers, staff, and leadership at five sites with different process maps will be used to assess barriers to and facilitators of the main factors related to implementation. Key questions asked will be: (1) “Tell me about what role you currently have in the lung cancer screening program at your VA” (CFIR individual/team characteristics); (2) “What is difficult about implementing a lung cancer screening program?” (RE-AIM implementation); (3) “How well does lung cancer screening fit with existing processes and practices in your VA?” (inner setting, implementation climate, compatibility); (4) “Are meetings, such as staff meetings, held regularly to discuss work processes and practices such as lung cancer screening?” (inner setting – network and communication); (5) “What could help you to continue to perform high-quality LDCT screening in your VA?” (RE-AIM maintenance). Targeted participants for interviews include primary care and radiology providers, staff, and leaders. All interviews will be audio-recorded and transcribed for subsequent thematic analysis, a standardized approach that produces valid and reliable interpretations. All transcripts will be coded to establish a hierarchical coding system that categorizes comments contextually into themes that occur across multiple interviewees.47, 48