Reliable data collection is necessary to accurately assess outcomes but has been a major challenge at many PR-COIN sites, including ours. Aided by quality improvement processes, we achieved our aim of increasing CDE documentation for arthritis pain, PtGA, PrGA, and AJC by first characterizing our data collection and then focusing on provider buy-in, frequent reminders, and individualized feedback. Data characterization also demonstrated that CDEs were captured significantly less often during virtual than in-person visits; however, because virtual visits declined sharply over time, we grouped visit types together when assessing our improvements, which limited our ability to evaluate improvement for each visit type separately.
Documentation of PROs improved several weeks before provider-assessed measures did, even though we initially targeted arthritis pain and PrGA. We postulate that multiple factors contributed to these differences, including the use of an intake form for PROs, the appearance and position of CDEs on the SmartForm, the time commitment, and the provider's level of comfort with each CDE. Like the majority (84%) of PR-COIN sites, we use paper intake forms to collect PROs during in-person visits (9). Parallel improvements were observed for arthritis pain and PtGA, which we hypothesize was due to the use of an intake form and the location of these PROs at the top of the SmartForm. The difference in time required to obtain two PROs rather than one is minimal, which likely explains why PtGA improved in parallel with arthritis pain despite our not targeting PtGA initially.
Improvement in collection of PrGA did not occur until week 45, despite our targeting it at week 13. Providers disclosed having no formal education on how to rate a PrGA and discomfort with assessing PrGA virtually. The lack of interrater concordance in PrGA scoring has been demonstrated previously (12) and highlights the need for systematic training and well-defined guides for rating PrGA. We postulate that such an intervention would yield not only more reliable data collection but also a more accurate assessment of the patient's clinical status. Additionally, PrGA is located at the bottom of the SmartForm, which requires scrolling and may have contributed to lower data capture initially.
Documentation of AJC improved promptly after we set an aim to improve its collection, coinciding with the decline in virtual visit frequency. Barriers to AJC capture cited by providers included redundancy and the extra time required to count active joints, since provider notes already contained more descriptive joint findings.
Provider interest in personalized feedback on patient outcomes, specifically patient disease activity scores, rather than feedback on their own data-capture rates, appeared to motivate documentation. The cJADAS10 requires collection of PtGA, PrGA, and AJC for each patient. The proposal to collect and report feedback on patient disease activity during M3 resulted in centerline shifts to > 90% for all CDEs.
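To illustrate why all three provider-dependent CDEs must be documented before disease activity feedback can be reported, the dependency can be sketched in code. This is a minimal sketch, assuming the standard cJADAS10 definition (provider global assessment 0-10, plus patient/parent global assessment 0-10, plus active joint count capped at 10); the function name and the handling of missing values are illustrative, not part of our workflow.

```python
from typing import Optional


def cjadas10(prga: Optional[float],
             ptga: Optional[float],
             ajc: Optional[int]) -> Optional[float]:
    """Compute the clinical Juvenile Arthritis Disease Activity Score.

    Returns None when any core data element is undocumented: the
    composite score cannot be reported unless PrGA, PtGA, and AJC
    were all captured at the visit.
    """
    if prga is None or ptga is None or ajc is None:
        return None  # incomplete documentation -> no score
    return prga + ptga + min(ajc, 10)  # AJC contributes at most 10 points
```

A single missing CDE therefore makes the visit unusable for disease activity feedback, which is consistent with the motivating effect we observed once score-based feedback was proposed.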
Limitations
Generalizability is a limitation because institutions differ in processes, resources, EMRs, and virtual visit frequencies. For example, the SmartForm was our major tool for tracking CDEs, and not all sites have the same EMR system or SmartForm. Additionally, resource differences, such as paper versus electronic intake forms, would likely contribute to differences in data collection. Even though these processes may differ, the concepts of tracking data and providing individual feedback can be generalized.
Another limitation was the decline in virtual visit numbers over time, which may have skewed comparisons of virtual versus in-person data collection. We attempted to compensate for this by splitting the data into phases.
Future directions
Reliance on manual chart review was key to this project's success, but in the longer term a more automated process is needed. Ideally, electronic questionnaires would input PROs directly into the SmartForm; such a platform for data entry would likely improve PRO capture for both in-person and virtual visits. The SmartForm is intended to facilitate extractable data, which is pushed into the PR-COIN Registry, a centralized database whose output is similar to the feedback reports we have been generating manually. Feedback reports created by PR-COIN and presented to individual sites could facilitate improved data entry for all metrics, including provider-assessed measures. Systematic training for PrGA and validation of virtual AJC assessment are also necessary next steps to improve both documentation and data validity.