Data Management: If external builders/contractors are engaged for the database, it is essential to keep an overview of data completeness and quality through diligent monitoring and reporting. Dashboard views available to the steering committee aid in spotting problems as early as possible so that timely rectifications can be implemented. Recruitment targets can be built in, with risk-based triggers for initiating monitoring actions to improve recruitment or increase data completeness. The registry platform should enable a fully automated electronic audit trail to allow traceability of all changes to data entry over time, including by whom they were made.
Data Completion Reporting: We learned that the best option was automated, frequent reporting of data completion for registered patients. Clinics should also be reminded that all data intervals are acceptable; that is, skipping a test interval does not exclude a subject from later analysis as long as baseline data have been collected. Strict oversight of missing data and attrition rates is paramount, enabling the registry to identify issues readily, implement mitigations and deliver on its objectives.
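The kind of automated completeness reporting with risk-based triggers described above can be sketched as follows. The field names, visit schedule and the 70% trigger level are illustrative assumptions for this sketch, not the actual IROS configuration.

```python
from dataclasses import dataclass

# Illustrative record: which scheduled intervals a subject has completed.
@dataclass
class SubjectRecord:
    subject_id: str
    clinic: str
    completed_intervals: set  # e.g. {"baseline", "6m"}

SCHEDULE = ["baseline", "6m", "12m", "24m"]  # hypothetical visit schedule
TRIGGER = 0.70  # hypothetical risk-based threshold for a monitoring action

def completeness_by_clinic(records):
    """Per-clinic completion rate across all scheduled intervals."""
    stats = {}
    for r in records:
        done, total = stats.setdefault(r.clinic, [0, 0])
        stats[r.clinic] = [done + len(r.completed_intervals & set(SCHEDULE)),
                           total + len(SCHEDULE)]
    return {c: done / total for c, (done, total) in stats.items()}

def flag_clinics(records, trigger=TRIGGER):
    """Clinics whose completion rate falls below the trigger level."""
    return sorted(c for c, rate in completeness_by_clinic(records).items()
                  if rate < trigger)
```

A report like this, generated on a schedule, gives the steering committee a dashboard-style view and a shortlist of clinics where monitoring action may be warranted.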
Monitoring: A monitoring plan needs to be in place before data collection begins. It should define monitoring on a site-by-site basis at either all or selected participating sites. Monitoring involves oversight of all administrative and regulatory aspects of participation in the registry. Keep in mind that applying compliance requirements normally associated with clinical investigations may overload a voluntary registry and cause it to fail to deliver. Monitoring, which can be done on site, remotely or as a combination of both, should occur at regular intervals throughout the life of the registry [Gliklich 2020]. The budget impact of monitoring activities should not be underestimated.
Training: Upfront planning of an appropriate mode of training that can readily reach all contributors is key. Often this may require the availability of online training platforms and specific guidelines in different languages, as contributors will be widely distributed nationally and internationally. Local visits by sponsor representatives, as well as regional question-and-answer webinars, will help to confirm the efficiency of data entry and follow-up. Introduce the relevance of the registry along with the registry protocol and its procedures. Describe the outcomes and the potential opportunities for their reporting and publication, and explain why complete datasets are more powerful and remain essential to the goal of the registry [Zaletel 2015]. Regular newsletters, summaries of reports and publications, yearly investigator meetings and refresher training sessions can add to a sense of shared responsibility in the study group and help increase compliance with data provision and procedures. Training records should be collected to maintain oversight.
Personnel on- and off-boarding: Registries running over a long timeframe will see staff changes at participating sites as well as with the sponsor. Clear processes for contributor on- and off-boarding are required, including creation and closure of accounts, onboarding training and the respective documentation. A part of the off-boarding process must be a critical check that all necessary documentation (e.g. CVs, training logs) has been filed, as retrospective requests may not be possible.
Financial commitment: Registries may suffer from insufficient funding, especially if they run over a long period of time. Sponsoring a registry requires the far-reaching vision to see the project through to delivery on its objectives and to acknowledge the high costs involved [Berettini 2011]. It is, therefore, important to have regular progress reviews with the sponsors, ensuring the registry is delivering on their expectations [Gliklich 2020].
External and Internal Audits: Audits are concerned with checking the overall study conduct at the sponsor level, ensuring adequate study oversight and documentation according to the sponsor’s Standard Operating Procedures (SOPs), which in turn should now be compliant with ISO 14155:2020. They are therefore more associated with quality assurance [Ravi 2018, Maddock 2007]. These audits ultimately required much more oversight than anticipated for IROS. Internal audits are voluntary, initiated by the sponsor and possibly conducted by a qualified external consultant. Audits are valuable in informing the sponsor of potential compliance weaknesses at any given time, allowing them to be addressed promptly. This can prevent more drastic actions resulting from an unannounced (external) audit, for example by a Notified Body.
Data Analysis
Statistics: Statistical analyses should be planned and described in the planning phase. Keep in mind that a long-term registry acquires large amounts of data, potentially allowing stratification by patient profile. Registries offer an opportunity to mine the data for analyses of different subgroups, provided adequate numbers are accrued. It is, however, essential to first address the primary goals of the registry, while remaining open to interesting and valuable new information that may be revealed. Non-inferential data ‘analytics’ can be useful, but high-quality statistical analyses should be the main aim; note that these can be time-consuming and expensive.
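The stratification step described above can be illustrated schematically: grouping an outcome measure by a patient-profile attribute and reporting the size of each subgroup, so that underpowered strata are visible before any inferential analysis is attempted. The attribute and field names here are illustrative assumptions.

```python
from collections import defaultdict
from statistics import mean

def stratify_outcomes(records, profile_key):
    """Group an outcome measure by a patient-profile attribute (e.g. an
    age band) and report subgroup size and mean, making it easy to see
    which strata have accrued adequate numbers for formal analysis."""
    groups = defaultdict(list)
    for r in records:
        groups[r[profile_key]].append(r["outcome"])
    return {k: {"n": len(v), "mean": mean(v)} for k, v in groups.items()}
```

Reporting `n` alongside each subgroup summary is the simple safeguard: subgroups below a pre-specified size would be flagged as descriptive only.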
Measures of Success: These are concerned with making projections of how the data will unfold and with guiding the planning of monitoring activities. As progress is checked against the specifications (goals) of the protocol, it is important to interact with each centre to address deficiencies and data problems, and then to make adjustments promptly. For Cochlear, the measure of success was primarily the capture and reporting of sufficient patient-related longitudinal outcome data from a broad population of implant recipients to represent benefits in the real world. This enabled further analysis and interpretation of treatment benefits for the various patient subgroups involved.
Missing data: It is not possible to mandate data entry for a commercial patient registry; therefore, it is wise to anticipate and account for a greater attrition rate than in rigorous clinical trials [Gomes 2017]. This can be factored in when considering the end points of the registry. Rigorous statistical treatment of data sets comparing enrolled individuals who are followed up with those lost to follow-up should be performed [Lenarz 2017, James 2021]. It is also prudent to account for aspects that may influence or skew interpretation of the data in a voluntary registry, such as how a registrant was recruited, the guidance provided to subjects on how to complete forms and questionnaires, and by whom and when evaluations were performed.
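As a minimal illustration of the followed-up versus lost-to-follow-up comparison, the sketch below computes the attrition rate and the baseline mean of each subgroup; a marked difference between the two means is a first indication that attrition is selective. The field names and the use of a simple mean are illustrative assumptions: a real analysis would apply appropriate statistical tests and models.

```python
from statistics import mean

def attrition_rate(subjects):
    """Fraction of enrolled subjects lost to follow-up."""
    lost = sum(1 for s in subjects if not s["followed_up"])
    return lost / len(subjects)

def baseline_mean(subjects, followed_up):
    """Mean baseline score for the retained (True) or lost-to-follow-up
    (False) subgroup, as a first check on whether attrition is selective."""
    return mean(s["baseline_score"] for s in subjects
                if s["followed_up"] == followed_up)
```

Because baseline data are collected for every enrolled subject, this comparison remains possible even for subjects who never return for follow-up.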
Data Access: In addition to the statistical analysis strategy driven by the registry sponsors, contributors need to have access to their own data in a comprehensive form and format [Gliklich 2020]. This may include raw data access in a compatible form that can be used with common statistical packages. However, our experience revealed that, when requested, provision of data reports that investigators could use in their own locally generated reports and presentations was greatly appreciated.
Closing the registry
Closure timing: A registry should have a defined lifespan, with trigger points for reappraisal of the needs and conditions for continuation. In the case of IROS, we chose to close the registry for a number of reasons, including changes in regulations and related documentation requirements; a decline in patient recruitment and follow-up; and changing requirements for PMCF.
Database closure: Clinics need to be informed of the pending closure of the database in good time. This contrasts with the general rule for planned closure of clinical studies, where all subjects would be required to complete evaluations according to the clinical protocol. Initially, the database should be frozen to allow outstanding data queries to be answered before the database is closed.
End-of-study report: It is good practice to produce a final report which describes and analyses all the data collected according to the original registry plan. In some regulatory environments the registry may have been granted approval as a clinical study and, thus, the required Clinical Investigation Report (CIR) will need to be filed within a prescribed frame (e.g. one year) after closure. Notifications and reports to approving Ethical Committees will be required at study closure.
Publication: Some aspects of the registry data may be of more interest than others for publication; however, as with clinical studies, publishing the primary and secondary outcomes from the registry for all or selected patient groups, as far as possible, is advisable, both for transparency and to share the learnings and knowledge gained. In addition, summary reports describing the registry findings may need to be completed where the registry has been listed on public clinical trial portals.
Outcomes
Alongside considerable learning about the conduct of a company-sponsored registry, the IROS registry was considered a success, achieving its goal of providing a view of real-world benefits from hearing implant treatment.
The IROS registry succeeded in enrolling a large cohort of users of an implantable hearing solution, in particular CIs, who provided longitudinal real-world data. These data were utilized at various intervals to generate a wide range of local and international conference posters and presentations, as well as articles in peer-reviewed journals. Scientific findings for CIs from the IROS registry have been published in several papers [Czerniejewska-Wolska 2015, Lenarz 2017, Wyss 2019, Müller 2021, James 2021]. Pooled data could identify outcome trends as well as provide input for the design of more robust comparative studies.
Furthermore, the registry serves as an example of a large prospective observational study design yielding meta-analysis possibilities. The available data from adult CI users were used to investigate the potential to reduce the number of SSQ49 questions administered while maintaining the required sensitivity to changes in self-reported hearing benefits over time. A statistical comparison was performed between the outcomes from the administered long-version SSQ49 and the extracted subset of 12 questions forming the SSQ12 [Noble 2013]. The analysis confirmed clinical equivalence between the long and short versions of the questionnaire and thus a potential to save response time [Wyss 2019]. Consequently, the SSQ12 is now used as standard by Cochlear for studies and evaluations in the field of implantable hearing solutions.
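The subset comparison described above can be illustrated schematically: for each subject, the mean score over all items is set against the mean over the 12-item subset, and the paired per-subject scores are then compared statistically. The item indices below are illustrative placeholders, not the actual SSQ12 mapping, and the published analysis used formal equivalence testing rather than this simple summary.

```python
from statistics import mean

# Hypothetical 1-based indices of a 12-item subset within a 49-item
# questionnaire; NOT the actual SSQ12 item mapping.
SUBSET_ITEMS = [1, 5, 9, 13, 17, 21, 25, 29, 33, 37, 41, 45]

def full_and_subset_scores(responses):
    """Per-subject mean score over all 49 items versus the mean over the
    12-item subset; each element of `responses` is one subject's 49 item
    scores. The two score lists are paired by subject for later comparison."""
    full = [mean(items) for items in responses]
    short = [mean(items[i - 1] for i in SUBSET_ITEMS) for items in responses]
    return full, short
```

Close agreement between the paired full and subset scores across subjects is the pattern that would support replacing the long questionnaire with the short one.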