Do German University Medical Centres promote robust and transparent research? – A cross-sectional study of institutional policies.

DOI: https://doi.org/10.21203/rs.3.rs-871675/v1

Abstract

Background

In light of replication and translational failures, biomedical research practices have recently come under scrutiny. Experts have pointed out that the current incentive structures at research institutions do not sufficiently incentivise researchers to invest in robustness and transparency but instead reward them for optimising their fitness in the struggle for publications and grants. This cross-sectional study aimed to describe whether and how relevant policies of university medical centres in Germany support the robust and transparent conduct of research and how prevalent traditional metrics are.

Methods

For 38 German university medical centres, we searched for institutional policies for academic degrees and academic appointments, as well as websites for their core facilities and research in general. We screened the documents for mentions of indicators of robust and transparent research and for mentions of more traditional metrics of career progression.

Results

While Open Access was mentioned in 16% of PhD regulations, other indicators of robust and transparent research (study registration; reporting of results; sharing of data, code, and protocols; and robustness) were mentioned in less than 10% of institutional policies for academic degrees and academic appointments. These indicators were more frequently mentioned on the core facility and general research websites. Regarding the traditional metrics, the institutional policies for academic degrees and academic appointments had frequent mentions of the number of publications, grant money, impact factors, and authorship order.

Conclusions

References to robust and transparent research practices are, with a few exceptions, generally uncommon in institutional policies at German university medical centres, while traditional criteria for academic promotion and tenure still prevail.

Background

For more than a decade, the robustness and transparency of biomedical research practices have been discussed broadly and in an increasingly politicised manner (1,2). It has been pointed out that the field fundamentally lacks transparency, with study reports, research protocols, or participant data often not publicly accessible, and many research findings not being published at all (3). If findings are published, they often lack sufficient detail and suffer from selective reporting of outcomes (4). In addition, authors have pointed to flaws in biomedical study design and statistical analyses (5–7). These problems have been suggested to lead to the low reproducibility of biomedical research (2,8,9) and, more broadly, to contribute to research waste.

With the aim of “increasing value, reducing waste” (10), several solutions have been proposed to address the aforementioned issues. One of them is the call for more transparent, or “open,” science, i.e., reporting all research results in a timely manner and in line with established reporting guidelines; publishing without paywalls (open access); sharing data, code and protocols; and registering study protocols before data collection. In addition, new publication formats, namely, preprints and registered reports, have been established. All of these procedures are supposed to provide a less biased evidence base, to limit researchers’ degrees of freedom in the conduct and analysis of research and to allow other researchers to scrutinise the results, all of which increase trust in science (11). In addition, calls to increase the robustness of science have become more prevalent, for example, by asking and supporting researchers to choose adequately large samples, to appropriately randomise participants and to perform blinding of subjects, experimenters, and outcome assessors (2,8,12,13).

To date, the uptake of these practices has been slow (14–18). Many have pointed out that the current incentive structures at research institutions do not sufficiently incentivise researchers to invest in robustness and transparency but instead reward them for optimising their fitness in the struggle for publications and grants (19–22). To receive promotion and ultimately tenure, researchers are evaluated primarily based on how many journal articles (in journals with high impact factors) they publish and how much grant money they secure (20). While it has been argued that all stakeholder groups, including funders and journals, have to contribute (4,23) to an incentive system that actually rewards robust and transparent research practices, the role and influence of the research institutions has thus far been less prominently discussed (8).

Since research institutions define the requirements for academic degrees, academic appointments, and available intramural funding, their policies and regulations could actually have a strong impact on researchers’ capability, opportunity and motivation to apply robust and transparent research practices in their work. With regard to university policies, some changes have already been proposed, many of which focus on transparent practices. The signatories of the San Francisco Declaration on Research Assessment (DORA) call for institutions to clearly highlight “that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published” (24). More specifically, Moher et al. (25) suggest that rewards, incentives and performance metrics at institutions should align with the full dissemination of research, reuse of original datasets, and more complete reporting, i.e., the sharing of protocols, code, and data, as well as preregistration of research (see also the publications by the League of European Research Universities (26) and others (4,27–30)). Mejlgaard et al. (31) propose that institutions should incentivise making data findable, accessible, interoperable, and reusable (FAIR) (32). Begley et al. (8) suggest similar rules for academic degrees and academic appointments but with regard to the robustness of the research. These authors also demand that the use of reporting guidelines, such as the ARRIVE guidelines (33) or the CONSORT guidelines (34), should be mandated by institutions.

In Germany, the German Research Foundation (DFG) recently reworked its overarching Good Scientific Practice Guidelines (35), which for the first time include regulations addressing the robustness, transparency, and reporting of research. These guidelines are not just recommendations; they must be ratified by July 2022 by all universities that want to receive DFG funding.

Additionally, core facilities such as clinical research units and animal research facilities provide centralised services for the conduct of clinical or animal studies (this includes animal protection and research according to the so-called “3R” principles: Replace, Reduce, Refine (36)). These core facilities could have additional influence (37), for example, by recommending that researchers report their results in a timely and nonselective way or by requiring researchers to adhere to established reporting guidelines.

Studying the uptake of the aforementioned recommendations in institutional policies could inform areas for improvement in policymaking at universities. To our knowledge, however, only one study (38) has dealt with this issue, sampling the biomedical faculties of 170 universities worldwide and searching their criteria for promotion and tenure. The authors report that mentions of traditional criteria of research evaluation were very frequent, while mentions of robust and transparent research practices were rare.

In this cross-sectional study, we aim to describe whether and how relevant policies of university medical centres (UMCs) in Germany support the robust and transparent conduct of research and how prevalent traditional metrics of career progression are. We chose to investigate only German UMCs, as they are currently in the process of implementing the requirements set by the DFG. This focus on Germany also ensures better comparability of the institutions, as different countries have different regulatory environments, different curricula for medical studies, and different frameworks for postgraduate degrees. It also allows us to perform an in-depth data collection of German-language documents.

Methods

A detailed methodology is described in our preregistered study protocol, which is available here: https://osf.io/wu69s/ (including a list of protocol amendments and deviations). The following section provides a summary of the methods, which are reported in accordance with the STROBE (39) guidelines.

Sampling and Search Strategy

We obtained a list of all German medical faculties from the website of the German medical faculty council (’Medizinischer Fakultätentag’). For each of the 38 faculties (as of December 2020), we performed a manual search of their websites between 14 December 2020 and 12 February 2021. We searched both the websites of the medical faculties and the websites of the associated university hospitals, looking for the sources presented in Table 1.

Regarding the PhD and Habilitation regulations and the application forms and procedural guidelines for tenure, we saved all related policy documents. Regarding the websites of clinical research units, websites of animal research facilities, 3R centres and animal protection offices, and the general research websites, we first went through each website in detail (including all subpages), saving only those websites and documents that contained any mention of one of the indicators summarised in Table 2.

We chose the indicators based on their frequent discussion in the literature as well as their consistency with previous research (38) and our own publications (18,22).

Data Extraction

All documents were imported into qualitative research software (MAXQDA 2020, Release 20.3.0, VERBI GmbH, Germany). One rater (MRH) went through all of the documents and coded whether there was any mention of the aforementioned indicators of robust and transparent science, as well as of the traditional metrics of career progression. While we searched all documents for the indicators of robust and transparent science, we only searched the PhD and Habilitation regulations and the application forms and procedural guidelines for tenure for the traditional metrics, as these relate specifically to career progression.
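To make this coding step concrete, the following is a minimal sketch, entirely our own and hypothetical (the actual coding was performed in MAXQDA), of the kind of data structure it produces: a documents-by-indicators matrix of yes/no codes. Document names and values are invented for illustration, and the use of pandas is an assumption.

```python
# Hypothetical sketch of the coding matrix (illustration only; the actual
# coding was performed in MAXQDA, and all names/values here are invented).
import pandas as pd

codings = pd.DataFrame(
    {
        "registration": [False, True],
        "reporting":    [True,  False],
        "sharing":      [False, False],
        "open_access":  [True,  False],
        "robustness":   [False, True],
    },
    index=["umc01_phd_regulation", "umc01_cru_website"],  # one row per document
)

# Share of documents with any mention of each indicator (cf. Tables 4a/4b).
print(codings.mean().round(2))
```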

If a certain indicator was found, the rater decided whether it was just mentioned (e.g., a university explaining what Open Access is, or a clinical research unit stating that 60% of clinical trial results were published) or whether that procedure was incentivised/required (e.g., a university specifically requiring a certain impact factor to receive top marks in the PhD, or a clinical research unit offering support with summary results reporting of clinical trials). In line with the COM-B model of behaviour change (40), we defined anything that increased either capability, opportunity, or motivation to engage in that behaviour as “incentivised.” A second, independent rater (AF) went through the documents of 10 of the 38 UMCs. The interrater reliability in that sample, measured by Cohen’s kappa, was κ = 0.806. Thus, we deemed further double-coding unnecessary. The code for these calculations, which also includes robustness checks, is available on GitHub (https://github.com/Martin-R-H/umc-policy-review). The datasets generated and analysed during the current study are available in a repository on the Open Science Framework (https://osf.io/4pzjg/).
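The repository contains our actual analysis code; purely as a self-contained illustration of the reliability measure itself, the sketch below computes Cohen’s kappa, κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement between two raters and p_e is the agreement expected by chance from the raters’ marginal frequencies. The rating vectors are invented toy data, not our codings.

```python
# Illustrative computation of Cohen's kappa for two raters' binary codings
# (toy data; the authors' actual analysis code is in the GitHub repository).
from collections import Counter

def cohen_kappa(rater1, rater2):
    """kappa = (p_o - p_e) / (1 - p_e): chance-corrected agreement."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n  # observed agreement
    m1, m2 = Counter(rater1), Counter(rater2)              # marginal counts
    p_e = sum((m1[c] / n) * (m2[c] / n) for c in set(rater1) | set(rater2))
    return (p_o - p_e) / (1 - p_e)

# 1 = indicator coded as present in a document, 0 = absent (invented values)
r1 = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
r2 = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
print(round(cohen_kappa(r1, r2), 3))  # 0.783 for this toy example
```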

Results

Overall, the web searches of the 38 German UMCs yielded 339 documents. We found PhD regulations for 37 UMCs (97%), Habilitation regulations for 35 UMCs (92%), tenure application forms for 25 UMCs (66%), and procedural guidelines for tenure for 11 UMCs (29%). We found 38 general research websites (100%), 32 websites of clinical research units (84%), and 23 animal research websites (61%; see Table 3). Supplementary Table S2 shows numbers for each UMC.

The results are presented in detail in Tables 4a and 4b, divided by each procedure and each type of document or website. Supplementary Tables S3a and S3b provide more detailed data on the subcategories (see Supplementary Table S1) of the indicators of robust and transparent science.

 

Indicators of robust and transparent science

Study registration. The issue or relevance of registering studies was not mentioned in any of the documents regarding academic promotion and tenure. Thirty-four percent of the websites of clinical research units mentioned registration, and 31% (all but one of those with a mention) also incentivised or required the practice, mostly in the form of clinical research units offering support with registering clinical studies. Only one animal research website and two general research websites also mentioned registration: the animal research website provided a link to an animal study registry, while the two general research websites generally endorsed the practice.

Reporting of results. Only a few of the PhD and Habilitation regulations mentioned the issue of results reporting; these mentions included general requirements that the respective thesis be published. One Habilitation regulation also referred to timely publication, requiring the thesis to be published within two years of the degree being awarded. Results reporting was also mentioned by a few clinical research units, animal research websites and general research websites. All mentions expressed general endorsements of, or highlighted education regarding, the publication of all results. One of the clinical research units further offered help with the publication process. The one animal research facility that mentioned results reporting provided a tool to identify publication formats that fit the characteristics of the respective datasets. When the general research websites mentioned results reporting, they usually referred to statements in the university’s or the DFG’s Good Scientific Practice Guidelines to publish research.

Data/code/protocol sharing. Data, code, or protocol sharing was mentioned in only one PhD regulation, in which supervisors were asked to consider data sharing in the evaluation of the thesis. No Habilitation regulations, tenure application forms or procedural guidelines for tenure mentioned this indicator. Likewise, no clinical research unit website mentioned the sharing of data, code, or protocols. One animal research website and eight general research websites mentioned data, code, or protocol sharing. In the case of the animal research website, the mention was a general introduction to the FAIR principles (32) of data sharing. The general research websites included endorsements of data and code sharing, mostly within the university’s Good Scientific Practice Guidelines.

Open Access. Few PhD regulations and Habilitation regulations mentioned Open Access. In one PhD regulation, supervisors were asked to also keep in mind whether the work was published Open Access. In the other cases, the PhD regulations mentioned that the university library has the right to publish the submitted thesis in a repository (Green Open Access). No clinical research unit website and only one animal research website mentioned Open Access; in the latter case, the mention was a link to an interview in which an “Open Access culture” was announced. Thirteen general research websites mentioned Open Access; these websites either generally recommended Open Access or referred to the respective university’s Open Access publishing fund.

Measures to improve robustness. Robustness was mentioned in one PhD regulation but in none of the Habilitation regulations, tenure application forms, or procedural guidelines for tenure. Robustness was mentioned by the majority of websites of clinical research units and some of the animal research websites. The clinical research units usually offered services to help with power calculations and randomisation (and, in a few cases, blinding). In the case of animal research websites, the mentions pointed to documents recommending power calculation as part of an effort to protect animals, courses on robust animal research, and general informational material on these issues. None of the general research webpages mentioned the issue of robustness.

 

Traditional indicators

Number of publications. References to publication numbers were made in all 37 PhD regulations and in the majority of Habilitation regulations. Three procedural guidelines for tenure referred to the number of publications. No tenure application forms referred to the number of publications, aside from requirements to provide a complete list of publications. The PhD and Habilitation regulations listed a certain number of publications as a requirement to obtain a PhD or Habilitation, respectively.

Grant money. None of the PhD regulations mentioned grant money. Among the Habilitation regulations, four mentioned grant money. The majority of the tenure application forms mentioned grant money, typically requiring applicants to provide a complete list of grants awarded. Three of the procedural guidelines for tenure also mentioned grants; these passages stated that experience with grants was expected or required candidates to provide a list of grants they had received.

Impact factor. Six of the PhD regulations and a large fraction of the Habilitation regulations mentioned the impact factor, with most of them establishing concrete incentives or requirements. These regulations contained passages asking doctoral students or Habilitation candidates to publish in high-impact journals to achieve the highest grade (summa cum laude), or allowing PhD students to publish only one paper instead of three if that paper appeared in a sufficiently “good” journal. Most tenure application forms mentioned impact factors, mostly requiring applicants to provide a list of the impact factors of the journals they had published in. None of the procedural guidelines for tenure mentioned impact factors.

Authorship order. Almost all of the PhD regulations mentioned authorship order, always as an incentive/requirement. The same applied to the majority of Habilitation regulations, all of which incentivised or required it. These regulations require PhD students and Habilitation candidates to publish a portion of their articles as the first or last author (e.g., a very common requirement for German PhD students is to publish three papers, at least one of them with first or last authorship). A majority of tenure application forms also mentioned authorship order, asking applicants to provide a list of publications divided by authorship position. None of the procedural guidelines for tenure had a related section.

Discussion

In this study, we assessed how and to what extent the 38 German UMCs promote robust and transparent research in their publicly available institutional policies for academic degrees, academic appointments, core facilities, and research in general. Our results show that current UMC policies on academic degrees (e.g., PhD regulations) or appointments (e.g., tenure application forms) do not promote procedures for robust and transparent research, such as preregistration, results reporting, data/code/protocol sharing, or measures to improve robustness (e.g., sample size calculation, randomisation, blinding). The only exception was Open Access, which was mentioned in six of 37 (16%) PhD regulations, in most cases referring to a repository to which the thesis can be publicly uploaded. This indicates that researchers’ individual performance in robust and transparent research is not currently explicitly promoted in obtaining a PhD or Habilitation degree or in securing appointments for professorships. In contrast, the number of publications and the authorship order play a dominant role in almost all UMC policies on academic degrees and appointments. The majority of tenure- and appointment-related policies further promote impact factors and secured grant money.

The UMCs’ websites for clinical and animal research included more frequent mentions of robust and transparent research, but these differed based on the type of website. Clinical research unit websites mentioned preregistration and robustness frequently, while animal research websites only had frequent mentions of robustness. These mentions of robustness were mostly related to sample size calculations and randomisation. The general research websites had the most frequent mentions of Open Access, results reporting, and data, code or protocol sharing. In most of these cases, these indicators were mentioned in the Good Scientific Practice Guidelines. In the case of Open Access, some websites also featured references to a university-wide Open Access publishing fund.

Our findings are in line with those of a similar study that collected data from an international sample (38). The authors found very frequent mentions of traditional criteria of research evaluation, while mentions of robust and transparent research practices were even less frequent than in our study: none of the documents mentioned publishing Open Access, registering research, or adhering to reporting guidelines, and only one mentioned data sharing. The results are unsurprising, given recent findings that practices for robust and transparent research are only very slowly becoming more prevalent (16,18); however, they stand in stark contrast to the various experts and institutions that have called for institutions to align their promotion criteria with robust and transparent research (8,24–26,30,31,41,42). While we focused exclusively on a full sample of all German UMCs, our approach could also be applied to other countries.

It is important to keep in mind that policies and incentives constantly change. As mentioned in the introduction, a major German funder, the DFG, recently reworked their Good Scientific Practice Guidelines (35), expecting universities to ratify them by July 2022. For the first time, these guidelines state that measures to avoid bias in research, e.g., blinding, should be used and that researchers should document all information and generally should publish all results, including those that do not support the hypothesis. They also recommend open sharing of data and materials in accordance with the FAIR principles and suggest that authors consider alternative publication platforms, such as academic repositories. Some German UMCs might have already changed their internal Good Scientific Practice Guidelines by the time the data collection of this study was conducted.

One limitation of our study is that the raters were not blinded; blinding was not possible because the policies were identifiable from their content and context. Another limitation is that we only searched for publicly available policies and did not additionally survey representatives of the 38 UMCs to identify further policies. Especially for the two types of tenure-related policies, we found relevant documents for only 66% (application forms) and 29% (procedural guidelines) of all UMCs. We refrained from this additional step, however, because the available tenure policies showed a very homogeneous pattern of no mentions (0%) of measures for robust and transparent research, and we assumed that this pattern would not differ for policies that are not publicly available.

While our study focused on mentions of robust and transparent research in policies for academic degrees and academic appointments, as well as on research and core facility websites, there are other ways for institutions to promote these practices. An example is the performance-based allocation of intramural resources, the so-called Leistungsorientierte Mittelvergabe (LOM). The LOM might also have a strong influence on researcher behaviour, and it has been proposed that it should be based on the transparency of research (43). Another example would be education on robust and transparent research practices, which might be organised and announced via internal channels of a university and would thus not be visible to our web search-based methodology. Nevertheless, we are convinced that our approach was able to find policies that cover many institutional incentives, especially policies for promotion and tenure, which have strong influences on researcher behaviour.

Additionally, initiatives for transparent research exist at the federal and national levels (e.g., Project DEAL for Open Access). While universities remain obliged to include these national incentives and policies in their own regulations, future research might focus on these other incentives or policies in the biomedical field.

Future research should also address the effects of policies and other institutional activities that aim to increase robust and transparent research practices (44). Thus far, only a few studies have addressed this. For example, Keyes et al. (45) evaluated the effect of a clinical trial registration and reporting programme, which proved successful.

Conclusion

In summary, concrete policies and recommendations on robust and transparent research remain scarce at German UMCs, especially in terms of policies for academic degrees and academic appointments. This stands in stark contrast to the various experts and institutions that have called for institutions to align their promotion criteria with robust and transparent research.

Declarations

Ethics approval and consent to participate. Not applicable.

Consent for publication. Not applicable.

Availability of data and materials. The datasets generated and analysed during the current study are available in a repository on the Open Science Framework, https://osf.io/4pzjg/

Competing interests. All authors are affiliated with German university medical centres. They declare no further conflicts of interest.

Funding. This work was funded by the German Federal Ministry of Education and Research (BMBF 01PW18012). The funder had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Authors' contributions (CRediT statement). Martin Holst: Conceptualisation, Data curation, Formal analysis, Investigation, Methodology, Writing – original draft. Alice Faust: Formal analysis, Investigation, Writing – review & editing. Daniel Strech: Conceptualisation, Funding acquisition, Methodology, Project administration, Resources, Supervision, Writing – review & editing.

Acknowledgements. The authors would like to acknowledge Danielle Rice, Miriam Kip, Tim Neumann, Delwen Franzen, and Tamarinde Haven for their helpful comments on the first drafts of the protocol.

Abbreviations

DFG: Deutsche Forschungsgemeinschaft (German Research Foundation)

DORA: San Francisco Declaration on Research Assessment

FAIR: findable, accessible, interoperable, and reusable

LOM: Leistungsorientierte Mittelvergabe (performance-based allocation of funding)

PhD: Doctor of Philosophy

UMC: University Medical Centre

3R: Replace, Reduce, Refine

References

1.         Grabitz P, Brückner T, Strech D. Deutsche Universitäten machen Ergebnisse klinischer Arzneimittelstudien unzureichend öffentlich – Das sollte sich ändern. Bundesgesundheitsblatt - Gesundheitsforschung - Gesundheitsschutz. 2020 Dec;63(12):1531–7. 

2.         Prinz F, Schlange T, Asadullah K. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011 Sep;10(9):712. 

3.         Chan A-W, Song F, Vickers A, Jefferson T, Dickersin K, Gøtzsche PC, et al. Increasing value and reducing waste: addressing inaccessible research. The Lancet. 2014 Jan;383(9913):257–66. 

4.         Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. The Lancet. 2014 Jan;383(9913):267–76. 

5.         Ioannidis JPA. Why Most Published Research Findings Are False. PLoS Med. 2005;2(8):e124. 

6.         Ioannidis JPA, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, et al. Increasing value and reducing waste in research design, conduct, and analysis. The Lancet. 2014 Jan;383(9912):166–75. 

7.         Macleod MR, Lawson McLean A, Kyriakopoulou A, Serghiou S, de Wilde A, Sherratt N, et al. Risk of Bias in Reports of In Vivo Research: A Focus for Improvement. PLOS Biol. 2015 Oct 13;13(10):e1002273. 

8.         Begley CG, Buchan AM, Dirnagl U. Robust research: Institutions must do their part for reproducibility. Nature. 2015 Sep;525(7567):25–7. 

9.         Begley CG, Ellis LM. Raise standards for preclinical cancer research. Nature. 2012 Mar;483(7391):531–3. 

10.       Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, Ioannidis JPA, et al. Biomedical research: increasing value, reducing waste. The Lancet. 2014 Jan;383(9912):101–4. 

11.       Nosek BA, Spies JR, Motyl M. Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability. Perspect Psychol Sci. 2012 Nov;7(6):615–31. 

12.       Nature. Announcement: Towards greater reproducibility for life-sciences research in Nature. Nature. 2017 Jun;546(7656):8. 

13.       Percie du Sert N, Bamsey I, Bate ST, Berdoy M, Clark RA, Cuthill I, et al. The Experimental Design Assistant. PLOS Biol. 2017 Sep 28;15(9):e2003779. 

14.       Hobert A, Jahn N, Mayr P, Schmidt B, Taubert N. Open access uptake in Germany 2010–2018: adoption in a diverse research landscape. Scientometrics [Internet]. 2021 May 23 [cited 2021 Jun 3]; Available from: https://link.springer.com/10.1007/s11192-021-04002-0

15.       Keyes A, Mayo-Wilson E, Atri N, Lalji A, Nuamah PS, Tetteh O, et al. Time From Submission of Johns Hopkins University Trial Results to Posting on ClinicalTrials.gov. JAMA Intern Med. 2020 Feb 1;180(2):317. 

16.       Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017. Dirnagl U, editor. PLOS Biol. 2018 Nov 20;16(11):e2006930. 

17.       Wieschowski S, Biernot S, Deutsch S, Glage S, Bleich A, Tolba R, et al. Publication rates in animal research. Extent and characteristics of published and non-published animal studies followed up at two German university medical centres. Lopes LC, editor. PLOS ONE. 2019 Nov 26;14(11):e0223758. 

18.       Wieschowski S, Riedel N, Wollmann K, Kahrass H, Müller-Ohlraun S, Schürmann C, et al. Result dissemination from clinical trials conducted at German university medical centers was delayed and incomplete. J Clin Epidemiol. 2019 Nov 1;115:37–45. 

19.       Flier J. Faculty promotion must assess reproducibility. Nature. 2017 Sep;549(7671):133. 

20.       Higginson AD, Munafò MR. Current Incentives for Scientists Lead to Underpowered Studies with Erroneous Conclusions. PLOS Biol. 2016 Nov 10;14(11):e2000995. 

21.       Smaldino PE, McElreath R. The natural selection of bad science. R Soc Open Sci. 2016 Aug 17;3:160384. 

22.       Strech D, Weissgerber T, Dirnagl U. Improving the trustworthiness, usefulness, and ethics of biomedical research through an innovative and comprehensive institutional initiative. PLOS Biol. 2020 Feb 11;18(2):e3000576. 

23.       Al-Shahi Salman R, Beller E, Kagan J, Hemminki E, Phillips RS, Savulescu J, et al. Increasing value and reducing waste in biomedical research regulation and management. The Lancet. 2014 Jan;383(9912):176–85. 

24.       San Francisco Declaration on Research Assessment [Internet]. 2013 [cited 2021 Jul 26]. Available from: https://sfdora.org/read/

25.       Moher D, Glasziou P, Chalmers I, Nasser M, Bossuyt PMM, Korevaar DA, et al. Increasing value and reducing waste in biomedical research: who’s listening? The Lancet. 2016 Apr;387(10027):1573–86. 

26.       Lerouge I, Hol T. Towards a Research Integrity Culture at Universities: From Recommendations to Implementation [Internet]. 2020 Jan. Available from: https://www.leru.org/publications/towards-a-research-integrity-culture-at-universities-from-recommendations-to-implementation

27.       Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLOS Biol. 2018 Mar 29;16(3):e2004089. 

28.       Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong Principles for assessing researchers: Fostering research integrity. PLOS Biol. 2020 Jul 16;18(7):e3000737. 

29.       McKiernan EC. Imagining the “open” university: Sharing scholarship to improve research and education. PLOS Biol. 2017 Oct 24;15(10):e1002614. 

30.       Wissenschaftsrat. Perspektiven der Universitätsmedizin [Internet]. 2016 [cited 2021 Aug 3]. Available from: https://www.wissenschaftsrat.de/download/archiv/5663-16.pdf?__blob=publicationFile&v=1

31.       Mejlgaard N, Bouter LM, Gaskell G, Kavouras P, Allum N, Bendtsen A-K, et al. Research integrity: nine ways to move from talk to walk. Nature. 2020 Oct 15;586(7829):358–60. 

32.       Wilkinson MD, Dumontier M, Aalbersberg IjJ, Appleton G, Axton M, Baak A, et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016 Dec;3(1):160018. 

33.       Percie du Sert N, Ahluwalia A, Alam S, Avey MT, Baker M, Browne WJ, et al. Reporting animal research: Explanation and elaboration for the ARRIVE guidelines 2.0. Boutron I, editor. PLOS Biol. 2020 Jul 14;18(7):e3000411. 

34.       Schulz KF, Altman DG, Moher D. CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. BMJ. 2010;340:c332. Available from: https://doi.org/10.1136/bmj.c332

35.       Deutsche Forschungsgemeinschaft. Guidelines for Safeguarding Good Research Practice. Code of Conduct. 2019 Sep 15 [cited 2021 May 20]; Available from: https://zenodo.org/record/3923602

36.       Russell WMS, Burch RL. The principles of humane experimental technique. London: Methuen; 1959. 

37.       Kos-Braun IC, Gerlach B, Pitzer C. A survey of research quality in core facilities. eLife. 2020 Nov 26;9:e62212. 

38.       Rice DB, Raffoul H, Ioannidis JPA, Moher D. Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities. BMJ. 2020 Jun 25;369:m2081. 

39.       von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. The Lancet. 2007 Oct 20;370(9596):1453–7. 

40.       Michie S, van Stralen MM, West R. The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implement Sci IS. 2011 Apr 23;6:42. 

41.       Ioannidis JPA, Khoury MJ. Assessing Value in Biomedical Research: The PQRST of Appraisal and Reward. JAMA. 2014 Aug 6;312(5):483. 

42.       Deutsche Forschungsgemeinschaft. Replizierbarkeit von Ergebnissen in der Medizin und Biomedizin. Stellungnahme der Arbeitsgruppe „Qualität in der Klinischen Forschung“ der DFG-Senatskommission für Grundsatzfragen in der Klinischen Forschung [Internet]. DFG; 2018. Available from: https://www.dfg.de/download/pdf/dfg_im_profil/reden_stellungnahmen/2018/180507_stellungnahme_replizierbarkeit_sgkf.pdf

43.       Kip M, Bobrov E, Riedel N, Scheithauer H, Gazlig T, Dirnagl U. Einführung von Open Data als zusätzlicher Indikator für die Leistungsorientierte Mittelvergabe (LOM)-Forschung an der Charité – Universitätsmedizin Berlin. 2019;1. 

44.       Bouter L. What Research Institutions Can Do to Foster Research Integrity. Sci Eng Ethics. 2020 Aug;26(4):2363–9. 

45.       Keyes A, Mayo-Wilson E, Nuamah P, Lalji A, Tetteh O, Ford DE. Creating a Program to Support Registering and Reporting Clinical Trials at Johns Hopkins University. Acad Med. 2021 Apr;96(4):529–33. 

Tables

Table 1. Data Sources that were screened in this study.

Sources

(1)     PhD regulations (for every different type of PhD awarded by the medical faculty)

(2)     Habilitation regulations (the Habilitation is a German post-PhD degree that for a long time was and often still is a prerequisite for obtaining tenure or securing third-party funding)

(3)     Application forms for tenured professorships

(4)     Procedural guidelines for the tenure process (‘Berufungsordnungen’)

(5)     Websites of clinical research units

(6)     Animal research websites, including for animal facilities, 3R centres and animal protection offices

(7)     General research websites of the medical faculties or university hospitals


Table 2. Indicators that were chosen for inclusion in this study.

Indicators of robustness and transparency:

(1)     Study registration in publicly accessible registries (e.g., clinicaltrials.gov, DRKS, Open Science Framework, BfR Animal Study Registry)

(2)     Reporting of results

(3)     Sharing of research data, code, or protocol

(4)     Open Access

(5)     Measures to improve robustness

Traditional metrics:

(1)     Number of publications

(2)     Number and monetary value of awarded grants

(3)     Impact factor of journals in which research has been published

(4)     Authorship order


Table 3. Number of documents we included for each university and document type.


 

|   | PhD Regulation | Habilitation Regulation | Tenure (Application Form) | Tenure (Procedural Guidelines) | Website of Clinical Research Unit | Animal Research Website | General Research Website |
| --- | --- | --- | --- | --- | --- | --- | --- |
| UMCs with document or website found ¹ | 37 (97%) | 35 (92%) | 25 (66%) | 11 (29%) | 32 (84%) | 23 (61%) | 38 (100%) |

¹ For the criteria for promotion and tenure, we counted every UMC with at least one policy found. For the core facility websites, we counted every UMC at which we at least found a website – even if we did not save any documents due to lack of mentions of any of the relevant procedures.
 

Table 4a. Number of university medical centres that mention indicators of robust and transparent science and traditional indicators of career progression in each of the included sources. In each cell, the first value indicates any mention of the indicator, and the second value indicates mentions in which the indicator is incentivised or required (any mention / incentivised or required).

| Indicator | PhD Regulation (n=37) | Habilitation Regulation (n=35) | Tenure (Application Form) (n=25) | Tenure (Procedural Guideline) (n=11) |
| --- | --- | --- | --- | --- |
| Indicators of robust/transparent science | | | | |
| Study Registration | 0% (0) / 0% (0) | 0% (0) / 0% (0) | 0% (0) / 0% (0) | 0% (0) / 0% (0) |
| Reporting of Results | 8% (3) / 3% (1) | 3% (1) / 3% (1) | 0% (0) / 0% (0) | 0% (0) / 0% (0) |
| Sharing of Data/Code/Protocol | 3% (1) / 3% (1) | 0% (0) / 0% (0) | 0% (0) / 0% (0) | 0% (0) / 0% (0) |
| Open Access | 16% (6) / 14% (5) | 3% (1) / 0% (0) | 0% (0) / 0% (0) | 0% (0) / 0% (0) |
| Robustness | 3% (1) / 3% (1) | 0% (0) / 0% (0) | 0% (0) / 0% (0) | 0% (0) / 0% (0) |
| Traditional indicators of career progression | | | | |
| Number of Publications | 100% (37) / 100% (37) | 91% (32) / 80% (28) | 0% (0) / 0% (0) | 27% (3) / 9% (1) |
| Grant Money | 0% (0) / 0% (0) | 11% (4) / 3% (1) | 84% (21) / 0% (0) | 27% (3) / 9% (1) |
| Impact Factor | 16% (6) / 14% (5) | 63% (24) / 54% (19) | 72% (18) / 0% (0) | 0% (0) / 0% (0) |
| Authorship Order | 97% (36) / 97% (36) | 80% (28) / 80% (28) | 68% (17) / 0% (0) | 0% (0) / 0% (0) |

 

Table 4b. Number of university medical centres that mention indicators of robust and transparent science in each of the included sources. In each cell, the first value indicates any mention of the indicator, and the second value indicates mentions in which the indicator is incentivised or required (any mention / incentivised or required).

| Indicator | Clinical Research Units (n=32) | Animal Research Websites (n=23) | General Research Website (n=38) |
| --- | --- | --- | --- |
| Study Registration | 34% (11) / 31% (10) | 4% (1) / 4% (1) | 5% (2) / 3% (1) |
| Reporting of Results | 9% (3) / 3% (1) | 4% (1) / 0% (0) | 21% (8) / 11% (4) |
| Sharing of Data/Code/Protocol | 0% (0) / 0% (0) | 4% (1) / 0% (0) | 21% (8) / 11% (4) |
| Open Access | 0% (0) / 0% (0) | 4% (1) / 0% (0) | 34% (13) / 24% (9) |
| Robustness | 81% (26) / 75% (24) | 26% (6) / 17% (4) | 0% (0) / 0% (0) |

 


Table 5a. Examples of mentions of each practice, divided by policy type. ("---" indicates that no mention of the practice was found in that type of policy.)

Study Registration
- PhD Regulation: ---
- Habilitation Regulation: ---
- Tenure (Application Form): ---
- Tenure (Procedural Guideline): ---

Reporting of Results
- PhD Regulation:
  - "Doctoral theses must meet high quality standards; after peer review, the written doctoral work should be made accessible to the public and the scientific community as openly as possible through publication."
  - "Accordingly, the following requirements arise: [...] 3. they should lead to a publication in a professional journal or to another kind of publication with a high scientific standard"
  - "According to the recommendations of the DFG (German Research Foundation) [...] the following general principles apply to good scientific practice: [...] Publication of results"
- Habilitation Regulation: "The habilitation thesis, or at least its essential parts, are to be published by the habilitated person. The publication should take place within two years after awarding of the teaching qualification."
- Tenure (Application Form): ---
- Tenure (Procedural Guideline): ---

Data/Code Sharing
- PhD Regulation: "If possible, supervisors should address the following points: [...] publication of the full original dataset (e.g., via Figshare, Dryad) of all figures (graphs, tables, in-text data, etc.) in the article."
- Habilitation Regulation: ---
- Tenure (Application Form): ---
- Tenure (Procedural Guideline): ---

Open Access
- PhD Regulation:
  - "If possible, supervisors should address the following points: [...] Open Access"
  - "The work can be published: [...] 2. as an electronic Open Access publication in the university repository operated by the university library."
  - "The University Library shall be granted the right to make and distribute further copies of the dissertation as well as to make the dissertation publicly accessible in data networks within the scope of the legal duties of the University Library."
- Habilitation Regulation: "In the event of publication in accordance with sentence 3 no. 4, the university library shall be granted the right to produce and distribute further copies of the habilitation thesis within the scope of the university library's statutory duties, and to make the habilitation thesis publicly accessible in data networks."
- Tenure (Application Form): ---
- Tenure (Procedural Guideline): ---

Robustness
- PhD Regulation: "If possible, supervisors should address the following points: [...] Reduction of bias by appropriate measures (blinding, randomisation, a priori definition of inclusion and exclusion criteria, etc.), a priori power calculations."
- Habilitation Regulation: ---
- Tenure (Application Form): ---
- Tenure (Procedural Guideline): ---

 

 

 

 

Table 5b. Examples of mentions of each practice, divided by source type. ("---" indicates that no mention of the practice was found in that type of source.)

Study Registration
- Clinical Research Units:
  - "We offer you the entry of your clinical study both prospectively and retrospectively, i.e., after the study has already started, or support with the entry by yourself."
  - "The Center for Clinical Trials provides free access to clinicaltrials.gov for scientists of the <institution>. Via this access, investigator-initiated trials from the local area of responsibility can be entered in the registry."
  - "Since the beginning of this year, all new clinical studies conducted at the University Medical Center have been centrally recorded in the Research Registry University Medicine <institution>"
  - "The analyses will be carried out according to statistical analysis plans that were already designed at the time of study planning"
- Animal Research Facilities:
  - "The Animal Study Registry provides a platform for pre-registration of an accurate study plan for animal experimental studies prior to the start of experiments, to avoid selective reporting. [...] <Institution> 3R sponsors first entries with 500 euros each for research"
  - "The <institution's> 3Rs toolbox is structured along the lines of the <other institution's> 6Rs model [...], which adds robustness, registration, and reporting to replacement, reduction, and refinement."
- General Research Website:
  - "When should a study project be listed in a public study registry? The International Committee of Medical Journal Editors (ICMJE) requires prospective registration in an ICMJE-recognised registry for all clinical trials to be published by one of the participating journals. The new version of the Declaration of Helsinki also requires this: Article 19: Every clinical trial shall be registered in a publicly accessible database before recruitment of the first subject. The Ethics Committee of the <institution> supports this requirement."
  - "Recommendation for registration in the DRKS. The International Committee of Medical Journal Editors (ICMJE) requires prospective registration in an ICMJE-recognised registry for all clinical trials to be published by one of the participating journals. The current version of the Declaration of Helsinki also calls for this (Article 35: 'Every research project involving human subjects shall be registered in a publicly accessible database before recruitment of the first subject.'). The Ethics Committee of the <institution> supports this call."

Reporting of Results
- Clinical Research Units:
  - "<Institution>: 92% of clinical studies published in EU database"
  - "The CRU at <institution> offers support with the following tasks, among others: [...] Support in writing publications"
  - "In principle, a publication of the results should be aimed at..."
- Animal Research Facilities:
  - "Fiddle is a tool developed by <institution> to combat publication bias. This 'match-making' tool helps researchers to identify alternative ways of publishing information from well-designed experiments, which is often difficult to publish in traditional journals (i.e., null or neutral results, datasets, etc.)."
- General Research Website:
  - "As a matter of principle, contributors to research projects are required to actively seek, or at least not refuse, publication of the results."
  - "As a rule, scientists contribute all results to the scientific discourse. In individual cases, however, there may be reasons not to make results publicly available (in the narrower sense in the form of publications but also in the broader sense via other communication channels); this decision must not depend on third parties."
  - "Findings that do not support the authors' hypothesis should also be reported."
  - "Scientific results are to be communicated to the scientific public in the form of publications; the scientific publications are thus - like the scientific observation or the scientific experiment itself - the product of the work of scientists."
  - "Rules for the publication of results: publication in principle of results obtained with public funds (principle of publicity of basic research), publication also of falsified hypotheses in an appropriate manner and admission of errors (principle of a scientific culture open to error)."

Data/Code Sharing
- Clinical Research Units: ---
- Animal Research Facilities:
  - "The FAIR Guiding Principles for scientific data management and stewardship state that data must be Findable, Accessible, Interoperable, and Reusable (Wilkinson et al. 2016). These guidelines put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals."
- General Research Website:
  - "The <city> Open Science programme of <city> has set itself the goal of making the research results and data of research institutions in <city>, which were created with funds from government research funding, freely accessible and easy to find together with other information on science in <city>."
  - "Scientists at <university> are encouraged to publish and store raw research data that served as the basis for publications, together with the associated materials and information, in recognised open-access subject repositories in accordance with the FAIR principles (Findable, Accessible, Interoperable, Reusable), insofar as this is in conformity with the applicable legal provisions on data protection and copyright and with planned patent applications."
  - "Self-programmed software is made publicly available with indication of the source code."

Open Access
- Clinical Research Units: ---
- Animal Research Facilities:
  - "<Institutions> will [...] establish an 'Open Access' and 'Open Data' culture [...]."
- General Research Website:
  - "To promote the OA publishing activities of scientists affiliated with the <university>, the Dean's Office of the Faculty of Medicine has been providing a publication fund from central funds since the beginning of 2020, from which the APCs for OA publications (original work, review articles) in journals with an impact factor (IF) above 10 are financed."
  - "Publications in high-ranking open-access journals are supported by the faculty by covering 50% of the costs. Publications can be made free of charge with individual Open Access publishers."
  - "Therefore, the Presidential Board recommends [...] to archive their scientific publications as a matter of principle as pre- or post-print on a subject repository or on the institutional repository of <university>, and/or to publish them in peer-reviewed open-access journals, and to reserve the right to publish or archive their research results electronically in publishing contracts, if possible. In doing so, the freedom of research and teaching is preserved. Discipline-specific practices and rights of use of publishers are to be taken into account in a differentiated manner."
  - "For a scientific qualification work, e.g., cumulative dissertations or post-doctoral theses, which have appeared in a journal as a published article, permission for second publication must be obtained in any case"

Robustness
- Clinical Research Units:
  - "Biometric study planning and analysis includes the following: [...] power calculation [...] implementation of randomisation"
  - "The biometric consultation hour is an internal service of the CRU for members of the hospital and the medical faculty of the <university>. The biometric consultation hour provides information on questions of study design, sample size planning and the choice of suitable evaluation methods."
  - "Important tasks of biometrics in clinical trials are power calculation, randomisation"
  - "Online randomisation service"
- Animal Research Facilities:
  - "The course is aimed at postdocs and scientists who write and/or plan their own animal research applications. In particular, the course will consider what requirements a biometric report must meet in order to satisfy the authorities' specifications and how this can be achieved. Special attention will be given to power analysis, study design, and sample size calculation."
  - "Experimental animal science links and information resources: [...] G*Power (freeware for the analysis of test power)"
  - "Reduction: The number of test animals is reduced to a minimum. This is achieved by an optimised experimental design, in which it is clarified statistically and methodically in advance how many animals are necessary for an evaluable result."
- General Research Website: ---