Accepted Paper

An international consensus on core reproducibility checks in research: the OSIRIS Delphi study  
Rita Banzi (Mario Negri Institute); Constant Vinatier (Inserm, University of Rennes); Florian Naudet (Inserm, University of Rennes, CHU Rennes, Institut Universitaire de France); Monika Varga (Hungarian University of Agriculture and Life Sciences); Yuri Andrei Gelsleichter (Hungarian University of Agriculture and Life Sciences - MATE)

Short abstract

An international, multidisciplinary group of 82 experts in research reproducibility joined the OSIRIS Delphi study. After two online surveys and a consensus meeting, the group selected 32 checks considered important for research reproducibility and discussed the adaptation of these checks to various contexts.

Long abstract

Evidence-based solutions to improve reproducibility in research may help various stakeholders, such as researchers planning their studies or reviewers evaluating them. The Open Science to Increase Reproducibility in Science (OSIRIS) Delphi study aimed to develop a core set of items to be reviewed in research projects across multiple fields of science.

We invited a multidisciplinary group of experts in reproducibility to participate in two rounds of an online survey to rate the importance of 44 reproducibility items prepared by the study Steering Committee. These items covered the whole life cycle of a research project, from planning to dissemination. A third round, an online consensus meeting, was organized to vote on items that had not reached consensus and to discuss general themes.

Eighty-two participants from 21 countries responded to Round 1 (May-July 2024, response rate 91%) and 77 to Round 2 (September-October 2024, response rate 93%). Overall, the Delphi process led to the inclusion of 32 items, e.g., availability of data management and statistical plans, description of measures to mitigate bias, estimation of sample size, details on statistical analysis, software and code, and findability and availability of datasets and code. The group excluded six items (e.g., use of registered reports, publication in open access journals) and did not reach consensus on seven items (e.g., pre-registration, description of authorship and contributorship).

These results suggest that consensus has been reached on a set of reproducibility items considered important when conducting research. Strategies for implementing these items and adapting them across research stages and scientific fields will be developed.

Panel T4.6
Where next for replication, transparency and analysis of QRPs? (II)
Session 1, Tuesday 1 July 2025, -