We propose a '4R' approach to assessing reported research, underpinned by statistical rigour (see J. T. Leek and R. D. Peng Nature 520, 612; 2015). These 4Rs denote reproduction, replication, robustness and revelation.
Journals are aware of the need for the first two: whether enough information is available to reproduce an experiment, and whether its original results can be replicated. Even when an experiment can be reproduced, replication often remains an issue, so journals are increasingly asking authors for details of their software code and raw data. Videos of each experimental step could also be included.
Variations in experimental and analytical methods are a concern for referees and readers, hence the need for robustness. A well-conducted study should indicate the sensitivity of its conclusions to the various assumptions that were made in deriving them.
Revelation relates to the need for accountability and transparency. Scientists must communicate more effectively by disclosing their reasoning for how they develop strategies, derive insights and draw conclusions.
Pagan, A., Torgler, B. Use '4Rs' criteria to assess papers. Nature 522, 34 (2015). https://doi.org/10.1038/522034c