We propose a '4R' approach to assessing reported research, underpinned by statistical rigour (see J. T. Leek and R. D. Peng Nature 520, 612; 2015). These 4Rs denote reproduction, replication, robustness and revelation.

Journals already recognize the first two: whether enough information is provided to reproduce an experiment, and whether its original results can be replicated. Even when an experiment can be reproduced, its results often fail to replicate, which is why journals increasingly ask authors for software code and raw data. Videos of each experimental step could also be included.
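To illustrate what sharing code for reproduction can look like in practice, here is a minimal Python sketch. The analysis function, data values and provenance fields are all hypothetical; the point is that a fixed random seed plus a recorded description of the inputs lets a reader re-run the script and obtain identical numbers.

```python
import hashlib
import json
import random
import sys

def analyse(data, seed=42):
    """Hypothetical resampling analysis with a fixed seed, so that
    anyone re-running the script obtains the same result."""
    rng = random.Random(seed)  # fixed seed -> deterministic resamples
    resamples = [rng.choice(data) for _ in range(1000)]
    return sum(resamples) / len(resamples)

data = [2.1, 3.4, 2.9, 3.1, 2.7]
result = analyse(data)

# Record, alongside the result, what a reader needs to reproduce the run:
# the interpreter version, the seed, and a checksum of the input data.
provenance = {
    "python_version": sys.version.split()[0],
    "seed": 42,
    "data_sha256": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
    "result": result,
}
print(json.dumps(provenance, indent=2))
```

Publishing such a provenance record with the raw data is one lightweight way to make a computational result checkable by others.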

Variations in experimental and analytical methods are a concern for referees and readers, hence the need for robustness. A well-conducted study should indicate how sensitive its conclusions are to the assumptions made in deriving them.
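A sensitivity analysis of this kind can be very simple. The sketch below, with illustrative data and thresholds rather than any particular study's method, re-computes a headline estimate under several choices of one analytical assumption (an outlier-exclusion cutoff), so readers can see how much the conclusion moves.

```python
import statistics

# Illustrative measurements; 14.2 is a possible outlier.
measurements = [9.8, 10.1, 10.3, 9.9, 10.0, 14.2]

def trimmed_mean(values, z_cutoff):
    """Mean after excluding points more than z_cutoff standard
    deviations from the overall mean -- the assumption being varied."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    kept = [v for v in values if abs(v - mu) <= z_cutoff * sd]
    return statistics.mean(kept)

# Report the estimate under a range of outlier assumptions.
for cutoff in (1.0, 1.5, 2.0, 3.0):
    print(f"z-cutoff {cutoff}: mean = {trimmed_mean(measurements, cutoff):.2f}")
```

If the estimate barely changes across reasonable cutoffs, the conclusion is robust to that assumption; if it swings, readers deserve to know.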

Revelation relates to the need for accountability and transparency. Scientists must communicate more effectively by disclosing the reasoning behind how they develop strategies, derive insights and draw conclusions.