Single-case experimental designs are rapidly growing in popularity. This popularity needs to be accompanied by transparent and well-justified methodological and statistical decisions. Appropriate experimental design (including randomization), proper data handling and adequate reporting are needed to ensure reproducibility and internal validity. The degree of generalizability can be assessed through replication.
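One concrete way to implement the randomization called for above is a randomization test: the intervention start point of an AB design is chosen at random, and the observed phase difference is then compared against every permissible start point. The sketch below is a minimal, hypothetical illustration — the simulated data, the 20-session design and the minimum-phase-length constraint are all assumed for the example, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated AB single-case data: 20 sessions, intervention begins at session 11
scores = np.concatenate([rng.normal(5, 1, 10), rng.normal(7, 1, 10)])
actual_start = 10  # index where phase B begins (randomly assigned in a real design)

def phase_diff(data, start):
    """Mean of phase B minus mean of phase A for a given start index."""
    return data[start:].mean() - data[:start].mean()

observed = phase_diff(scores, actual_start)

# Randomization test: compare the observed effect against all permissible
# start points (here, an assumed constraint of at least 5 sessions per phase)
candidates = range(5, len(scores) - 4)
null_diffs = np.array([phase_diff(scores, s) for s in candidates])
p_value = np.mean(null_diffs >= observed)

print(f"observed effect: {observed:.2f}, randomization p = {p_value:.3f}")
```

The p-value is the proportion of permissible start points whose phase difference is at least as large as the one actually observed; because the true start point is among the candidates, the p-value can never be zero.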
Bayesian optimization is a promising approach towards a more environmentally friendly chemical synthesis, in line with the Sustainable Development Goals. It can help chemists explore vast chemical spaces and find green reaction conditions with few experiments, decreasing resource consumption and waste generation while reducing discovery timelines and costs.
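As a toy illustration of the general idea (not the specific methods of the article), the sketch below runs a minimal Bayesian optimization loop — a Gaussian process surrogate with a squared-exponential kernel and an expected-improvement acquisition function — to maximize a made-up one-dimensional "reaction yield". The objective function, kernel length scale, initial design and experiment budget are all assumed for the example:

```python
import numpy as np
from math import erf, exp, pi, sqrt

# Toy objective standing in for an expensive experiment: the "yield" of a
# hypothetical reaction as a function of one normalized condition.
def reaction_yield(x):
    return 90 * np.exp(-(x - 0.65) ** 2 / 0.02) + 5 * np.sin(8 * x)

def rbf(a, b, ls=0.1):  # squared-exponential kernel, assumed length scale
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ls ** 2))

def gp_posterior(X, y, Xs, jitter=1e-6):  # exact GP regression
    K_inv = np.linalg.inv(rbf(X, X) + jitter * np.eye(len(X)))
    Ks = rbf(X, Xs)
    mu = Ks.T @ K_inv @ y
    var = 1.0 - np.sum(Ks * (K_inv @ Ks), axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def norm_cdf(z): return 0.5 * (1.0 + erf(z / sqrt(2.0)))
def norm_pdf(z): return exp(-z * z / 2.0) / sqrt(2.0 * pi)

X = np.array([0.1, 0.5, 0.9])      # three initial experiments
y = reaction_yield(X)
grid = np.linspace(0.0, 1.0, 200)  # candidate conditions

for _ in range(7):                 # budget: seven further experiments
    ys = (y - y.mean()) / y.std()  # standardize observations for the GP
    mu, sd = gp_posterior(X, ys, grid)
    z = (mu - ys.max()) / sd
    ei = (mu - ys.max()) * np.vectorize(norm_cdf)(z) + sd * np.vectorize(norm_pdf)(z)
    x_next = grid[np.argmax(ei)]   # most promising next experiment
    X, y = np.append(X, x_next), np.append(y, reaction_yield(x_next))

print(f"best condition ~{X[np.argmax(y)]:.2f} with yield ~{y.max():.1f}")
```

Each loop iteration trades off exploiting conditions predicted to give high yield against exploring conditions where the surrogate is uncertain, which is how the technique can steer a small number of experiments through a large space.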
To improve early-stage research in the field of RNA lipid nanoparticles, several best practices should be considered when collecting, interpreting and reporting characterization data.
To ensure a sustainable future and combat food scarcity, we must boost agricultural productivity, improve climate resilience and optimize resource usage. There is untapped potential for dense wireless sensor networks in agriculture that can increase yields and support resilient production when linked to smart decision and control systems.
New nanomaterials are being developed for efficient biomolecule delivery to plants. However, detection and quantification of plant cell entry are challenging and currently rely on subjective methods that lack proper controls. The necessary considerations of performing nanoparticle-mediated delivery in plants and how to accurately quantify delivery efficiency are discussed.
Laboratory hardware is often custom made or significantly modified. To improve reproducibility, it is imperative that these novel instruments are properly documented. Increasing adoption of open source hardware practices can potentially improve this situation. This article explores how open licences and open development methodologies enable custom instrumentation to be reproduced, scrutinized and properly recorded.
Logic diagrams are employed in electrical engineering for visualizing switching circuits. However, their utility and applicability extend far beyond the technical sciences. Here, we argue that natural and social scientists alike should consider using logic diagrams in their research. For certain analytical problems, logic diagrams are a perfect fit.
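A truth table is the tabular counterpart of a logic diagram, and enumerating one is straightforward in code. The sketch below evaluates a hypothetical three-condition claim of the kind a social scientist might analyse — the condition names and the boolean rule are invented purely for illustration:

```python
from itertools import product

# Hypothetical claim: an outcome occurs when funding (F) and expertise (E)
# are both present, or when regulation (R) is absent.
def outcome(f, e, r):
    return int((f and e) or (not r))

# Enumerate every combination of the three binary conditions
print("F E R | outcome")
for f, e, r in product([0, 1], repeat=3):
    print(f"{f} {e} {r} |   {outcome(f, e, r)}")
```

The same table can be read straight off the corresponding logic diagram (an OR gate fed by an AND gate and an inverter), which is exactly the kind of at-a-glance analysis the authors argue for.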
Data analysis relies heavily on computation, and algorithms have grown more demanding in terms of hardware and energy. Monitoring their environmental impacts is and will continue to be an essential part of sustainable research. Here, we provide guidance on how to do so accurately and with limited overheads.
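A minimal, self-contained starting point for such monitoring is a back-of-envelope estimate from wall-clock time and an assumed average power draw; dedicated tools (such as the CodeCarbon project) account for hardware and grid mix more accurately, but the sketch below shows the basic arithmetic. The 65 W power draw and 0.4 kg CO2e per kWh grid intensity are placeholder assumptions that vary widely by hardware and region:

```python
import time

# Back-of-envelope estimate: energy = wall-clock time x assumed average power.
ASSUMED_POWER_W = 65.0        # placeholder: typical desktop CPU package power
GRID_KG_CO2E_PER_KWH = 0.4    # placeholder: grid carbon intensity

start = time.perf_counter()
total = sum(i * i for i in range(1_000_000))  # the workload being monitored
elapsed_s = time.perf_counter() - start

energy_kwh = ASSUMED_POWER_W * elapsed_s / 3_600_000  # joules -> kWh
co2_kg = energy_kwh * GRID_KG_CO2E_PER_KWH
print(f"{elapsed_s:.3f} s, ~{energy_kwh * 1000:.6f} Wh, ~{co2_kg * 1000:.6f} g CO2e")
```

The overhead of this kind of monitoring is two timer calls and a multiplication, which is why it can be left on permanently.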
Designing technology for point-of-care use in low- and middle-income countries requires understanding of the underlying barriers that contribute to recalcitrant global problems. The only way to understand those barriers is to work with local experts; otherwise, you may wind up solving the wrong problem.
Black box machine learning models can be dangerous for high-stakes decisions. They rely on untrustworthy databases, and their predictions are difficult to troubleshoot, explain and error-check in real time. Their use leads to serious ethics and accountability issues.
Lokwani et al. discuss the necessary considerations when performing spectral cytometry on highly autofluorescent samples to extract phenotypic information from autofluorescence spectra and perform accurate quantification of fluorescent labels.
Laboratories have a large environmental impact, with high levels of resource consumption and waste generation. In this article, the author discusses some of the actionable strategies that can bring real and impactful improvements, encompassing education, community engagement and the adoption of best practices by researchers. Building a global culture of sustainability in science will be crucial to reducing the carbon footprint of laboratories.
The scaling up of a chemical reaction is a complex process. Chemists should pay special attention to a number of key factors, including the choice of route, reagents and solvents; health and safety considerations; the isolation and purification of the desired product; and the development of robust supporting analytical methodology.
X-ray induced structural damage is well known, but the potential for changes in the kinetics of physical and chemical processes is rarely recognized or considered. These can happen over a wide intensity range, are difficult to predict and often escape detection. The problem deserves more attention from experimentalists.
COVID-19 has had long-term effects on science and research. The way in which we carry out research has had to adapt rapidly under the pressures placed on scientists, leading to the development of innovative approaches.
A large sample size, or N, increases the sensitivity of an experiment to detect differences between treatment groups. However, the biological entity that N refers to may not be obvious. Defining the wrong entity can inflate the sample size and increase both false-positive and false-negative results.
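The problem described above can be demonstrated with a small simulation: when individual cells are (wrongly) counted as independent observations instead of the animals they came from, the false-positive rate under a true null effect balloons. The sketch is a hypothetical illustration — the group sizes, variance components and the 1.96 z-threshold (used here in place of a proper t-test for simplicity) are all assumed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated null experiment: no true treatment effect. Each group has
# 4 animals with 50 cells measured per animal; animal-to-animal variation
# makes cells from the same animal correlated.
N_SIM, ANIMALS, CELLS = 2000, 4, 50

def one_group():
    animal_means = rng.normal(0.0, 1.0, ANIMALS)  # biological variation
    return animal_means[:, None] + rng.normal(0.0, 0.5, (ANIMALS, CELLS))

false_pos = {"cells_as_n": 0, "animals_as_n": 0}
for _ in range(N_SIM):
    a, b = one_group(), one_group()

    # Wrong N: every cell treated as an independent observation
    fa, fb = a.ravel(), b.ravel()
    se = np.sqrt(fa.var(ddof=1) / fa.size + fb.var(ddof=1) / fb.size)
    if abs(fa.mean() - fb.mean()) / se > 1.96:
        false_pos["cells_as_n"] += 1

    # Right N: one summary value (the mean) per animal
    ma, mb = a.mean(axis=1), b.mean(axis=1)
    se = np.sqrt(ma.var(ddof=1) / ANIMALS + mb.var(ddof=1) / ANIMALS)
    if abs(ma.mean() - mb.mean()) / se > 1.96:
        false_pos["animals_as_n"] += 1

print({k: v / N_SIM for k, v in false_pos.items()})
```

With these assumed variance components, the cells-as-N analysis rejects the (true) null far more often than the animals-as-N analysis, because the naive standard error ignores the correlation among cells from the same animal.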
Orphan drug development is a rapidly expanding field. Nevertheless, clinical trials for rare diseases can present inherent challenges. Optimal study design and partnerships between academia and industry are therefore required for the successful development, delivery and clinical approval of effective therapies in this group of disorders.
Automated single-particle picking in electron cryo-microscopy data has seen important advances in the past couple of years and now enables computer-assisted particle selection even for challenging datasets. These advances have implications for streamlined and automated image processing, with potential benefits for improving the resolution of resulting structures.
Ensuring reproducibility and replicability has been an issue in many scientific disciplines in the past decade. Here, we discuss another ‘R’ that has not gotten enough airtime — reanalysis. We cover how open science and a focus on enabling reanalysis also make the goals of reproducibility and replicability easier to achieve.