Reproducibility crisis

From Bioblast




Description

An experiment or study is reproducible or replicable when subsequent experiments confirm the results. This is re-search. However, different types of reproducibility can be defined, depending on the conditions under which the previous work is replicated and on the information available. Our aim is to focus mostly on two different kinds [1]:

1. Direct: Obtaining the same results using the same experimental conditions, materials, and methods as described in the original experiment. This would be the ideal form of reproducibility; however, it requires a very accurate description of how the original experiment was performed. Some journals are trying to address the reproducibility crisis by improving the rigor and completeness of reported methods and results (e.g., STAR Methods at Cell Press).

2. Systematic: Obtaining the same results under different conditions; for example, using another cell line or mouse strain, or inhibiting a gene pharmacologically instead of genetically. This opens the door for subsequent studies to find the conditions under which an initial finding holds.


References:

1. Lazic SE (2016) Experimental design for laboratory biologists: maximizing information and improving reproducibility. Cambridge University Press.
2. Baker M (2016) 1,500 scientists lift the lid on reproducibility. Survey sheds light on the 'crisis' rocking research. Nature 533:452-4.
But is there a reproducibility crisis? According to a survey of 1,576 researchers conducted by Nature [2], "52 % agree that there is a significant 'crisis' of reproducibility, less than 31 % think that failure to reproduce published results means that the result is probably wrong, and most say that they still trust the published literature" (Baker 2016). Chemistry and biology are the fields with the highest share of failed attempts to reproduce published results.
When researchers were asked for the causes of this inability to reproduce published results, the top three answers were:
  • Selective reporting
  • Publication pressure
  • Low statistical power and poor analysis
The three most frequently mentioned countermeasures were:
  • Better understanding of statistics
  • Better mentoring and supervision
  • More robust design
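The point about low statistical power can be made concrete with a short calculation. The sketch below (plain Python; `sample_size_per_group` is an illustrative helper written for this page, not taken from any cited work) uses the standard normal approximation to estimate how many samples per group a two-sample comparison needs to detect an effect of a given size with 80 % power:

```python
import math
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sided two-sample comparison,
    using the normal approximation to the t-test.
    effect_size is Cohen's d (difference in means / pooled SD)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_power = NormalDist().inv_cdf(power)          # quantile for desired power
    n = 2 * ((z_alpha + z_power) / effect_size) ** 2
    return math.ceil(n)

# A "medium" effect (d = 0.5) already requires ~63 samples per group;
# the t-based answer is slightly larger. Studies with n = 5-10 per group
# can reliably detect only very large effects.
print(sample_size_per_group(0.5))
print(sample_size_per_group(0.8))
```

Running such a calculation before the experiment, rather than after, is one concrete element of the "more robust design" the survey respondents call for.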


Solve the reproducibility crisis

While it is probably impossible to fully prevent human self-deception and inadequate command of statistical methods, what we can do is minimize sources of error connected to the instrumental equipment and its handling:
  • Select instrumental equipment for which appropriate specifications are available.
  • Get trained on your equipment and make sure you know what both you and the device you operate are doing in each step of your experiment.
  • Avoid treating software as a black box.
  • The same applies to data analysis: get trained on the analysis software. Ideally, use the software that comes with your instrument, to minimize errors during data transfer and translation.
  • An Open Access policy fosters the establishment of an error culture and a culture of transparency in science. In this way, Open Access - as manifested on the Bioblast website (see Gentle Science) - contributes to solving the reproducibility crisis.
  • Methods: "Identify the methods, apparatus (manufacturer's name and address in parentheses), and procedures in sufficient detail to allow other workers to reproduce the results. Give references to established methods." - Quoted from the International Committee of Medical Journal Editors.


Further links

  • Validation by THE SCIENCE EXCHANGE NETWORK
» Validating key experimental results via independent replication
» Reproducibility Initiative

References

Begley CG, Ioannidis JPA (2015) Reproducibility in science: improving the standard for basic and preclinical research. Circ Res 116:116-26.
Chiu K, Grundy Q, Bero L (2017) 'Spin' in published biomedical literature: a methodological systematic review. PLoS Biology 15(9):e2002173.
Gnaiger E (2019) Editorial: A vision on preprints for mitochondrial physiology and bioenergetics. MitoFit Preprint Arch doi:10.26124/mitofit:190002.v2.
Gnaiger E et al ― MitoEAGLE Task Group (2020) Mitochondrial physiology. Bioenerg Commun 2020.1:44 pp. doi:10.26124/bec:2020-0001.v1.
Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2(8):e124.
Ioannidis JPA, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R (2014) Increasing value and reducing waste in research design, conduct, and analysis. Lancet 383:166-75.
Kahneman D (2011) Thinking, fast and slow. Penguin Books 499 pp.
Stodden V, Seiler J, Ma Z (2018) An empirical analysis of journal policy effectiveness for computational reproducibility. Proc Natl Acad Sci U S A 115:2584-9.
Triggle CR, Triggle DJ (2017) From Gutenberg to Open Science: an unfulfilled odyssey. Drug Dev Res 78:3-23.


MitoPedia concepts: MitoFit Quality Control System 


MitoPedia topics: Gentle Science