How (not) to measure replication

European Journal for Philosophy of Science 11 (2):1-27 (2021)

Abstract

The replicability crisis refers to the apparent failures to replicate both important and typical positive experimental claims in psychological science and biomedicine, failures which have gained increasing attention in the past decade. In order to provide evidence that there is a replicability crisis in the first place, scientists have developed various measures of replication that help quantify or “count” whether one study replicates another. In this nontechnical essay, I critically examine five types of replication measures used in the landmark article “Estimating the reproducibility of psychological science” (Open Science Collaboration, Science, 349, aac4716, 2015) based on the following techniques: subjective assessment, null hypothesis significance testing, comparing effect sizes, comparing the original effect size with the replication confidence interval, and meta-analysis. The first four, I argue, remain unsatisfactory for a variety of conceptual or formal reasons, even taking into account various improvements. By contrast, at least one version of the meta-analytic measure does not suffer from these problems. It differs from the others in rejecting dichotomous conclusions, the assumption that one study replicates another or not simpliciter. I defend it from other recent criticisms, concluding however that it is not a panacea for all the multifarious problems that the crisis has highlighted.
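Two of the measures the abstract mentions can be made concrete in a short sketch: the dichotomous confidence-interval measure (does the original effect size fall inside the replication's 95% CI?) and a continuous fixed-effect meta-analytic estimate pooling both studies. The sketch below is illustrative only, not the paper's own code; it assumes correlation effect sizes analyzed via the standard Fisher z-transformation, and the effect sizes and sample sizes are made-up example numbers.

```python
# Illustrative sketch of two replication measures discussed in the abstract,
# applied to correlation effect sizes via the Fisher z-transformation.
# All numbers below are hypothetical examples, not data from the paper.
import math

def fisher_z(r):
    """Fisher z-transformation of a correlation coefficient r."""
    return math.atanh(r)

def ci_coverage_measure(r_orig, r_rep, n_rep, z_crit=1.96):
    """Dichotomous measure: True iff the original effect size lies inside
    the replication's 95% confidence interval (computed on the z scale,
    then back-transformed)."""
    se = 1 / math.sqrt(n_rep - 3)          # standard error of Fisher z
    z = fisher_z(r_rep)
    lo = math.tanh(z - z_crit * se)
    hi = math.tanh(z + z_crit * se)
    return lo <= r_orig <= hi

def fixed_effect_meta(pairs):
    """Continuous measure: inverse-variance-weighted (fixed-effect)
    meta-analytic estimate pooling the studies in `pairs`, a list of
    (correlation, sample_size) tuples."""
    weights = [n - 3 for _, n in pairs]     # inverse variance of Fisher z
    zs = [fisher_z(r) for r, _ in pairs]
    pooled_z = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return math.tanh(pooled_z)

# Hypothetical example: original r = .40 (n = 50), replication r = .15 (n = 200)
print(ci_coverage_measure(0.40, 0.15, 200))                    # → False
print(round(fixed_effect_meta([(0.40, 50), (0.15, 200)]), 3))  # → 0.201
```

The contrast the abstract draws shows up directly: the CI measure forces a yes/no verdict ("failed to replicate"), while the meta-analytic measure instead returns a pooled estimate of the effect, with no dichotomous conclusion.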

Links

PhilArchive


Similar books and articles

Replicability and replication in the humanities.Rik Peels - 2019 - Research Integrity and Peer Review 4 (1).
An a priori solution to the replication crisis.David Trafimow - 2018 - Philosophical Psychology 31 (8):1188-1214.
The role of replication in psychological science.Samuel C. Fletcher - 2021 - European Journal for Philosophy of Science 11 (1):1-19.
Why Replication is Overrated.Uljana Feest - 2019 - Philosophy of Science 86 (5):895-905.
Should We Strive to Make Science Bias-Free? A Philosophical Assessment of the Reproducibility Crisis.Robert Hudson - 2021 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 52 (3):389-405.
What Is a Replication?Edouard Machery - 2020 - Philosophy of Science 87 (4):545-567.
Replicability of Experiment.John D. Norton - 2015 - Theoria: Revista de Teoría, Historia y Fundamentos de la Ciencia 30 (2):229.
Replication without replicators.Bence Nanay - 2011 - Synthese 179 (3):455-477.

Analytics

Added to PP
2021-06-05


Author's Profile

Samuel C. Fletcher
University of Minnesota
