
A Brief Guide to Evaluate Replications
Author(s) - Etienne P. LeBel, Wolf Vanpaemel, Irene Cheung, Lorne Campbell
Publication year - 2019
Publication title - Meta-Psychology
Language(s) - English
Resource type - Journals
ISSN - 2003-2714
DOI - 10.15626/mp.2018.843
Subject(s) - replication (statistics), transparency (behavior), independence (probability theory), computer science, similarity (geometry), research design, psychology, statistics, artificial intelligence, mathematics, computer security, image (mathematics)
The importance of replication is increasingly appreciated; however, considerably less consensus exists about how to evaluate the design and results of replications. We make concrete recommendations for evaluating replications with more nuance than is typical in the current literature. We highlight six study characteristics that are crucial for evaluating replications: replication method similarity, replication differences, investigator independence, method/data transparency, analytic result reproducibility, and evidence for the plausibility of auxiliary hypotheses. We also recommend a more nuanced approach to the statistical interpretation of replication results at the individual-study and meta-analytic levels, and propose clearer language for communicating replication results.
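To make the individual-study interpretation concrete, below is a minimal sketch (not the authors' exact procedure) of a confidence-interval-based labelling scheme in this spirit: a replication is treated as a "signal" when its CI excludes zero, and as "consistent" with the original when its CI contains the original point estimate. The function name, data structure, and numeric values are illustrative assumptions, and effect sizes are assumed to be on a common metric (e.g., Cohen's d) with a 95% CI available for the replication estimate.

```python
# Sketch of a CI-based interpretation of an individual replication result.
# All names and example values are hypothetical illustrations, not the
# published procedure verbatim.

from dataclasses import dataclass


@dataclass
class Estimate:
    point: float    # effect-size point estimate
    ci_low: float   # lower bound of the 95% CI
    ci_high: float  # upper bound of the 95% CI


def label_replication(original: Estimate, replication: Estimate) -> str:
    """Label a replication result relative to the original effect.

    'Signal' = the replication CI excludes zero.
    'Consistent' = the replication CI contains the original point estimate.
    """
    signal = not (replication.ci_low <= 0.0 <= replication.ci_high)
    consistent = replication.ci_low <= original.point <= replication.ci_high

    if signal and consistent:
        return "signal, consistent with original"
    if signal and not consistent:
        direction = "larger" if replication.point > original.point else "smaller"
        return f"signal, inconsistent with original ({direction} effect)"
    if not signal and consistent:
        return "no signal, but consistent with original (possibly low precision)"
    return "no signal, inconsistent with original"


# Hypothetical example: original d = 0.50 [0.20, 0.80]; replication d = 0.10 [-0.05, 0.25].
print(label_replication(Estimate(0.50, 0.20, 0.80), Estimate(0.10, -0.05, 0.25)))
# -> "no signal, inconsistent with original"
```

The design choice here is that a single p-value cannot distinguish these cases: comparing the replication CI against both zero and the original estimate separates "no evidence of an effect" from "evidence of a smaller (or larger) effect than originally reported," which is the kind of nuance the abstract calls for.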