A survey of publication practices of single‐case design researchers when treatments have small or large effects
Author(s) -
William R. Shadish,
Nicole A. M. Zelinsky,
Jack L. Vevea,
Thomas R. Kratochwill
Publication year - 2016
Publication title -
Journal of Applied Behavior Analysis
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.1
H-Index - 76
eISSN - 1938-3703
pISSN - 0021-8855
DOI - 10.1002/jaba.308
Subject(s) - publication bias , psychology , clinical study design , medline , research design , medicine , meta analysis , social science , clinical trial , political science , sociology , law , pathology
The published literature often underrepresents studies that do not find evidence for a treatment effect; this is often called publication bias. Literature reviews that fail to include such studies may overestimate the size of an effect. Only a few studies have examined publication bias in single-case design (SCD) research, but those studies suggest that publication bias may occur. This study surveyed SCD researchers about their publication preferences in response to simulated SCD results showing a range of small to large effects. Results suggest that SCD researchers are more likely to submit manuscripts that show large effects for publication and, when acting as reviewers, are more likely to recommend acceptance of manuscripts that show large effects. A nontrivial minority of SCD researchers (4% to 15%) would drop one or two cases from a study if the effect size were small and then submit it for publication. This article ends with a discussion of implications for publication practices in SCD research.