Estimating the prevalence of missing experiments in a neuroimaging meta‐analysis
Author(s) - Samartsidis Pantelis, Montagna Silvia, Laird Angela R., Fox Peter T., Johnson Timothy D., Nichols Thomas E.
Publication year - 2020
Publication title - Research Synthesis Methods
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.376
H-Index - 35
eISSN - 1759-2887
pISSN - 1759-2879
DOI - 10.1002/jrsm.1448
Subject(s) - meta-analysis, human connectome project, computer science, neuroimaging, missing data, contrast (vision), publication bias, code (set theory), interpretation (philosophy), statistics, data mining, artificial intelligence, machine learning, medicine, psychology, pathology, mathematics, programming language, set (abstract data type), neuroscience, psychiatry, functional connectivity
Coordinate-based meta-analyses (CBMA) allow researchers to combine the results from multiple functional magnetic resonance imaging experiments with the goal of obtaining results that are more likely to generalize. However, the interpretation of CBMA findings can be impaired by the file drawer problem, a type of publication bias that refers to experiments that are carried out but are not published. Using foci per contrast count data from the BrainMap database, we propose a zero-truncated modeling approach that allows us to estimate the prevalence of nonsignificant experiments. We validate our method with simulations and real coordinate data generated from the Human Connectome Project. Application of our method to the data from BrainMap provides evidence for the existence of a file drawer effect, with the rate of missing experiments estimated as at least 6 per 100 reported. The R code that we used is available at https://osf.io/ayhfv/.
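The core idea of a zero-truncated approach can be illustrated with the simplest such model, a zero-truncated Poisson: experiments yielding zero significant foci never enter the database, so we fit the Poisson rate from the observed (all-positive) counts and back out the implied fraction of unseen zero-count experiments. This is a minimal sketch of that general idea, not the authors' actual model (the paper's method, available at the OSF link above, is more elaborate); the function names and the Newton-iteration fitting routine here are illustrative choices.

```python
import math

def fit_ztp(counts, tol=1e-10, max_iter=100):
    """Fit a zero-truncated Poisson to positive counts.

    The MLE for the rate lam solves mean(counts) = lam / (1 - exp(-lam));
    we find the root with Newton's method, starting from the sample mean.
    """
    m = sum(counts) / len(counts)
    lam = m  # starting value; the true root is always below the mean
    for _ in range(max_iter):
        e = math.exp(-lam)
        f = lam / (1 - e) - m
        df = (1 - e - lam * e) / (1 - e) ** 2  # d/dlam of lam/(1-e^{-lam})
        step = f / df
        lam -= step
        if abs(step) < tol:
            break
    return lam

def missing_per_reported(lam):
    """Estimated unobserved zero-count experiments per observed one:
    P(K=0) / P(K>0) = exp(-lam) / (1 - exp(-lam))."""
    p0 = math.exp(-lam)
    return p0 / (1 - p0)

# Hypothetical foci-per-contrast counts (all >= 1 by construction):
counts = [1, 1, 2, 2, 3, 4, 6]
lam = fit_ztp(counts)
rate = missing_per_reported(lam)
print(f"lam = {lam:.4f}, missing per reported = {rate:.4f}")
```

A smaller fitted rate implies a larger probability mass at zero, and hence a larger estimated file drawer, which is the qualitative mechanism behind the "at least 6 per 100 reported" figure in the abstract.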
