Marginal screening of 2 × 2 tables in large‐scale case‐control studies
Author(s) - Ian W. McKeague, Min Qian
Publication year - 2019
Publication title - Biometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.298
H-Index - 130
eISSN - 1541-0420
pISSN - 0006-341X
DOI - 10.1111/biom.12957
Subject(s) - Bonferroni correction , permutation testing , multiple comparisons problem , false discovery rate , statistical hypothesis testing , resampling , null hypothesis , statistical power , Monte Carlo method , type I and type II errors , data mining , statistics , mathematics , computer science , genetics , gene , biology
Summary - Assessing the statistical significance of risk factors when screening large numbers of 2 × 2 tables that cross‐classify disease status with each type of exposure poses a challenging multiple testing problem. The problem is especially acute in large‐scale genomic case‐control studies. We develop a potentially more powerful and computationally efficient approach (compared with existing methods, including Bonferroni and permutation testing) by taking into account the presence of complex dependencies between the 2 × 2 tables. Our approach gains its power by exploiting Monte Carlo simulation from the estimated null distribution of a maximally selected log‐odds ratio. We apply the method to case‐control data from a study of a large collection of genetic variants related to the risk of early onset stroke.
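The general idea described in the summary, screening many 2 × 2 tables by the maximally selected log‐odds ratio and calibrating it by Monte Carlo simulation from an estimated Gaussian null, can be sketched as below. This is an illustrative simplification, not the paper's actual procedure: the function names are hypothetical, a 0.5 continuity correction is assumed, and the simulated null treats the standardized log‐odds ratios as independent, whereas the paper's method estimates and exploits the dependence between tables.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_odds_ratio(table):
    # table = [[a, b], [c, d]]; 0.5 continuity correction avoids log(0)
    a, b, c, d = np.asarray(table, dtype=float).ravel() + 0.5
    return np.log(a * d / (b * c))

def max_lor_screen(tables, n_sim=10000):
    """Hypothetical sketch: Monte Carlo p-value for the maximally
    selected absolute log-odds ratio across many 2x2 tables."""
    tables = np.asarray(tables, dtype=float)          # shape (m, 2, 2)
    lors = np.array([log_odds_ratio(t) for t in tables])
    # Large-sample standard error of each log-odds ratio
    # (Woolf formula, with the same continuity correction)
    se = np.sqrt((1.0 / (tables + 0.5)).sum(axis=(1, 2)))
    observed = np.max(np.abs(lors))
    # Simulate the null maximum: each standardized LOR ~ N(0, 1).
    # The paper's method would replace independence here with an
    # estimated dependence structure between the tables.
    sims = np.abs(rng.standard_normal((n_sim, len(tables))) * se)
    p_value = np.mean(sims.max(axis=1) >= observed)
    return observed, p_value
```

A usage example: passing a list of exposure-by-disease tables returns the observed maximal |log-odds ratio| and a single screening p-value, in place of per-table Bonferroni-adjusted tests.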
