The effect of reduction in cross‐validation intervals on the performance of multifactor dimensionality reduction
Author(s) -
Motsinger, Alison A.,
Ritchie, Marylyn D.
Publication year - 2006
Publication title -
genetic epidemiology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.301
H-Index - 98
eISSN - 1098-2272
pISSN - 0741-0395
DOI - 10.1002/gepi.20166
Subject(s) - dimensionality reduction , reduction (mathematics) , multifactor dimensionality reduction , statistics , computer science , mathematics , econometrics , biology , artificial intelligence , genetics , genotype , geometry , single nucleotide polymorphism , gene
Multifactor Dimensionality Reduction (MDR) was developed to detect genetic polymorphisms that confer an increased risk of disease. Cross‐validation (CV) is an important part of the MDR algorithm, as it prevents over‐fitting and allows the predictive ability of a model to be evaluated; it is also a computationally intensive step. Traditionally, MDR has been implemented using 10‐fold CV. To reduce computation time and thereby allow MDR analysis to be applied to larger datasets, we evaluated the possibility of eliminating or reducing the number of CV intervals used for analysis. We found that eliminating CV made final model selection impossible, but that reducing the number of CV intervals from ten to five caused no loss of power, cutting the computation time of the algorithm in half. The validity of this reduction was confirmed with data from an Alzheimer's disease (AD) study. Genet. Epidemiol. 2006. © 2006 Wiley‐Liss, Inc.
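To illustrate why the number of CV intervals dominates runtime, the sketch below implements generic k‐fold cross‐validation: each candidate model is fit once per interval, so the cost of the CV step scales linearly with k, and moving from 10‐fold to 5‐fold CV halves it. This is a minimal illustration, not the MDR implementation; the `MajorityClassifier`, the fold‐splitting helper, and the toy genotype data are hypothetical stand‐ins.

```python
import random


def k_fold_indices(n_samples, k, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]


class MajorityClassifier:
    """Trivial placeholder model: predicts the most common training label."""

    def fit(self, X, y):
        self.label = max(set(y), key=y.count)

    def score(self, X, y):
        return sum(1 for yi in y if yi == self.label) / len(y)


def cross_validate(model, data, labels, k):
    """Mean testing accuracy over k CV intervals (k model fits in total)."""
    folds = k_fold_indices(len(data), k)
    accuracies = []
    for i, test_idx in enumerate(folds):
        # Train on every fold except the held-out one.
        train_idx = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        model.fit([data[j] for j in train_idx], [labels[j] for j in train_idx])
        accuracies.append(model.score([data[j] for j in test_idx],
                                      [labels[j] for j in test_idx]))
    return sum(accuracies) / k


# Toy case/control dataset: 200 samples, 3 SNPs coded 0/1/2.
data = [[random.randint(0, 2) for _ in range(3)] for _ in range(200)]
labels = [random.randint(0, 1) for _ in range(200)]

# 5-fold CV performs half as many fit/score passes as 10-fold CV.
print(cross_validate(MajorityClassifier(), data, labels, k=5))
print(cross_validate(MajorityClassifier(), data, labels, k=10))
```

Because MDR repeats this CV loop for every combination of factors it evaluates, any reduction in the number of intervals is multiplied across the whole combinatorial search, which is why halving k halves the total analysis time.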
