Efficient approximate k-fold and leave-one-out cross-validation for ridge regression
Author(s) -
Meijer Rosa J.,
Goeman Jelle J.
Publication year - 2013
Publication title -
biometrical journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.108
H-Index - 63
eISSN - 1521-4036
pISSN - 0323-3847
DOI - 10.1002/bimj.201200088
Subject(s) - resampling, cross validation, computer science, algorithm, generalized linear model, mathematics, contrast (vision), regression, linear regression, ridge, mathematical optimization, statistics, artificial intelligence, paleontology, biology
In model building and model evaluation, cross-validation is a frequently used resampling method. Unfortunately, this method can be quite time consuming. In this article, we discuss an approximation method that is much faster and can be used in generalized linear models and in Cox's proportional hazards model with a ridge penalty term. Our approximation method is based on a Taylor expansion around the estimate of the full model. In this way, all cross-validated estimates are approximated without refitting the model. The tuning parameter can then be chosen based on these approximations and optimized in less time. The method is most accurate when approximating leave-one-out cross-validation results for large data sets, which is otherwise the most computationally demanding situation. To demonstrate the method's performance, it is applied to several microarray data sets. An R package, penalized, which implements the method, is available on CRAN.
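To illustrate the core idea of avoiding n refits, the sketch below shows the classical closed-form leave-one-out shortcut for plain linear ridge regression, where the LOO residual is the full-model residual divided by (1 - H_ii) with H the ridge hat matrix. This is a standard textbook result, not the paper's Taylor-expansion method (which extends the same idea to GLMs and Cox models, where no closed form exists); function names are illustrative.

```python
import numpy as np

def ridge_loo_press(X, y, lam):
    """Leave-one-out mean squared error for linear ridge regression,
    computed from a single fit of the full model.

    Uses the identity: LOO residual_i = residual_i / (1 - H_ii),
    where H = X (X'X + lam*I)^{-1} X' is the ridge hat matrix.
    """
    n, p = X.shape
    A = X.T @ X + lam * np.eye(p)
    H = X @ np.linalg.solve(A, X.T)          # ridge hat matrix, n x n
    resid = y - H @ y                         # full-model residuals
    loo_resid = resid / (1.0 - np.diag(H))    # exact LOO residuals, no refits
    return np.mean(loo_resid ** 2)

def ridge_loo_brute(X, y, lam):
    """Naive leave-one-out: refit the ridge model n times (for comparison)."""
    n, p = X.shape
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        A = X[mask].T @ X[mask] + lam * np.eye(p)
        beta = np.linalg.solve(A, X[mask].T @ y[mask])
        errs.append((y[i] - X[i] @ beta) ** 2)
    return np.mean(errs)
```

For fixed lambda, the two functions agree to numerical precision, but the shortcut costs one fit instead of n; sweeping lambda over a grid with `ridge_loo_press` is then the cheap way to tune the penalty, which is the role the paper's Taylor approximation plays in the penalized-likelihood setting.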
