Iterative Bias Correction of the Cross‐Validation Criterion
Author(s) - Hirokazu Yanagihara, Hironori Fujisawa
Publication year - 2012
Publication title -
Scandinavian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/j.1467-9469.2011.00754.x
Subject(s) - mathematics, jackknife resampling, Bayesian information criterion, estimator, bias of an estimator, cross-validation, statistics, minimum variance unbiased estimator
Abstract - The cross-validation (CV) criterion is known to be a second-order unbiased estimator of the risk function measuring the discrepancy between the candidate model and the true model, as are the generalized information criterion (GIC) and the extended information criterion (EIC). In the present article, we show that a 2k-th-order unbiased estimator can be obtained from a linear combination of the leave-one-out CV criterion through the leave-k-out CV criterion. The proposed scheme is unique in that a bias smaller than that of a jackknife method can be obtained without any analytic calculation, that is, it is not necessary to derive the explicit form of several terms in an asymptotic expansion of the bias. Furthermore, the proposed criterion can be regarded as a finite correction of a bias-corrected CV criterion by using scalar coefficients in a bias-corrected EIC obtained by the bootstrap iteration.
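
The abstract does not state the combination weights explicitly; purely as an illustration of the form described above, the proposed criterion can be sketched as a weighted sum of leave-d-out CV criteria, where the coefficients c_1, ..., c_k are hypothetical placeholders for the weights derived in the paper:

% Illustrative form only; the c_d below are placeholders, not the paper's derived weights.
\[
  \widehat{\mathrm{CV}}_{(2k)} \;=\; \sum_{d=1}^{k} c_d \, \mathrm{CV}_{(d)},
\]
where \(\mathrm{CV}_{(d)}\) denotes the leave-d-out cross-validation criterion. Presumably the weights satisfy \(\sum_{d=1}^{k} c_d = 1\) so that the combination still estimates the same risk, with the c_d chosen so that the lower-order bias terms cancel, yielding the stated 2k-th-order unbiasedness.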
