Adaptive Posterior Mode Estimation of a Sparse Sequence for Model Selection
Author(s) - Sylvain Sardy
Publication year - 2009
Publication title - Scandinavian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/j.1467-9469.2009.00654.x
Subject(s) - mathematics , statistics , model selection , lasso , estimator , hyperparameter , smoothing , parametric model , parametric statistics , Gaussian , algorithm , mathematical optimization , computer science
Abstract. For the problem of estimating a sparse sequence of coefficients of a parametric or non‐parametric generalized linear model, posterior mode estimation with a Subbotin(λ, ν) prior achieves thresholding, and therefore model selection, when ν ∈ [0,1] for a class of likelihood functions. The proposed estimator also offers a continuum between the (forward/backward) best subset estimator (ν = 0), its approximate convexification called the lasso (ν = 1), and ridge regression (ν = 2). Rather than fixing ν, selecting the two hyperparameters λ and ν adds flexibility for a better fit, provided both are well selected from the data. Considering first the canonical Gaussian model, we generalize the Stein unbiased risk estimate, SURE(λ, ν), to the situation where the thresholding function is not almost differentiable (i.e. ν < 1). We then propose a more general selection of λ and ν by deriving an information criterion that can be employed, for instance, for the lasso or wavelet smoothing. We investigate some asymptotic properties in parametric and non‐parametric settings. Simulations and applications to real data show excellent performance.
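To make the continuum concrete, the sketch below (not from the paper; a minimal illustration of the stated special cases) computes the posterior mode for the canonical Gaussian model, i.e. the minimizer of 0.5·(y − θ)² + λ|θ|^ν, at the three landmark values ν = 0 (hard thresholding / best subset), ν = 1 (soft thresholding / lasso), and ν = 2 (ridge shrinkage); general ν would require a numerical solver.

```python
import numpy as np

def subbotin_mode(y, lam, nu):
    """Posterior mode under a Subbotin(lam, nu) prior for y ~ N(theta, 1):
    argmin_theta 0.5*(y - theta)**2 + lam*|theta|**nu.
    Closed forms are given only for the landmark values nu in {0, 1, 2}."""
    y = np.asarray(y, dtype=float)
    if nu == 0:
        # Best subset / hard thresholding: penalty lam * 1{theta != 0},
        # so keep y whenever 0.5*y**2 exceeds lam, else set to zero.
        return np.where(0.5 * y**2 > lam, y, 0.0)
    if nu == 1:
        # Lasso / soft thresholding: shrink |y| by lam toward zero.
        return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)
    if nu == 2:
        # Ridge: linear shrinkage, no thresholding (no exact zeros).
        return y / (1.0 + 2.0 * lam)
    raise NotImplementedError("intermediate nu requires a numerical solver")
```

For ν ∈ (0, 1) the penalty is non-convex and the mode still thresholds (as the abstract states), but it has no simple closed form, which is why only the three boundary cases are coded here.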