Preconditioning for classical relationships: a note relating ridge regression and OLS p-values to preconditioned sparse penalized regression
Author(s) - Karl Rohe
Publication year - 2015
Publication title - stat
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.61
H-Index - 18
ISSN - 2049-1573
DOI - 10.1002/sta4.86
Subject(s) - mathematics, preconditioner, lasso (programming language), ordinary least squares, orthonormal basis, design matrix, combinatorics, algorithm, linear regression, statistics, computer science, iterative method, physics, quantum mechanics, world wide web
When the design matrix has orthonormal columns, "soft thresholding" the ordinary least squares solution produces the Lasso solution. If one uses the Puffer preconditioned Lasso, this result generalizes from orthonormal designs to full-rank designs (Theorem 1). Theorem 2 refines the Puffer preconditioner to make the Lasso select the same model as removing the elements of the ordinary least squares solution with the largest p-values. Using a generalized Puffer preconditioner, Theorem 3 relates ridge regression to the preconditioned Lasso; this result is for the high-dimensional setting, p > n. Where the standard Lasso is akin to forward selection, Theorems 1, 2, and 3 suggest that the preconditioned Lasso is more akin to backward elimination. These results extend to sparse penalties beyond ℓ1; for a broad class of sparse and non-convex techniques (e.g. SCAD and MC+), the results hold for all local minima. Copyright © 2015 John Wiley & Sons, Ltd.
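The orthonormal-design equivalence in Theorem 1 can be checked numerically. The sketch below assumes the Puffer preconditioner takes the commonly cited SVD form F = U D^{-1} U^T (with X = U D V^T), so that FX = U V^T has orthonormal columns; the synthetic data, penalty level `lam`, and the simple coordinate-descent solver are illustrative choices, not part of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 50, 5, 0.3
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Puffer preconditioner F = U D^{-1} U^T from the thin SVD X = U D V^T
# (assumed form); after preconditioning, Z = FX = U V^T has orthonormal columns.
U, d, Vt = np.linalg.svd(X, full_matrices=False)
F = U @ np.diag(1.0 / d) @ U.T
Z, w = F @ X, F @ y

def soft(x, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Lasso on the preconditioned problem via coordinate descent:
#   minimize 0.5 * ||w - Z b||^2 + lam * ||b||_1
b = np.zeros(p)
for _ in range(200):
    for j in range(p):
        r_j = Z[:, j] @ (w - Z @ b) + b[j] * (Z[:, j] @ Z[:, j])
        b[j] = soft(r_j, lam) / (Z[:, j] @ Z[:, j])

# Theorem 1: the preconditioned Lasso solution should equal
# soft-thresholding of the ordinary least squares solution.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.allclose(b, soft(beta_ols, lam)))  # expect True
```

Because Z has orthonormal columns, the coordinate updates decouple and the solver converges in a single sweep to soft(Z^T w, lam), and Z^T w coincides with the OLS coefficients for full-rank X.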
