Prediction Error Property of the Lasso Estimator and its Generalization
Author(s) - Huang Fuchun
Publication year - 2003
Publication title - Australian and New Zealand Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 41
eISSN - 1467-842X
pISSN - 1369-1473
DOI - 10.1111/1467-842X.00277
Subject(s) - statistics , mathematics , estimator , mean squared error , ordinary least squares , lasso , bias of an estimator , generalization , admissibility
The lasso procedure is an estimator‐shrinkage and variable selection method. This paper shows that there always exists an interval of tuning parameter values for which the mean squared prediction error of the lasso estimator is smaller than that of the ordinary least squares estimator. For an estimator satisfying a condition such as unbiasedness, the paper defines a corresponding generalized lasso estimator, whose mean squared prediction error is shown to be smaller than that of the original estimator for tuning parameter values in some interval. This implies that no unbiased estimator is admissible under mean squared prediction error. Simulation results for five models support the theoretical results.
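The abstract's claim can be illustrated numerically. The sketch below is not the paper's simulation setup; it is a minimal Monte Carlo check under the simplifying assumption of an orthonormal design (where the lasso has the closed-form soft-thresholding solution), with illustrative values for the sample size, coefficients, and tuning parameter. It estimates the mean squared prediction error for OLS (tuning parameter 0) and for a small positive tuning parameter:

```python
import numpy as np

# Illustrative sketch (not the paper's exact setup): with an orthonormal
# design (X'X = I_p), the lasso estimate is the soft-threshold of the OLS
# coefficients, soft(b_ols, lam). We compare Monte Carlo estimates of the
# mean squared prediction error E||X*beta - X*b_hat||^2 at lam = 0 (OLS)
# and at a small positive lam.
rng = np.random.default_rng(0)
n, p, sigma = 20, 5, 1.0
beta = np.array([3.0, 1.5, 0.0, 0.0, 2.0])  # assumed sparse truth

# Orthonormal design via QR, so that X.T @ X = I_p.
X, _ = np.linalg.qr(rng.standard_normal((n, p)))

def mspe(lam, reps=2000):
    """Monte Carlo mean squared prediction error of the lasso at tuning lam."""
    total = 0.0
    for _ in range(reps):
        y = X @ beta + sigma * rng.standard_normal(n)
        b_ols = X.T @ y  # OLS coefficients under X'X = I
        # Soft-thresholding: the lasso solution for an orthonormal design.
        b = np.sign(b_ols) * np.maximum(np.abs(b_ols) - lam, 0.0)
        total += np.sum((X @ (beta - b)) ** 2)
    return total / reps

print("OLS MSPE:  ", mspe(0.0))   # close to p * sigma**2 = 5 in theory
print("lasso MSPE:", mspe(0.3))   # typically smaller for small lam
```

In repeated runs the small positive tuning parameter yields a lower estimated prediction error than OLS, consistent with the interval result stated in the abstract: shrinking the two truly zero coefficients cuts variance by more than the bias it introduces on the nonzero ones.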