An Additive Sparse Penalty for Variable Selection in High-Dimensional Linear Regression Model
Author(s) - Sang-In Lee
Publication year - 2015
Publication title - Communications for Statistical Applications and Methods
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.326
H-Index - 6
eISSN - 2383-4757
pISSN - 2287-7843
DOI - 10.5351/csam.2015.22.2.147
Subject(s) - feature selection, mathematics, linear regression, statistics, regression analysis, variable (mathematics), regression, model selection, linear model, econometrics, computer science, artificial intelligence, mathematical analysis
We consider a sparse high-dimensional linear regression model. Penalized methods using the LASSO or nonconvex penalties have been widely used for variable selection and estimation in high-dimensional regression models. In penalized regression, the selection and prediction performance depend on which penalty function is used. For example, it is known that the LASSO has good prediction performance but tends to select more variables than necessary. In this paper, we propose an additive sparse penalty for variable selection that combines the LASSO and the minimax concave penalty (MCP). The proposed penalty is designed to inherit the good properties of both the LASSO and the MCP. We develop an efficient algorithm that computes the proposed estimator by combining the concave-convex procedure (CCCP) with a coordinate descent algorithm. Numerical studies show that the proposed method has better selection and prediction performance than other penalized methods.
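To make the idea concrete, here is a minimal NumPy sketch of an additive LASSO + MCP penalty and of the kind of CCCP plus coordinate-descent solver the abstract describes. The function names, the tuning values, and the exact decomposition of the MCP into a convex-minus-convex form are illustrative assumptions, not the paper's actual specification.

```python
import numpy as np

def mcp_penalty(t, lam, gamma):
    """Minimax concave penalty (MCP): rises like lam*|t| near zero,
    then flattens to the constant gamma*lam**2/2 beyond |t| = gamma*lam."""
    a = np.abs(t)
    return np.where(a <= gamma * lam,
                    lam * a - a ** 2 / (2.0 * gamma),
                    0.5 * gamma * lam ** 2)

def additive_penalty(t, lam1, lam2, gamma):
    """Illustrative additive combination of a LASSO term and an MCP term."""
    return lam1 * np.abs(t) + mcp_penalty(t, lam2, gamma)

def soft_threshold(z, lam):
    """Soft-thresholding operator used in LASSO-type coordinate updates."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def fit_additive_penalty(X, y, lam1, lam2, gamma, n_outer=5, n_inner=100):
    """Sketch of a CCCP + coordinate-descent solver for
        (1/2n)||y - X b||^2 + lam1*||b||_1 + sum_j MCP(b_j; lam2, gamma).
    The MCP is split as lam2*|t| - h(t) with h convex; each CCCP step
    linearizes h at the current iterate, leaving a convex weighted-LASSO
    subproblem solved by coordinate descent with soft-thresholding."""
    n, p = X.shape
    beta = np.zeros(p)
    a = (X ** 2).sum(axis=0) / n          # per-coordinate curvature
    for _ in range(n_outer):
        # Gradient of the convex part h subtracted from the MCP:
        # h'(t) = sign(t) * min(|t| / gamma, lam2)
        g = np.sign(beta) * np.minimum(np.abs(beta) / gamma, lam2)
        for _ in range(n_inner):
            for j in range(p):
                r_j = y - X @ beta + X[:, j] * beta[j]   # partial residual
                z = X[:, j] @ r_j / n + g[j]
                beta[j] = soft_threshold(z, lam1 + lam2) / a[j]
    return beta
```

Near zero the linearized correction `g` vanishes, so small coefficients feel the full `lam1 + lam2` shrinkage (LASSO-like sparsity); for large coefficients `g` cancels the `lam2` part, reducing the bias in the way MCP is designed to.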
