Regression by L1 regularization of smart contrasts and sums (ROSCAS) beats PLS and elastic net in latent variable model
Author(s) -
Cajo J. F. ter Braak
Publication year - 2009
Publication title -
Journal of Chemometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.47
H-Index - 92
eISSN - 1099-128X
pISSN - 0886-9383
DOI - 10.1002/cem.1213
Subject(s) - elastic net regularization, latent variable, partial least squares regression, lasso, ridge regression, regression analysis, regularization, linear regression, statistics, mathematics
This paper proposes a regression method, ROSCAS, which regularizes smart contrasts and sums of regression coefficients by an L1 penalty. The contrasts and sums are based on the sample correlation matrix of the predictors and are suggested by a latent variable regression model. The contrasts express the idea that a priori correlated predictors should have similar coefficients. The method has excellent predictive performance in situations where there are groups of predictors, with each group representing an independent feature that influences the response. In particular, when the groups differ in size, ROSCAS can outperform LASSO, elastic net, partial least squares (PLS) and ridge regression by a factor of two or three in terms of mean squared error. In other simulation setups and on real data, ROSCAS performs competitively. Copyright © 2009 John Wiley & Sons, Ltd.
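The abstract describes the construction only at a high level, so the Python sketch below is one plausible reading rather than the paper's algorithm: it assumes the "smart contrasts and sums" are the eigenvector directions of the sample correlation matrix of the predictors, and that the L1 penalty is applied to the coefficients in those rotated coordinates. The function name roscas_like_fit and all parameter choices are hypothetical.

# Speculative sketch only: the paper's exact "smart contrasts and sums" are not
# specified in this abstract. Here we ASSUME they are the eigenvector directions
# of the predictor correlation matrix, so the method reduces to an ordinary
# lasso in rotated coordinates.
import numpy as np
from sklearn.linear_model import Lasso

def roscas_like_fit(X, y, alpha=0.1):
    """L1-penalize contrasts and sums of coefficients (illustrative only).

    Reparameterize beta = V @ gamma, where the columns of V are eigenvectors
    of the sample correlation matrix of the predictors, and run a plain lasso
    on the rotated design X @ V so the L1 penalty falls on gamma (the
    contrasts and sums) rather than on beta itself. Any eigenvalue-dependent
    weighting of the penalty, which the paper may use, is omitted here.
    """
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize predictors
    R = np.corrcoef(Xs, rowvar=False)           # sample correlation matrix
    _, V = np.linalg.eigh(R)                    # contrast/sum directions (assumed)
    Z = Xs @ V                                  # rotated predictors
    model = Lasso(alpha=alpha).fit(Z, y)
    beta = V @ model.coef_                      # back-transform to original scale
    return beta, model.intercept_

# Toy check mirroring the abstract's setup: two independent latent features
# driving groups of correlated predictors that differ in size (7 vs 3).
rng = np.random.default_rng(0)
n = 200
latent = rng.normal(size=(n, 2))
X = np.hstack([latent[:, [0]] + 0.3 * rng.normal(size=(n, 7)),
               latent[:, [1]] + 0.3 * rng.normal(size=(n, 3))])
y = latent @ np.array([1.0, -1.0]) + 0.1 * rng.normal(size=n)
beta, intercept = roscas_like_fit(X, y, alpha=0.05)
print(np.round(beta, 2))   # coefficients within each group should come out similar

In this toy setup the rotation concentrates each latent feature's signal in a few rotated directions, so the L1 penalty shrinks noise directions while keeping coefficients within a correlated group close to one another, which is the behavior the abstract attributes to the contrasts.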
