Reduced rank ridge regression and its kernel extensions
Author(s) -
Mukherjee, Ashin;
Zhu, Ji
Publication year - 2011
Publication title -
Statistical Analysis and Data Mining: The ASA Data Science Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.381
H-Index - 33
eISSN - 1932-1872
pISSN - 1932-1864
DOI - 10.1002/sam.10138
Subject(s) - reproducing kernel hilbert space , rank (graph theory) , mathematics , kernel (algebra) , multivariate statistics , kernel method , linear regression , ridge , algorithm , computer science , statistics , hilbert space , artificial intelligence , support vector machine , combinatorics , mathematical analysis , paleontology , biology
In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This can arise from the correlation structure among the predictor variables or from the coefficient matrix itself being of low rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with a reduced rank constraint on the coefficient matrix to obtain a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. A novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) set‐up is also developed. © 2011 Wiley Periodicals, Inc. Statistical Analysis and Data Mining 4: 612–622, 2011
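The idea of combining a ridge penalty with a rank constraint can be sketched as follows: fit a standard ridge regression, then project the fitted values onto their leading singular directions to enforce the rank constraint. This is a minimal illustrative sketch, not the authors' exact algorithm; the function and parameter names (`reduced_rank_ridge`, `lam`, `rank`) are assumptions made for the example.

```python
import numpy as np

def reduced_rank_ridge(X, Y, lam=1.0, rank=2):
    """Sketch of a reduced rank ridge fit.

    1. Solve the ordinary ridge problem
       B_ridge = (X'X + lam*I)^{-1} X'Y.
    2. Project the fitted values X @ B_ridge onto the span of their
       top-`rank` right singular vectors, which yields a coefficient
       matrix of rank at most `rank`.
    """
    n, p = X.shape
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    Y_hat = X @ B_ridge
    # SVD of the ridge fitted values; keep the top-`rank` components.
    _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]   # rank-`rank` projector in response space
    return B_ridge @ P            # reduced rank coefficient matrix

# Usage on synthetic data whose true coefficient matrix has rank 2.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
B_true = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))
Y = X @ B_true + 0.1 * rng.standard_normal((100, 4))
B_hat = reduced_rank_ridge(X, Y, lam=0.5, rank=2)
print(B_hat.shape, np.linalg.matrix_rank(B_hat))
```

Because the rank constraint is imposed by a single SVD after an ordinary ridge solve, the procedure stays computationally straightforward, which is the property the abstract highlights.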
