Two‐level preconditioning for Ridge Regression
Author(s) - Joris Tavernier, Jaak Simm, Karl Meerbergen, Yves Moreau
Publication year - 2021
Publication title - Numerical Linear Algebra with Applications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.02
H-Index - 53
eISSN - 1099-1506
pISSN - 1070-5325
DOI - 10.1002/nla.2371
Subject(s) - preconditioner, conjugate gradient method, mathematics, covariance matrix, Cholesky decomposition, eigenvalues and eigenvectors, principal component regression, cluster analysis, algorithm, matrix (chemical analysis), linear regression, iterative method, mathematical optimization, statistics, physics, materials science, quantum mechanics, composite material
Solving linear systems is often the computational bottleneck in real-life problems. Iterative solvers are frequently the only option, either because direct algorithms are too expensive or because the system matrix is not explicitly available. Here, we develop a two-level preconditioner for regularized least squares linear systems involving a feature or data matrix. Variants of this linear system may appear in machine learning applications, such as ridge regression, logistic regression, support vector machines, and Bayesian regression. We use clustering algorithms to create a coarser level that preserves the principal components of the covariance or Gram matrix. This coarser level approximates the dominant eigenvectors and is used to build a subspace preconditioner that accelerates the Conjugate Gradient method. We observe speed-ups for both artificial and real-life data.
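
As a rough illustration of the idea in the abstract (a minimal sketch, not the authors' implementation), the code below solves the ridge normal equations (A^T A + lam I) x = A^T b with SciPy's Conjugate Gradient. It clusters the feature columns with k-means, turns the clusters into a coarse basis of normalized indicator vectors, and combines a Jacobi fine-level sweep with an exact coarse-space correction into an additive two-level preconditioner. All problem sizes, the penalty lam, and the cluster count are illustrative assumptions.

```python
# Sketch of a two-level preconditioner for ridge regression (illustrative,
# not the authors' reference code).
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic data with groups of correlated features, so that A^T A has a few
# dominant eigenvalues that the clustered coarse level can capture.
n_samples, n_features, n_groups = 2000, 500, 25
centers = rng.standard_normal((n_samples, n_groups))
assign = rng.integers(0, n_groups, size=n_features)
A = centers[:, assign] + 0.1 * rng.standard_normal((n_samples, n_features))
b = rng.standard_normal(n_samples)
lam = 1.0  # ridge penalty (assumed value)

# Matrix-free action of the SPD system matrix K = A^T A + lam*I.
def K_matvec(x):
    return A.T @ (A @ x) + lam * x

K = LinearOperator((n_features, n_features), matvec=K_matvec)
rhs = A.T @ b

# Coarse level: k-means on the columns of A groups correlated features; each
# cluster gives one normalized indicator vector, spanning a coarse space meant
# to approximate the dominant eigenvectors of the covariance matrix A^T A.
labels = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit_predict(A.T)
Z = np.zeros((n_features, n_groups))
Z[np.arange(n_features), labels] = 1.0
Z /= np.linalg.norm(Z, axis=0)  # columns are orthonormal (clusters disjoint)

# Galerkin coarse operator E = Z^T K Z, factorized once.
AZ = A @ Z
E = AZ.T @ AZ + lam * np.eye(n_groups)
E_inv = np.linalg.inv(E)

# Additive two-level preconditioner: Jacobi on the fine level plus an exact
# coarse-space correction, M^{-1} r = D^{-1} r + Z E^{-1} Z^T r (SPD).
d = np.einsum("ij,ij->j", A, A) + lam  # diag(K)
def M_matvec(r):
    return r / d + Z @ (E_inv @ (Z.T @ r))

M = LinearOperator((n_features, n_features), matvec=M_matvec)

# Compare CG iteration counts with and without the two-level preconditioner.
def run(precond=None):
    it = [0]
    x, info = cg(K, rhs, M=precond, callback=lambda xk: it.__setitem__(0, it[0] + 1))
    return it[0], info

print("plain CG iterations:    ", run()[0])
print("two-level CG iterations:", run(M)[0])
```

On data with this kind of low-rank cluster structure, the coarse correction deflates the large eigenvalues of A^T A and the preconditioned CG typically needs far fewer iterations than plain CG; on unstructured data the gain would be smaller.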
