Local convergence of alternating low‐rank optimization methods with overrelaxation
Author(s) -
Oseledets Ivan V.,
Rakhuba Maxim V.,
Uschmajew André
Publication year - 2023
Publication title -
Numerical Linear Algebra with Applications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.02
H-Index - 53
eISSN - 1099-1506
pISSN - 1070-5325
DOI - 10.1002/nla.2459
Subject(s) - mathematics, mathematical optimization, mathematical analysis, hessian matrix, positive definite matrix, semidefinite programming, rank, relaxation, rate of convergence, convergence, linearization, quotient, nonlinear system, eigenvalues and eigenvectors
The local convergence of alternating optimization methods with overrelaxation for low‐rank matrix and tensor problems is established. The analysis is based on the linearization of the method, which takes the form of an SOR iteration for a positive semidefinite Hessian and can be studied in the corresponding quotient geometry of equivalent low‐rank representations. In the matrix case, the optimal relaxation parameter for accelerating the local convergence can be determined from the convergence rate of the standard method. This result relies on a version of Young's SOR theorem for positive semidefinite 2 × 2 block systems.
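The following sketch illustrates the kind of scheme the abstract describes: alternating least squares (ALS) for a rank-r matrix approximation, with each half-step replaced by an overrelaxed update. It is a minimal illustration, not the authors' algorithm; the test matrix, the rank, and the helper young_omega are assumptions made for the example. The relation omega = 2 / (1 + sqrt(1 - rho)) is the classical Young formula for a two-block SOR splitting, with rho the convergence rate of the standard (omega = 1) alternating method.

```python
# Minimal sketch (not the authors' exact method): overrelaxed alternating
# least squares for min ||A - X @ Y.T||_F over rank-r factors X, Y.
import numpy as np

def young_omega(rho):
    # Classical Young formula for a two-block SOR splitting: given the
    # convergence rate rho of the standard (omega = 1) alternating
    # iteration, the optimal relaxation parameter is 2 / (1 + sqrt(1 - rho)).
    return 2.0 / (1.0 + np.sqrt(1.0 - rho))

def als_sor(A, r, omega=1.0, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.standard_normal((m, r))
    Y = rng.standard_normal((n, r))
    for _ in range(iters):
        # Exact minimizer in X for fixed Y, then relax towards it.
        X_new = A @ Y @ np.linalg.pinv(Y.T @ Y)
        X = X + omega * (X_new - X)
        # Exact minimizer in Y for fixed X, then relax towards it.
        Y_new = A.T @ X @ np.linalg.pinv(X.T @ X)
        Y = Y + omega * (Y_new - Y)
    return X, Y

# Toy comparison against the best rank-r error from the truncated SVD.
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 40))
r = 5
U, s, Vt = np.linalg.svd(A, full_matrices=False)
best = np.linalg.norm(A - (U[:, :r] * s[:r]) @ Vt[:r])
print(f"best rank-{r} residual (SVD): {best:.6f}")
for omega in (1.0, 1.3):
    X, Y = als_sor(A, r, omega=omega)
    print(f"omega = {omega:.1f}: residual = {np.linalg.norm(A - X @ Y.T):.6f}")
```

For omega = 1 the sketch reduces to plain ALS; with omega above 1 the half-steps overshoot the exact block minimizers, which can accelerate local convergence when omega is near the optimum. Using young_omega in practice presupposes an estimate of rho, for instance from the observed residual decay of the standard method.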