Nonlinearly preconditioned L‐BFGS as an acceleration mechanism for alternating least squares with application to tensor decomposition
Author(s) - Hans De Sterck, Alexander J. M. Howse
Publication year - 2018
Publication title - Numerical Linear Algebra with Applications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.02
H-Index - 53
eISSN - 1099-1506
pISSN - 1070-5325
DOI - 10.1002/nla.2202
Subject(s) - Broyden–Fletcher–Goldfarb–Shanno algorithm , nonlinear system , tensor (intrinsic definition) , acceleration , nonlinear conjugate gradient method , conjugate gradient method , preconditioner , quasi-Newton method , Newton's method , gradient descent , robustness , mathematical optimization , iterative method , mathematics , computer science
Summary - We derive nonlinear acceleration methods based on the limited‐memory Broyden–Fletcher–Goldfarb–Shanno (L‐BFGS) update formula for accelerating iterative optimization methods of alternating least squares (ALS) type applied to canonical polyadic and Tucker tensor decompositions. Our approach starts from linear preconditioning ideas that use linear transformations encoded by matrix multiplications and extends these ideas to the case of genuinely nonlinear preconditioning, where the preconditioning operation involves fully nonlinear transformations. As such, the ALS‐type iterations are used as fully nonlinear preconditioners for L‐BFGS, or equivalently, L‐BFGS is used as a nonlinear accelerator for ALS. Numerical results show that the resulting methods perform much better than either stand‐alone L‐BFGS or stand‐alone ALS, offering substantial improvements in terms of time to solution and robustness over state‐of‐the‐art methods for large and noisy tensor problems, including previously described acceleration methods based on nonlinear conjugate gradients and the nonlinear generalized minimal residual method. Our approach provides a general L‐BFGS‐based acceleration mechanism for nonlinear optimization.
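To make the mechanism concrete, below is a minimal sketch (not the authors' reference implementation) of the idea for a rank-R canonical polyadic decomposition of a third-order tensor: one ALS sweep acts as the nonlinear preconditioner, its residual x - ALS(x) takes the place of the gradient in the standard L-BFGS two-loop recursion, and a simple backtracking line search on the reconstruction error guards each step. The function names (als_sweep, lbfgs_accelerated_als, etc.) and the specific line search are illustrative assumptions; only NumPy is used.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding (C order: later remaining indices vary fastest)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product; row (j*K + k) is A[j] * B[k]."""
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def als_sweep(T, factors):
    """One Gauss-Seidel ALS sweep over the three CP factors."""
    A, B, C = factors
    A = unfold(T, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
    B = unfold(T, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
    C = unfold(T, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return [A, B, C]

def flatten(factors):
    return np.concatenate([F.ravel() for F in factors])

def split(x, shapes):
    parts, i = [], 0
    for (n, r) in shapes:
        parts.append(x[i:i + n * r].reshape(n, r))
        i += n * r
    return parts

def objective(T, factors):
    A, B, C = factors
    return 0.5 * np.linalg.norm(unfold(T, 0) - A @ khatri_rao(B, C).T) ** 2

def two_loop(g, S, Y):
    """L-BFGS two-loop recursion applied to the ALS-preconditioned residual g."""
    q, hist = g.copy(), []
    for s, y in zip(reversed(S), reversed(Y)):      # newest to oldest
        rho = 1.0 / (y @ s)
        alpha = rho * (s @ q)
        q -= alpha * y
        hist.append((s, y, rho, alpha))
    if S:                                           # standard initial-Hessian scaling
        q *= (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])
    for s, y, rho, alpha in reversed(hist):         # oldest to newest
        q += (alpha - rho * (y @ q)) * s
    return q

def lbfgs_accelerated_als(T, R, iters=200, m=5, seed=0):
    """ALS used as a nonlinear preconditioner inside L-BFGS (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    shapes = [(n, R) for n in T.shape]
    x = flatten([rng.standard_normal(s) for s in shapes])
    S, Y = [], []
    g = x - flatten(als_sweep(T, split(x, shapes)))  # preconditioned "gradient"
    for _ in range(iters):
        d = -two_loop(g, S, Y)
        t, fx = 1.0, objective(T, split(x, shapes))  # backtracking line search
        while objective(T, split(x + t * d, shapes)) > fx and t > 1e-8:
            t *= 0.5
        x_new = x + t * d
        g_new = x_new - flatten(als_sweep(T, split(x_new, shapes)))
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:                            # curvature safeguard
            S.append(s); Y.append(y)
            if len(S) > m:
                S.pop(0); Y.pop(0)
        x, g = x_new, g_new
    return split(x, shapes)

# Quick demo on a synthetic rank-5 tensor
rng = np.random.default_rng(1)
A, B, C = (rng.standard_normal((20, 5)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
Ah, Bh, Ch = lbfgs_accelerated_als(T, R=5)
err = np.linalg.norm(T - np.einsum('ir,jr,kr->ijk', Ah, Bh, Ch)) / np.linalg.norm(T)
print(f"relative error: {err:.2e}")
```

Two design points worth noting in this sketch: with an empty secant history the search direction reduces to -(x - ALS(x)), so the first step is just a plain ALS sweep, and the s·y > 0 safeguard discards secant pairs that would break positive definiteness of the implicit inverse-Hessian approximation.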
