On best rank one approximation of tensors
Author(s) -
Friedland S.,
Mehrmann V.,
Pajarola R.,
Suter S.K.
Publication year - 2013
Publication title -
Numerical Linear Algebra with Applications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.02
H-Index - 53
eISSN - 1099-1506
pISSN - 1070-5325
DOI - 10.1002/nla.1878
Subject(s) - singular value decomposition, rank (graph theory), mathematics, computation, low rank approximation, singular value, variable (mathematics), algorithm, pure mathematics, combinatorics, mathematical analysis, eigenvalues and eigenvectors, physics, quantum mechanics, tensor (intrinsic definition)
SUMMARY Today, compact and reduced data representations based on low rank approximation are widely used to represent high‐dimensional data sets in many application areas, such as genomics, multimedia, quantum chemistry, social networks, and visualization. To produce such low rank representations, the input data are typically approximated by so‐called alternating least squares (ALS) algorithms. However, not all of these ALS algorithms are guaranteed to converge. To address this issue, we suggest a new algorithm for the computation of a best rank one approximation of tensors, called alternating singular value decomposition (ASVD). This method is based on the computation of maximal singular values and the corresponding singular vectors of matrices. We also introduce a modification of this method and of the ALS method which ensures that the alternating iterations always converge to a semi‐maximal point (a critical point in several vector variables is semi‐maximal if it is maximal with respect to each vector variable while the other vector variables are kept fixed). We present several numerical examples that illustrate the computational performance of the new method in comparison to the ALS method. Copyright © 2013 John Wiley & Sons, Ltd.
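The abstract describes the core idea of alternating singular value decomposition: hold one mode's vector fixed, contract the tensor against it to obtain a matrix, and update the remaining two vectors from that matrix's maximal singular pair. The following Python/NumPy sketch illustrates this scheme for a third-order tensor; it is a minimal illustration of the alternating-SVD idea under assumed conventions (the function name `rank_one_asvd`, the fixed iteration count, and the random initialization are illustrative choices, not the authors' implementation, which also includes the convergence-ensuring modification discussed in the paper).

```python
import numpy as np

def rank_one_asvd(T, iters=50, seed=0):
    """Sketch of an alternating-SVD rank-one approximation of a 3-tensor T.

    Returns (sigma, x, y, z) with unit vectors x, y, z, so that
    sigma * x ⊗ y ⊗ z approximates T. Illustrative only.
    """
    rng = np.random.default_rng(seed)
    x, y, z = (rng.standard_normal(n) for n in T.shape)
    x, y, z = (v / np.linalg.norm(v) for v in (x, y, z))
    for _ in range(iters):
        # Fix z: contract the third mode with z, then update (x, y)
        # from the top singular pair of the resulting matrix.
        M = np.einsum('ijk,k->ij', T, z)
        U, _, Vt = np.linalg.svd(M)
        x, y = U[:, 0], Vt[0]
        # Fix x: contract the first mode, update (y, z).
        M = np.einsum('ijk,i->jk', T, x)
        U, _, Vt = np.linalg.svd(M)
        y, z = U[:, 0], Vt[0]
        # Fix y: contract the second mode, update (x, z).
        M = np.einsum('ijk,j->ik', T, y)
        U, _, Vt = np.linalg.svd(M)
        x, z = U[:, 0], Vt[0]
    # Scalar weight of the rank-one term: T contracted with all three vectors.
    sigma = np.einsum('ijk,i,j,k->', T, x, y, z)
    return sigma, x, y, z
```

For an exactly rank-one input `T = a ⊗ b ⊗ c`, the contracted matrices are themselves rank one, so the iteration recovers the factors (up to paired sign flips) and `sigma * x ⊗ y ⊗ z` reconstructs `T`; for general tensors the iterates move toward a critical point of the best rank-one problem.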