Low‐rank approximation of tensors via sparse optimization
Author(s) - Xiaofei Wang, Carmeliza Navasca
Publication year - 2018
Publication title - Numerical Linear Algebra with Applications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.02
H-Index - 53
eISSN - 1099-1506
pISSN - 1070-5325
DOI - 10.1002/nla.2136
Subject(s) - tensor (intrinsic definition), mathematics, regularization, rank, computation, approximation algorithm, low rank approximation, mathematical optimization, optimization problem, probabilistic consistency, convergence, algorithm, computer science, combinatorics, discrete mathematics, pure mathematics, statistics
Summary The goal of this paper is to find a low-rank approximation of a given nth-order tensor. Specifically, we give a computable strategy for calculating the rank of a given tensor, based on approximating the solution to an NP-hard problem. We formulate a sparse optimization problem via ℓ1-regularization to find a low-rank approximation of tensors. To solve this sparse optimization problem, we propose a rescaled variant of proximal alternating minimization and study the theoretical convergence of this algorithm. Furthermore, we discuss the probabilistic consistency of the sparsity result and suggest a way to choose the regularization parameter for practical computation. In the simulation experiments, the performance of our algorithm supports that our method provides an efficient estimate of the number of rank-one tensor components in a given tensor. Moreover, the algorithm is also applied to surveillance videos for low-rank approximation.
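
To make the formulation concrete, a common way to pose such an ℓ1-regularized low-rank tensor problem (an assumption consistent with the abstract; the exact model and notation are not quoted from the paper) is, for a third-order tensor \mathcal{T}:

    \min_{\lambda, A, B, C} \; \Bigl\| \mathcal{T} - \sum_{r=1}^{R} \lambda_r \, a_r \circ b_r \circ c_r \Bigr\|_F^2 + \gamma \, \| \lambda \|_1 ,

where a_r, b_r, c_r are the columns of factor matrices A, B, C, R overestimates the true rank, and the ℓ1 penalty on the weight vector λ drives many λ_r to zero, so the number of surviving rank-one components estimates the rank.

The NumPy sketch below illustrates this idea with a plain proximal-gradient alternating scheme. It is a minimal illustration under the assumptions above, not the authors' rescaled proximal alternating minimization algorithm; the function names, step size, and thresholds are hypothetical and may need tuning.

    import numpy as np

    def soft_threshold(x, tau):
        # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def l1_cp_sketch(T, R, gamma=0.5, step=1e-3, n_iter=2000, seed=0):
        # Hypothetical sketch: fit factor matrices A, B, C and a weight
        # vector lam to a third-order tensor T, with an l1 penalty on lam
        # so that its number of nonzero entries estimates the tensor rank.
        rng = np.random.default_rng(seed)
        I, J, K = T.shape
        A = rng.standard_normal((I, R))
        B = rng.standard_normal((J, R))
        C = rng.standard_normal((K, R))
        lam = np.ones(R)
        for _ in range(n_iter):
            # Residual of the current weighted sum of rank-one terms.
            E = T - np.einsum('r,ir,jr,kr->ijk', lam, A, B, C)
            # Plain gradient steps on the factors (the paper instead uses
            # a rescaled proximal alternating minimization scheme).
            gA = 2 * np.einsum('ijk,r,jr,kr->ir', E, lam, B, C)
            gB = 2 * np.einsum('ijk,r,ir,kr->jr', E, lam, A, C)
            gC = 2 * np.einsum('ijk,r,ir,jr->kr', E, lam, A, B)
            A += step * gA
            B += step * gB
            C += step * gC
            # Proximal (soft-thresholded) gradient step on the weights,
            # which is what produces the sparsity in lam.
            E = T - np.einsum('r,ir,jr,kr->ijk', lam, A, B, C)
            g = -2 * np.einsum('ijk,ir,jr,kr->r', E, A, B, C)
            lam = soft_threshold(lam - step * g, step * gamma)
        return lam, int(np.sum(np.abs(lam) > 1e-6))

    # Example: synthetic third-order rank-3 tensor, fitted with R = 8.
    rng = np.random.default_rng(1)
    U, V, W = (rng.standard_normal((10, 3)) for _ in range(3))
    T = np.einsum('ir,jr,kr->ijk', U, V, W)
    lam, est_rank = l1_cp_sketch(T, R=8)

As the abstract notes, the quality of the rank estimate hinges on the regularization parameter γ: too small and spurious components survive, too large and genuine components are thresholded away, which is why the paper's suggested rule for choosing it matters in practice.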
