Open Access
Adaptive stochastic gradient descent on the Grassmannian for robust low‐rank subspace recovery
Author(s) - He Jun, Zhang Yue, Zhou Yuan, Zhang Lei
Publication year - 2016
Publication title - IET Signal Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.384
H-Index - 42
ISSN - 1751-9683
DOI - 10.1049/iet-spr.2016.0049
Subject(s) - Grassmannian, subspace topology, stochastic gradient descent, rank (graph theory), gradient descent, computer science, mathematics, artificial intelligence, algorithm, pattern recognition (psychology), combinatorics, artificial neural network
In this study, the authors present GASG21 (Grassmannian adaptive stochastic gradient for L2,1-norm minimisation), an adaptive stochastic gradient algorithm that robustly recovers a low-rank subspace from a large matrix. To handle corruption by column outliers, the authors reformulate the classical matrix L2,1-norm minimisation problem as its stochastic programming counterpart. For each observed data vector, the low-rank subspace S is updated by taking a gradient step along a geodesic of the Grassmannian. To accelerate the convergence of the stochastic gradient method, the authors adaptively tune the step size by leveraging consecutive gradients. Numerical experiments on synthetic data and the extended Yale face dataset demonstrate the efficiency and accuracy of the proposed GASG21 algorithm, even under heavy column-outlier corruption.
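
The per-column update the abstract describes, one gradient step along a Grassmannian geodesic per observed vector, has a simple closed form for an orthonormal basis. Below is a minimal NumPy sketch of one such step; the function name grassmann_sgd_step and the fixed step size eta are illustrative assumptions, not the authors' code, since GASG21 adapts the step size from consecutive gradients.

import numpy as np

def grassmann_sgd_step(U, v, eta=0.1):
    """One stochastic gradient step along a Grassmannian geodesic,
    driven by a single observed column v.  Sketch only: GASG21 tunes
    the step size adaptively, whereas eta here is fixed.

    U : (n, d) ndarray, orthonormal basis of the current subspace S
    v : (n,) ndarray, one observed data vector
    """
    w = U.T @ v                   # least-squares weights (U orthonormal)
    p = U @ w                     # projection of v onto span(U)
    r = v - p                     # residual, orthogonal to span(U)
    r_norm = np.linalg.norm(r)
    w_norm = np.linalg.norm(w)
    if r_norm < 1e-12 or w_norm < 1e-12:
        return U                  # v already lies in S, or carries no signal
    # For the L2,1 loss ||r||_2 (unsquared), the stochastic gradient is the
    # rank-one matrix -(r/||r||) w^T, so the geodesic rotation angle scales
    # with ||w|| but not with ||r||: outlying columns cannot force huge steps.
    theta = eta * w_norm
    step_dir = ((np.cos(theta) - 1.0) * (p / w_norm)    # note ||p|| == ||w||
                + np.sin(theta) * (r / r_norm))
    return U + np.outer(step_dir, w / w_norm)           # stays orthonormal

Streaming the columns of the data matrix through this step, starting from a random orthonormal U (e.g. the Q factor of a random n-by-d Gaussian matrix), gives the basic fixed-step version of the procedure; the rank-one form of the update also keeps the per-column cost low for large matrices.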
