Dictionary Learning and Non‐Asymptotic Bounds for Geometric Multi‐Resolution Analysis
Author(s) -
Mauro Maggioni,
Stanislav Minsker,
Nate Strawn
Publication year - 2014
Publication title -
PAMM (Proceedings in Applied Mathematics and Mechanics)
Language(s) - English
Resource type - Journals
ISSN - 1617-7061
DOI - 10.1002/pamm.201410486
Subject(s) - estimator , robustness , manifold , intrinsic dimension , dimension , algorithm , sparse approximation , geometric data analysis , dictionary learning , nonlinear dimensionality reduction , dimensionality reduction , curse of dimensionality , mathematics , statistics , computer science
Data sets in high‐dimensional spaces are often concentrated near low‐dimensional sets. Geometric Multi‐Resolution Analysis (GMRA; Allard, Chen, Maggioni, 2012) was introduced as a method for approximating, in a robust, multiscale fashion, a low‐dimensional set around which data may be concentrated, while also providing a dictionary for sparse representation of the data. Moreover, the procedure is computationally very efficient. We introduce an estimator for low‐dimensional sets supporting the data, constructed from the GMRA approximations. We exhibit (near‐optimal) finite‐sample bounds on its performance, and demonstrate the robustness of this estimator with respect to noise and model error. In particular, our results imply that, if the data are supported on a low‐dimensional manifold, the proposed sparse representations incur an error which depends only on the intrinsic dimension of the manifold. (© 2014 Wiley‐VCH Verlag GmbH & Co. KGaA, Weinheim)
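The abstract's core idea, that data near a low‐dimensional set can be approximated by local low‐rank affine pieces, can be illustrated with a minimal single‐scale sketch. The snippet below is not the GMRA construction itself (which builds a multiscale tree of cells with local PCA at every scale); it only shows the single‐scale ingredient: fit a rank-d affine subspace to each cell of a given partition and project the points onto it. The function name, the flat clustering used in place of a multiscale tree, and the parameter names are illustrative assumptions.

```python
import numpy as np

def local_pca_approx(X, labels, d):
    """Approximate each point of X (n x D) by projecting onto a rank-d
    affine subspace fitted to its cell. A single-scale sketch of the
    local linear approximations GMRA builds at every scale; `labels`
    stands in for one level of the GMRA multiscale partition."""
    X_hat = np.empty_like(X, dtype=float)
    for k in np.unique(labels):
        idx = labels == k
        c = X[idx].mean(axis=0)                 # cell center
        # top-d principal directions of the centered cell
        _, _, Vt = np.linalg.svd(X[idx] - c, full_matrices=False)
        P = Vt[:d].T @ Vt[:d]                   # rank-d orthogonal projector
        X_hat[idx] = c + (X[idx] - c) @ P       # affine projection
    return X_hat
```

If the data lie exactly on a d‐dimensional affine subspace, each cell's rank-d projection reproduces it exactly; for data on a curved d‐dimensional manifold, the per‐cell error shrinks with the cell diameter, which is what the multiscale refinement in GMRA exploits.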