Dictionary Learning Based on Nonnegative Matrix Factorization Using Parallel Coordinate Descent
Author(s) -
Zunyi Tang,
Shuxue Ding,
Zhenni Li,
Linlin Jiang
Publication year - 2013
Publication title -
Abstract and Applied Analysis
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.228
H-Index - 56
eISSN - 1687-0409
pISSN - 1085-3375
DOI - 10.1155/2013/259863
Subject(s) - coordinate descent , nonnegative matrix factorization , K-SVD , dictionary learning , sparse approximation , gradient descent , matrix decomposition , sparse representation , pattern recognition , artificial intelligence , computer science , mathematics
Sparse representation of signals over an overcomplete dictionary has recently received much attention, as it has produced promising results in various applications. Since some applications, for example, multispectral data analysis, require both the signals and the dictionary to be nonnegative, conventional dictionary learning methods with a simply imposed nonnegativity constraint may become inapplicable. In this paper, we propose a novel method for learning a nonnegative, overcomplete dictionary for such a case. This is accomplished by posing the sparse representation of nonnegative signals as a problem of nonnegative matrix factorization (NMF) with a sparsity constraint. By employing a coordinate descent strategy for optimization and extending it to the multivariable case so that coordinates can be processed in parallel, we develop a parallel coordinate descent dictionary learning (PCDDL) algorithm, which iteratively alternates between two optimization problems: learning the dictionary and estimating the coefficients that represent the signals. Numerical experiments demonstrate that the proposed algorithm outperforms the conventional nonnegative K-SVD (NN-KSVD) algorithm and several other algorithms used for comparison, while its computational cost is remarkably lower than that of the compared algorithms.
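The abstract describes alternating between a coefficient-update step and a dictionary-update step, each solved by coordinate descent under nonnegativity and sparsity constraints. The sketch below is a minimal, generic illustration of that scheme in NumPy, not the authors' PCDDL implementation: the function name `sparse_nmf_cd`, the L1 penalty `lam`, and the specific update formulas are assumptions chosen to show the general technique (cyclic block updates of one atom at a time against a running residual).

```python
import numpy as np

def sparse_nmf_cd(Y, n_atoms, lam=0.1, n_iter=50, seed=0):
    """Illustrative sparse NMF via alternating coordinate descent.

    Factorizes Y ~ D @ X with D >= 0, X >= 0 and an L1 penalty
    lam on X. A generic sketch of the technique described in the
    abstract, not the paper's PCDDL algorithm.
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    D = rng.random((m, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)  # unit-norm atoms
    X = rng.random((n_atoms, n))
    R = Y - D @ X  # running residual, updated incrementally

    for _ in range(n_iter):
        # Coefficient step: update one row of X at a time with a
        # nonnegative soft-threshold (closed-form coordinate minimizer).
        for k in range(n_atoms):
            R += np.outer(D[:, k], X[k])
            X[k] = np.maximum((D[:, k] @ R - lam) / (D[:, k] @ D[:, k]), 0.0)
            R -= np.outer(D[:, k], X[k])
        # Dictionary step: update one atom at a time, projecting onto
        # the nonnegative orthant and renormalizing the column.
        for k in range(n_atoms):
            xk = X[k]
            nrm = xk @ xk
            if nrm < 1e-12:  # atom unused; leave it as-is
                continue
            R += np.outer(D[:, k], xk)
            d = np.maximum(R @ xk / nrm, 0.0)
            if d.any():
                d /= np.linalg.norm(d)
            D[:, k] = d
            R -= np.outer(D[:, k], xk)
    return D, X
```

In the paper's parallel variant, the per-atom coordinate updates above are reorganized so that many coordinates are processed simultaneously; this serial sketch only shows the basic alternating structure.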
Accelerating Research
Address
John Eccles House, Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom