Neural Network Implementations for PCA and Its Extensions
Author(s) -
Jialin Qiu,
Hui Wang,
Jiabin Lu,
Biaobiao Zhang,
Ke-Lin Du
Publication year - 2012
Publication title -
ISRN Artificial Intelligence
Language(s) - English
Resource type - Journals
eISSN - 2090-7443
pISSN - 2090-7435
DOI - 10.5402/2012/847305
Subject(s) - singular value decomposition, principal component analysis, linear discriminant analysis, artificial neural network, independent component analysis, pattern recognition (psychology), computer science, singular spectrum analysis, eigenvalues and eigenvectors, signal processing, artificial intelligence, feature extraction, singular value, algorithm, digital signal processing, physics, quantum mechanics, computer hardware
Many information processing problems can be transformed into some form of eigenvalue or singular value problem. Eigenvalue decomposition (EVD) and singular value decomposition (SVD) are usually used to solve these problems. In this paper, we give an introduction to various neural network implementations and algorithms for principal component analysis (PCA) and its various extensions. PCA is a statistical method that is directly related to EVD and SVD. Minor component analysis (MCA) is a variant of PCA that is useful for solving total least squares (TLS) problems. The algorithms are typical unsupervised learning methods. Some other neural network models for feature extraction, such as localized methods, complex-domain methods, generalized EVD, and SVD, are also described. Topics associated with PCA, such as independent component analysis (ICA) and linear discriminant analysis (LDA), are mentioned in passing in the conclusion. These methods are useful in adaptive signal processing, blind signal separation (BSS), pattern recognition, and information compression.
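To illustrate the kind of neural PCA algorithm the paper surveys, the sketch below applies Oja's rule, a classic single-neuron Hebbian learning rule whose weight vector converges to the first principal component, and checks the result against the leading eigenvector from an explicit EVD of the covariance matrix. The toy data, learning rate, and step count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical zero-mean 2-D data with one dominant variance direction.
rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
X -= X.mean(axis=0)

# Oja's rule: a single linear neuron y = w^T x; the Hebbian term
# eta*y*x is stabilized by the decay term -eta*y^2*w, so w converges
# (up to sign) to the unit-norm first principal component.
w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 1e-3  # assumed learning rate

for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

# Reference solution via EVD of the sample covariance matrix.
C = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(C)
pc1 = eigvecs[:, -1]  # eigenvector of the largest eigenvalue

# Cosine alignment with the true first PC (sign-invariant).
alignment = abs(w @ pc1) / np.linalg.norm(w)
print(round(alignment, 2))
```

The decay term makes explicit weight normalization unnecessary, which is the property that distinguishes Oja's rule from plain Hebbian learning; variants of this update underlie many of the PCA and MCA networks discussed in the paper.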
Address
John Eccles House, Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom