Multilinear and nonlinear generalizations of partial least squares: an overview of recent advances
Author(s) - Zhao Qibin, Zhang Liqing, Cichocki Andrzej
Publication year - 2014
Publication title - Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.506
H-Index - 47
eISSN - 1942-4795
pISSN - 1942-4787
DOI - 10.1002/widm.1120
Subject(s) - multilinear map , partial least squares regression , canonical correlation , extension (predicate logic) , latent variable , subspace topology , multivariate statistics , mathematics , linear subspace , tensor (intrinsic definition) , set (abstract data type) , artificial intelligence , nonlinear system , computer science , algorithm , statistics , pure mathematics , physics , quantum mechanics , programming language
Partial least squares (PLS) is an efficient multivariate statistical regression technique that has been shown to be particularly useful for the analysis of highly collinear data. To predict response variables Y from independent variables X, PLS attempts to find a set of common orthogonal latent variables by projecting both X and Y onto a new subspace. With growing interest in multi‐way analysis, the model has also been extended to multilinear regression, with the aim of analyzing data given as multidimensional tensors. In this article, we overview PLS‐related methods, including linear, multilinear, and nonlinear variants, and discuss the strengths of the algorithms. As canonical correlation analysis (CCA) is a related technique that aims to extract the most correlated latent components between two datasets, we also briefly discuss the extension of CCA to tensor space. Finally, several examples are given comparing these methods on regression and classification tasks. This article is categorized under: Technologies > Machine Learning; Technologies > Prediction
