Ensemble methods and partial least squares regression
Author(s) - Mevik Bjørn-Helge, Segtnan Vegard H., Næs Tormod
Publication year - 2004
Publication title - Journal of Chemometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.47
H-Index - 92
eISSN - 1099-128X
pISSN - 0886-9383
DOI - 10.1002/cem.895
Subject(s) - overfitting, partial least squares regression, principal component regression, regression, multivariate statistics, principal component analysis, ensemble learning, robustness (evolution), chemometrics, statistics, computer science, noise (video), regression analysis, robust regression, bootstrap aggregating, linear regression, artificial intelligence, pattern recognition (psychology), mathematics, machine learning, chemistry, artificial neural network, biochemistry, image (mathematics), gene
Recently there has been increased attention in the literature on the use of ensemble methods in multivariate regression and classification. These methods have been shown to have interesting properties for both tasks; in particular, they can improve the accuracy of unstable predictors. Ensemble methods have so far been little studied in situations that are common for calibration and prediction in chemistry, i.e. situations with a large number of collinear x‐variables and few samples. Such situations are often handled by data compression methods such as principal component regression (PCR) or partial least squares regression (PLSR). The present paper investigates the properties of different types of ensemble methods used with PLSR in situations with highly collinear x‐data. Bagging and data augmentation by simulated noise are studied, with a focus on the robustness of the calibrations. Both real and simulated data are used. The results show that ensembles trained on data with added noise can make PLSR robust against the type of noise added; in particular, the effects of sample temperature variations can be eliminated. Bagging does not seem to give any improvement over PLSR for small and intermediate numbers of components, but it is less sensitive to overfitting. Copyright © 2005 John Wiley & Sons, Ltd.
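To make the two ensemble strategies in the abstract concrete, below is a minimal Python sketch using scikit-learn's PLSRegression. It is not the authors' code: the data are simulated, and the sample size, number of PLS components, ensemble size and noise level (n, p, n_components, n_models, sigma) are illustrative assumptions. The paper's noise augmentation targets structured disturbances such as sample temperature variation; the i.i.d. Gaussian noise below is only a stand-in for whatever noise the calibration should be made robust against.

    # Sketch of the two ensemble strategies studied in the paper, under the
    # assumptions stated above (simulated data, illustrative parameters).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)

    # Simulated calibration data: few samples, many collinear x-variables.
    n, p = 40, 200
    latent = rng.normal(size=(n, 3))
    X = latent @ rng.normal(size=(3, p)) + 0.01 * rng.normal(size=(n, p))
    y = latent @ np.array([1.0, -0.5, 0.2]) + 0.05 * rng.normal(size=n)

    def bagged_plsr(X, y, n_components=3, n_models=25):
        """Bagging: fit PLSR members on bootstrap resamples of the data."""
        models = []
        for _ in range(n_models):
            idx = rng.integers(0, len(y), size=len(y))  # resample with replacement
            models.append(PLSRegression(n_components=n_components).fit(X[idx], y[idx]))
        return models

    def noise_augmented_plsr(X, y, n_components=3, n_models=25, sigma=0.05):
        """Data augmentation: fit PLSR members on copies of the training
        data perturbed by simulated noise (i.i.d. Gaussian here, as a
        stand-in for the structured noise used in the paper)."""
        models = []
        for _ in range(n_models):
            Xn = X + sigma * rng.normal(size=X.shape)  # add simulated noise
            models.append(PLSRegression(n_components=n_components).fit(Xn, y))
        return models

    def ensemble_predict(models, X_new):
        """Ensemble prediction = mean of the member predictions."""
        return np.mean([m.predict(X_new).ravel() for m in models], axis=0)

    bagged = bagged_plsr(X, y)
    augmented = noise_augmented_plsr(X, y)
    print(ensemble_predict(bagged, X[:5]))
    print(ensemble_predict(augmented, X[:5]))

Both strategies average the predictions of many PLSR members; they differ only in how the training sets of the members are generated. Bagging resamples the calibration set, which mainly stabilizes the fit at large numbers of components, whereas noise augmentation perturbs the data with simulated noise, which is what makes the resulting ensemble robust against the type of noise added.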
