Wrapping Boosters against Noise
Author(s) -
Bernhard Pfahringer,
Geoffrey Holmes,
Gabi Schmidberger
Publication year - 2001
Publication title -
Lecture Notes in Computer Science
Language(s) - English
Resource type - Book series
SCImago Journal Rank - 0.249
H-Index - 400
eISSN - 1611-3349
pISSN - 0302-9743
ISBN - 3-540-42960-3
DOI - 10.1007/3-540-45656-2_35
Subject(s) - boosting (machine learning) , computer science , ensemble learning , machine learning , artificial intelligence , variance reduction , noise reduction , stacking , algorithm , mathematics
Wrappers have recently been used to obtain parameter optimizations for learning algorithms. In this paper we investigate the use of a wrapper for estimating the correct number of boosting ensembles in the presence of class noise. Unlike the naive approach, which would be quadratic in the number of boosting iterations, the incremental algorithm described here is linear.
Additionally, directly using the k ensembles generated during the k-fold cross-validation search for prediction usually results in further improvements in classification performance. This improvement can be attributed to the reduction of variance due to averaging k ensembles instead of using only one. Consequently, cross-validation as we use it here, termed wrapping, can be viewed as yet another ensemble learner, similar in spirit to bagging but also somewhat related to stacking.
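The wrapping scheme sketched in the abstract can be illustrated as follows. This is a minimal sketch, not the paper's implementation: it uses AdaBoost with decision stumps on 1-D data, and all function names and the synthetic noisy dataset are our own illustrative choices. The two ideas it demonstrates are (1) the linear-time search, where each fold's held-out error is updated incrementally as the ensemble grows instead of retraining for every candidate size, and (2) prediction by averaging the k fold ensembles truncated at the selected size.

```python
import numpy as np

def train_stump(X, y, w):
    # Weighted decision stump on 1-D features; y in {-1, +1}.
    best = None
    for thr in np.unique(X):
        for sign in (1, -1):
            pred = sign * np.where(X >= thr, 1, -1)
            err = np.sum(w * (pred != y))
            if best is None or err < best[0]:
                best = (err, thr, sign)
    return best[1], best[2]

def stump_predict(X, thr, sign):
    return sign * np.where(X >= thr, 1, -1)

def adaboost_staged_errors(X_tr, y_tr, X_te, y_te, T):
    # Grow an AdaBoost ensemble of T stumps, tracking the held-out error
    # after EACH iteration incrementally -- one pass, linear in T.
    w = np.full(len(y_tr), 1.0 / len(y_tr))
    ensemble, errors = [], []
    agg_te = np.zeros(len(y_te))          # running weighted vote on the fold
    for _ in range(T):
        thr, sign = train_stump(X_tr, y_tr, w)
        pred = stump_predict(X_tr, thr, sign)
        err = max(np.sum(w * (pred != y_tr)), 1e-10)  # avoid div by zero
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y_tr * pred)
        w /= w.sum()
        ensemble.append((alpha, thr, sign))
        agg_te += alpha * stump_predict(X_te, thr, sign)
        errors.append(np.mean(np.sign(agg_te) != y_te))
    return ensemble, errors

def wrap_boosting(X, y, k=5, T_max=20, seed=0):
    # k-fold "wrapping": pick the ensemble size minimizing mean CV error,
    # then predict by averaging the k fold ensembles truncated at that size.
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    fold_ensembles, fold_errors = [], []
    for i in range(k):
        te = folds[i]
        tr = np.concatenate([folds[j] for j in range(k) if j != i])
        ens, errs = adaboost_staged_errors(X[tr], y[tr], X[te], y[te], T_max)
        fold_ensembles.append(ens)
        fold_errors.append(errs)
    T_star = int(np.argmin(np.mean(fold_errors, axis=0))) + 1

    def predict(Xq):
        votes = np.zeros(len(Xq))
        for ens in fold_ensembles:
            for alpha, thr, sign in ens[:T_star]:
                votes += alpha * stump_predict(Xq, thr, sign)
        return np.sign(votes)

    return T_star, predict

# Toy data with 20% class noise: true rule is sign(x).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, 200)
y = np.where(X > 0, 1, -1)
flip = rng.choice(200, 40, replace=False)
y[flip] *= -1
T_star, predict = wrap_boosting(X, y)
accuracy = np.mean(predict(X) == y)
```

Note that `adaboost_staged_errors` evaluates all `T_max` candidate ensemble sizes in a single training pass per fold, which is where the linear (rather than quadratic) cost comes from.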