Neural network ensembles: combining multiple models for enhanced performance using a multistage approach
Author(s) - Yang Shuang, Browne Antony
Publication year - 2004
Publication title - Expert Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.365
H-Index - 38
eISSN - 1468-0394
pISSN - 0266-4720
DOI - 10.1111/j.1468-0394.2004.00285.x
Subject(s) - computer science, artificial neural network, classifier (uml), artificial intelligence, machine learning, training set, generalization, ensemble learning, data mining, pattern recognition (psychology), mathematics, mathematical analysis
Neural network ensembles (sometimes referred to as committees or classifier ensembles) are effective techniques for improving the generalization of a neural network system. Combining a set of neural network classifiers whose error distributions are diverse can generate better results than any single classifier. In this paper, some methods for creating ensembles are reviewed, including the following approaches: selecting diverse training data from the original source data set, constructing different neural network models, selecting ensemble nets from a pool of ensemble candidates, and combining the ensemble members' results. In addition, new results on ensemble combination methods are reported.
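The combination step summarized in the abstract can be illustrated with a minimal sketch. The Python example below is not taken from the paper; it assumes scikit-learn's MLPClassifier and a synthetic data set, trains several networks on bootstrap resamples of the training data (one simple way to obtain diverse error distributions), and combines their outputs by majority vote and by averaging predicted class probabilities.

```python
# Minimal ensemble sketch (illustrative only, not the authors' method):
# train several neural network classifiers on bootstrap resamples of the
# training data, then combine their outputs in two common ways.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.utils import resample

# Synthetic classification problem (stand-in for a real data set).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

members = []
for seed in range(5):
    # Bootstrap resampling gives each member a different training set,
    # encouraging diverse error distributions across the ensemble.
    X_boot, y_boot = resample(X_train, y_train, random_state=seed)
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=seed)
    net.fit(X_boot, y_boot)
    members.append(net)

# Combination 1: majority vote over the members' hard predictions.
votes = np.stack([net.predict(X_test) for net in members])
majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

# Combination 2: average the members' predicted class probabilities.
probs = np.mean([net.predict_proba(X_test) for net in members], axis=0)
averaged = probs.argmax(axis=1)

print("majority-vote accuracy:", (majority == y_test).mean())
print("averaged-probability accuracy:", (averaged == y_test).mean())
```

Weighted combination schemes of the kind reviewed in the paper can be obtained from the same structure by replacing the uniform mean over members with learned or validation-tuned weights.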
