
ProBoost: Reducing Uncertainty using a Boosting Method for Probabilistic Models
Author(s) -
Fábio Mendonça,
Sheikh Shanawaz Mostafa,
Fernando Morgado-Dias,
Antonio G. Ravelo-García,
Mário A. T. Figueiredo
Publication year - 2025
Publication title -
IEEE Access
Language(s) - English
Resource type - Journal
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3592797
Subject(s) - Aerospace; Bioengineering; Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems; Computing and Processing; Engineered Materials, Dielectrics and Plasmas; Engineering Profession; Fields, Waves and Electromagnetics; General Topics for Engineers; Geoscience; Nuclear Engineering; Photonics and Electrooptics; Power, Energy and Industry Applications; Robotics and Control Systems; Signal Processing and Analysis; Transportation
Uncertainty analysis of classification or regression models is a key feature of probabilistic approaches to supervised learning, allowing the assessment of how trustworthy predictions are. Just as boosting algorithms aim at obtaining accurate ensembles of simple classifiers through a process guided by the accuracy of each of those classifiers, the method proposed in this paper builds an ensemble guided by the uncertainty of each of its individual models. The proposed method, named ProBoost (probabilistic boosting), uses the epistemic uncertainty of each training sample to determine the samples about which each model is most uncertain; the importance of these samples is then increased for the next learner, producing a sequence of models that progressively focuses on the samples found to have the highest uncertainty. Finally, the learned models are combined into an ensemble. Three methods are proposed to update the importance of the samples according to the uncertainty estimates at each stage: undersampling, oversampling, and weighting. Furthermore, two approaches are studied for the final ensemble combination. The learners considered here are standard convolutional neural networks, and the probabilistic models underlying the uncertainty estimation use either variational inference or Monte Carlo dropout. The experimental evaluation, carried out on MNIST benchmark datasets, shows that ProBoost yields a significant performance improvement compared with the same models trained without ProBoost, and that the resulting ensemble outperforms a wider single model with a similar number of parameters.
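To make the idea concrete, the sketch below illustrates the weighting variant of an uncertainty-guided boosting loop of this kind, using Monte Carlo dropout for the epistemic uncertainty estimate. It is not the authors' implementation: the learner (SmallCNN), the weight-update rule, the uncertainty measure (mean per-class predictive variance over T stochastic passes), and the probability-averaging ensemble combination are illustrative assumptions made for this example.

```python
# Illustrative sketch of a ProBoost-style weighting loop (not the authors' code).
# Assumptions: PyTorch, MC dropout for epistemic uncertainty, a small CNN learner,
# and a simple multiplicative weight update favouring high-uncertainty samples.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    """Simple CNN learner with dropout so MC-dropout sampling is possible."""
    def __init__(self, n_classes=10, p_drop=0.5):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.drop = nn.Dropout(p_drop)
        self.fc = nn.Linear(32 * 7 * 7, n_classes)

    def forward(self, x):
        return self.fc(self.drop(self.conv(x).flatten(1)))

def mc_dropout_uncertainty(model, x, T=20):
    """Per-sample epistemic uncertainty: mean (over classes) of the variance of
    the softmax outputs across T stochastic forward passes (dropout kept active)."""
    model.train()  # keep dropout active at prediction time
    with torch.no_grad():
        probs = torch.stack([F.softmax(model(x), dim=1) for _ in range(T)])  # (T, N, C)
    return probs.var(dim=0).mean(dim=1)  # shape (N,)

def train_learner(model, x, y, sample_weights, epochs=3, lr=1e-3):
    """Fit one learner with per-sample weighted cross-entropy."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = (F.cross_entropy(model(x), y, reduction="none") * sample_weights).mean()
        loss.backward()
        opt.step()
    return model

def proboost_weighting(x, y, n_learners=3, n_classes=10):
    """Sequentially train learners, increasing the importance of the samples the
    current learner is most uncertain about (weighting variant; the paper also
    describes undersampling and oversampling variants)."""
    weights = torch.ones(x.shape[0])
    ensemble = []
    for _ in range(n_learners):
        model = train_learner(SmallCNN(n_classes), x, y, weights)
        ensemble.append(model)
        u = mc_dropout_uncertainty(model, x)
        # One plausible update rule (an assumption, not the paper's exact scheme):
        # scale each weight by its relative uncertainty, then renormalise.
        weights = weights * (1.0 + u / (u.mean() + 1e-12))
        weights = weights / weights.mean()
    return ensemble

def ensemble_predict(ensemble, x):
    """Combine learners by averaging softmax outputs (one of several options)."""
    for m in ensemble:
        m.eval()
    with torch.no_grad():
        probs = torch.stack([F.softmax(m(x), dim=1) for m in ensemble]).mean(0)
    return probs.argmax(dim=1)

if __name__ == "__main__":
    # Tiny synthetic MNIST-shaped batch, just to exercise the loop end to end.
    x = torch.randn(64, 1, 28, 28)
    y = torch.randint(0, 10, (64,))
    print(ensemble_predict(proboost_weighting(x, y), x).shape)
```

The same skeleton accommodates the other two sample-importance updates described in the abstract: undersampling would drop low-uncertainty samples before training the next learner, while oversampling would replicate high-uncertainty ones instead of reweighting the loss.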