
Uncertainty-Aware Neighbor Calibration for Positive and Unlabeled Learning in Large Machine Learning Models
Author(s) -
Muhammad Imran Zulfiqar,
Ayesha Khalid,
Liming Chen,
Sahraoui Dhelim
Publication year - 2025
Publication title -
IEEE Transactions on Emerging Topics in Computational Intelligence
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 1.135
H-Index - 21
eISSN - 2471-285X
DOI - 10.1109/TETCI.2025.3594047
Subject(s) - computing and processing
Positive and Unlabeled (PU) learning aims to build classifiers when only positive and unlabeled data are available. We propose a robust and scalable framework, Uncertainty-Aware Neighbor Calibration (UANC), for PU learning in large machine learning models. UANC integrates Monte Carlo (MC) Dropout-based uncertainty estimation with neighborhood calibration to refine predictions. Uncertain instances are dynamically reweighted during training using a principled loss adjustment strategy. Additionally, a Gaussian Mixture Model (GMM) clusters high-confidence samples with soft pseudo-labeling to mitigate label noise. Extensive experiments on benchmark datasets demonstrate UANC's superior performance under varying noise levels, together with sensitivity analyses of dropout rate, neighborhood size, and clustering initialization. The framework is computationally efficient and effective in real-world scenarios such as medical diagnosis and fraud detection.
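To make the two core ingredients concrete, the following is a minimal NumPy sketch of MC Dropout uncertainty estimation followed by uncertainty-weighted neighbor calibration. The toy network, its weights, and the specific calibration rule are illustrative assumptions for exposition only, not the authors' UANC implementation.

```python
import numpy as np

def mlp_forward(X, W1, W2, dropout_rate, rng):
    """One stochastic forward pass: dropout stays active at inference (MC Dropout)."""
    h = np.maximum(X @ W1, 0.0)                      # ReLU hidden layer
    mask = rng.random(h.shape) > dropout_rate        # random dropout mask
    h = h * mask / (1.0 - dropout_rate)              # inverted-dropout scaling
    logits = h @ W2
    return 1.0 / (1.0 + np.exp(-logits))             # P(positive) via sigmoid

def mc_dropout_predict(X, W1, W2, T=50, dropout_rate=0.2, seed=0):
    """Run T stochastic passes; return predictive mean and std (uncertainty)."""
    rng = np.random.default_rng(seed)
    preds = np.stack([mlp_forward(X, W1, W2, dropout_rate, rng)
                      for _ in range(T)])
    return preds.mean(axis=0), preds.std(axis=0)

def neighbor_calibrate(X, mean, std, k=5):
    """Replace each prediction with an uncertainty-weighted average of its
    k nearest neighbors' predictions (a hypothetical calibration rule:
    low-uncertainty neighbors get higher weight)."""
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    calibrated = np.empty_like(mean)
    for i in range(len(X)):
        nn = np.argsort(dists[i])[1:k + 1]           # skip self at index 0
        w = 1.0 / (std[nn] + 1e-8)                   # trust confident neighbors
        calibrated[i] = (w * mean[nn]).sum() / w.sum()
    return calibrated
```

A calibrated score could then drive the loss reweighting and GMM pseudo-labeling steps described in the abstract, e.g. by down-weighting samples whose MC Dropout std exceeds a threshold.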