Open Access
Federated Learning with Instance-Dependent Noisy Label
Author(s)
Lei Wang,
Jieming Bian,
Jie Xu
Publication year: 2024
Abstract
Federated learning (FL) with noisy labels poses a significant challenge. Existing methods designed for handling noisy labels in centralized learning tend to lose their effectiveness in the FL setting, mainly due to the small dataset size and the heterogeneity of client data. While some attempts have been made to tackle FL with noisy labels, they primarily focused on scenarios involving class-conditional noise. In this paper, we study the more challenging and practical issue of instance-dependent noise (IDN) in FL. We introduce a novel algorithm called FedBeat (Federated Learning with Bayesian Ensemble-Assisted Transition Matrix Estimation). FedBeat aims to build a global statistically consistent classifier using the IDN transition matrix (IDNTM), and encompasses three synergistic steps: (1) a federated data extraction step that constructs a weak global model and extracts high-confidence data using a Bayesian model ensemble method; (2) a federated transition matrix estimation step in which clients collaboratively train an IDNTM estimation network based on the extracted data; (3) a federated classifier correction step that enhances the global model's performance by training it using a loss function tailored for noisy labels, leveraging the IDNTM. Experiments conducted on CIFAR-10 and SVHN verify that the proposed method significantly outperforms state-of-the-art methods.
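To make the classifier-correction idea in step (3) concrete, the following is a minimal PyTorch sketch of forward loss correction through an estimated instance-dependent transition matrix. It is not the authors' implementation: the `TransitionMatrixNet` architecture, all function names, and the tensor shapes are assumptions chosen only to illustrate how a per-example IDNTM T(x) can map the model's clean-class posterior to a noisy-label distribution that is then matched against the observed noisy labels.

```python
# Hypothetical sketch of forward correction with an instance-dependent
# noise transition matrix (IDNTM); names and architecture are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 10  # e.g., CIFAR-10 / SVHN


class TransitionMatrixNet(nn.Module):
    """Placeholder IDNTM estimator: maps a feature vector to a row-stochastic
    C x C matrix T(x), where T(x)[i, j] approximates P(noisy = j | clean = i, x)."""

    def __init__(self, feat_dim: int, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.head = nn.Linear(feat_dim, num_classes * num_classes)
        self.num_classes = num_classes

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        logits = self.head(feats).view(-1, self.num_classes, self.num_classes)
        return F.softmax(logits, dim=-1)  # normalize each row of T(x)


def forward_corrected_loss(clean_logits: torch.Tensor,
                           transition: torch.Tensor,
                           noisy_labels: torch.Tensor) -> torch.Tensor:
    """Cross-entropy on the noisy-label distribution implied by the model:
    p(noisy | x) = T(x)^T p(clean | x)."""
    p_clean = F.softmax(clean_logits, dim=-1)                # (B, C)
    p_noisy = torch.bmm(transition.transpose(1, 2),          # (B, C, C)
                        p_clean.unsqueeze(-1)).squeeze(-1)   # (B, C)
    return F.nll_loss(torch.log(p_noisy + 1e-8), noisy_labels)


if __name__ == "__main__":
    # Random tensors stand in for one client's local batch.
    batch, feat_dim = 32, 128
    feats = torch.randn(batch, feat_dim)
    clean_logits = torch.randn(batch, NUM_CLASSES)
    noisy_labels = torch.randint(0, NUM_CLASSES, (batch,))

    idntm_net = TransitionMatrixNet(feat_dim)
    T = idntm_net(feats)                                     # (B, C, C)
    loss = forward_corrected_loss(clean_logits, T, noisy_labels)
    print(loss.item())
```

In an FL setting such as the one described in the abstract, a loss of this form would be computed locally on each client's noisy data and the resulting model updates aggregated by the server; the federated data extraction and transition matrix estimation steps that precede it are not shown here.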
Language(s): English
