Research Library

Open Access
Calibrated One Round Federated Learning with Bayesian Inference in the Predictive Space
Author(s)
Mohsin Hasan,
Guojun Zhang,
Kaiyang Guo,
Xi Chen,
Pascal Poupart
Publication year: 2024
Federated Learning (FL) involves training a model over a dataset distributed among clients, with the constraint that each client's dataset is localized and possibly heterogeneous. In FL, small and noisy datasets are common, highlighting the need for well-calibrated models that represent the uncertainty of predictions. The closest FL techniques to achieving such goals are the Bayesian FL methods, which collect parameter samples from local posteriors and aggregate them to approximate the global posterior. To improve scalability for larger models, one common Bayesian approach is to approximate the global predictive posterior by multiplying local predictive posteriors. In this work, we demonstrate that this method gives systematically overconfident predictions, and we remedy this by proposing $\beta$-Predictive Bayes, a Bayesian FL algorithm that interpolates between a mixture and product of the predictive posteriors, using a tunable parameter $\beta$. This parameter is tuned to improve the global ensemble's calibration, before it is distilled to a single model. Our method is evaluated on a variety of regression and classification datasets to demonstrate its superiority in calibration over other baselines, even as data heterogeneity increases. Code available at https://github.com/hasanmohsin/betaPredBayesFL
Language(s): English
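The aggregation step described in the abstract can be sketched as follows. This is an illustrative reading, not the paper's exact rule (see the linked repository for the authors' implementation): it blends the product aggregate (here, a normalized geometric mean of client predictive distributions) with the mixture aggregate (their arithmetic mean), weighted by $\beta$. The function name `beta_predictive_aggregate` is an assumed name for illustration.

```python
import numpy as np

def beta_predictive_aggregate(client_probs, beta):
    """Blend a product and a mixture of client predictive distributions.

    client_probs: array of shape (n_clients, n_classes), each row a
        client's predictive distribution p_i(y | x) over the classes.
    beta: interpolation weight in [0, 1]; beta = 1 yields the (normalized)
        product aggregate, beta = 0 the mixture aggregate.
    Returns a single normalized distribution over the classes.
    """
    client_probs = np.asarray(client_probs, dtype=float)
    # Product aggregate: geometric mean across clients, renormalized.
    # Work in log space and subtract the max for numerical stability.
    log_prod = np.log(client_probs).mean(axis=0)
    prod = np.exp(log_prod - log_prod.max())
    prod /= prod.sum()
    # Mixture aggregate: arithmetic mean across clients.
    mix = client_probs.mean(axis=0)
    # Log-linear interpolation between the two, renormalized.
    blended = prod ** beta * mix ** (1.0 - beta)
    return blended / blended.sum()
```

With two clients predicting `[0.9, 0.1]` and `[0.6, 0.4]`, `beta = 0` returns their mixture `[0.75, 0.25]`, while `beta = 1` returns the sharper normalized geometric mean, illustrating why the product aggregate tends toward overconfidence and why tuning `beta` on held-out data can restore calibration.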
