
AdamB: Decoupled Bayes by Backprop With Gaussian Scale Mixture Prior
Author(s) - Keigo Nishida, Makoto Taiji
Publication year - 2022
Publication title - IEEE Access
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.587
H-Index - 127
ISSN - 2169-3536
DOI - 10.1109/access.2022.3203484
Subject(s) - aerospace, bioengineering, communication, networking and broadcast technologies, components, circuits, devices and systems, computing and processing, engineered materials, dielectrics and plasmas, engineering profession, fields, waves and electromagnetics, general topics for engineers, geoscience, nuclear engineering, photonics and electrooptics, power, energy and industry applications, robotics and control systems, signal processing and analysis, transportation
Abstract - Overfitting of neural networks to training data is one of the most significant problems in machine learning. Bayesian neural networks (BNNs) are known to be robust against overfitting owing to their ability to model parameter uncertainty. Bayes by Backprop (BBB), a simple variational inference approach that optimizes variational parameters by backpropagation, has been proposed for training BNNs. However, many studies have found variational inference difficult to scale to large models such as deep neural networks. This study therefore proposes Adam with decoupled Bayes by Backprop (AdamB), which stabilizes the training of BNNs by applying Adam's moment estimation to the gradient of the neural network. The proposed approach stabilizes the noisy gradients of BBB and mitigates excessive parameter updates. In addition, combining AdamB with a Gaussian scale mixture prior suppresses the intrinsic growth of the variational parameters. AdamB exhibited superior training stability compared with vanilla BBB optimized by Adam. Furthermore, a covariate-shift benchmark on image classification tasks showed AdamB to be more reliable than deep ensembles under noise-type covariate shifts. The considerations for stable BNN training with AdamB demonstrated on these image classification tasks should provide useful insights for applications in other domains.
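To make the ideas in the abstract concrete, below is a minimal, illustrative PyTorch sketch (not the authors' implementation) of BBB with a Gaussian scale mixture prior and a decoupled update in the spirit of AdamW: Adam's moment estimates are applied to the likelihood (data) gradient only, while the Monte Carlo KL gradient is applied outside the moment estimates. The toy one-parameter model, the mixture hyperparameters pi_mix, s1, s2, and the exact form of the decoupling are assumptions made for illustration; the paper's AdamB update may differ in detail.

import math
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy regression data.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = torch.sin(3 * x) + 0.1 * torch.randn_like(x)

# Variational posterior q(w) = N(mu, sigma^2) with sigma = softplus(rho).
mu = torch.zeros(1, requires_grad=True)          # a single weight, for clarity
rho = torch.full((1,), -3.0, requires_grad=True)

# Gaussian scale mixture prior: pi * N(0, s1^2) + (1 - pi) * N(0, s2^2).
# pi_mix, s1, s2 are illustrative hyperparameters, not values from the paper.
pi_mix, s1, s2 = 0.5, 1.0, 0.05

def scale_mixture_log_prob(w):
    n1 = torch.distributions.Normal(0.0, s1).log_prob(w) + math.log(pi_mix)
    n2 = torch.distributions.Normal(0.0, s2).log_prob(w) + math.log(1.0 - pi_mix)
    return torch.logsumexp(torch.stack([n1, n2]), dim=0)

# Manual Adam state, so the two gradient paths can be treated differently.
lr, b1, b2, eps_adam = 1e-2, 0.9, 0.999, 1e-8
state = {id(p): {"m": torch.zeros_like(p), "v": torch.zeros_like(p), "t": 0}
         for p in (mu, rho)}

for step in range(1000):
    sigma = F.softplus(rho)
    w = mu + sigma * torch.randn_like(mu)        # reparameterized sample

    nll = F.mse_loss(w * x, y, reduction="sum")  # data (likelihood) term
    log_q = torch.distributions.Normal(mu, sigma).log_prob(w).sum()
    kl = log_q - scale_mixture_log_prob(w).sum() # single-sample KL estimate

    g_nll = torch.autograd.grad(nll, (mu, rho), retain_graph=True)
    g_kl = torch.autograd.grad(kl, (mu, rho))

    with torch.no_grad():
        for p, gn, gk in zip((mu, rho), g_nll, g_kl):
            st = state[id(p)]
            st["t"] += 1
            st["m"].mul_(b1).add_(gn, alpha=1 - b1)
            st["v"].mul_(b2).addcmul_(gn, gn, value=1 - b2)
            m_hat = st["m"] / (1 - b1 ** st["t"])
            v_hat = st["v"] / (1 - b2 ** st["t"])
            # Adam's moment estimation is applied to the data gradient only...
            p -= lr * m_hat / (v_hat.sqrt() + eps_adam)
            # ...while the KL gradient is applied "decoupled", outside the
            # moment estimates (by analogy with weight decay in AdamW).
            p -= lr * gk

For a real network one would keep mu and rho per weight tensor and scale the KL term by the minibatch schedule; the sketch uses a single weight so that each step of the update rule stays visible.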