Use of model reparametrization to improve variational Bayes
Author(s) - Tan Linda S. L.
Publication year - 2021
Publication title - Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/rssb.12399
Subject(s) - Bayes' theorem , Gaussian , inference , mathematics , affine transformation , posterior probability , invertible matrix , convergence , transformation , mathematical optimization , computer science , Bayesian probability , statistics
Abstract - We propose using model reparametrization to improve variational Bayes inference for hierarchical models whose variables can be classified as global (shared across observations) or local (observation-specific). Posterior dependence between local and global variables is minimized by applying an invertible affine transformation on the local variables. The functional form of this transformation is deduced by approximating the posterior distribution of each local variable conditional on the global variables by a Gaussian density via a second-order Taylor expansion. Variational Bayes inference for the reparametrized model is then obtained using stochastic approximation. Our approach can be readily extended to large datasets via a divide-and-recombine strategy. Using generalized linear mixed models, we demonstrate that reparametrized variational Bayes (RVB) provides improvements in both accuracy and convergence rate compared with state-of-the-art Gaussian variational approximation methods.
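As a rough illustration of the construction described in the abstract, the LaTeX sketch below writes out the second-order Taylor expansion and the affine transformation it induces. All symbols (b_i for the local variables, theta for the global variables, \hat{b}_i for the conditional-posterior mode, G_i and L_i for the negative Hessian and its Cholesky factor) are assumed notation for this sketch and need not match the paper's own.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch only: notation is illustrative, not taken verbatim from the paper.
\begin{align*}
% Second-order Taylor expansion of the conditional log-posterior about its mode:
\log p(b_i \mid \theta, y)
  &\approx \log p(\hat{b}_i \mid \theta, y)
   - \tfrac{1}{2}\,(b_i - \hat{b}_i)^{\top} G_i(\theta)\,(b_i - \hat{b}_i), \\
G_i(\theta)
  &= -\,\nabla_{b_i}^{2} \log p(b_i \mid \theta, y)\,\Big|_{b_i = \hat{b}_i(\theta)}, \\
% Gaussian approximation implied by the expansion:
p(b_i \mid \theta, y)
  &\approx \mathcal{N}\bigl(\hat{b}_i(\theta),\, G_i(\theta)^{-1}\bigr), \\
% Invertible affine reparametrization, with Cholesky factor G_i = L_i L_i^\top:
\tilde{b}_i
  &= L_i(\theta)^{\top}\bigl(b_i - \hat{b}_i(\theta)\bigr).
\end{align*}
% Under the Gaussian approximation, \tilde{b}_i is approximately N(0, I)
% whatever the value of \theta, which is the sense in which the posterior
% dependence between local and global variables is minimized.
\end{document}

Under this approximation the transformed local variables are roughly standard normal regardless of the global variables, which is why a variational approximation fitted to the reparametrized model can be both more accurate and faster to converge.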