Open Access
Gaussian Approximations for Probability Measures on $R^d$
Author(s) -
Yulong Lu,
Andrew M. Stuart,
Hendrik Weber
Publication year - 2017
Publication title -
SIAM/ASA Journal on Uncertainty Quantification
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.094
H-Index - 29
ISSN - 2166-2525
DOI - 10.1137/16M1105384
Subject(s) - mathematics , probability measure , Kullback–Leibler divergence , Gaussian measures , Gaussian mixtures , Γ-convergence , Bayesian inverse problems , frequentist consistency , Bernstein–von Mises theorem
This paper concerns the approximation of probability measures on R^d with respect to the Kullback–Leibler divergence. Given an admissible target measure, we show the existence of the best approximation, with respect to this divergence, from certain sets of Gaussian measures and Gaussian mixtures. The asymptotic behavior of such best approximations is then studied in the small parameter limit where the measure concentrates; this asymptotic behavior is characterized using Γ-convergence. The theory developed is then applied to understand the frequentist consistency of Bayesian inverse problems in finite dimensions. For a fixed realization of additive observational noise, we show the asymptotic normality of the posterior measure in the small noise limit. Taking into account the randomness of the noise, we prove a Bernstein–von Mises type result for the posterior measure.
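The abstract's central object, the best Gaussian approximation in Kullback–Leibler divergence, can be illustrated numerically in one dimension. The sketch below is a toy illustration, not taken from the paper: the target density p(x) ∝ exp(-x⁴), the quadrature grid, and the brute-force search ranges are all assumptions. It searches for the Gaussian N(m, s²) minimizing KL(N(m, s²) ‖ p), the same mode-seeking direction of the divergence used when approximating a target by a Gaussian.

```python
import math

# Hypothetical non-Gaussian target on R: p(x) proportional to exp(-x^4).
# (An illustrative choice, not a measure from the paper.)
def log_p_tilde(x):
    return -x ** 4

def kl_up_to_const(m, s, xs, dx):
    """KL(N(m, s^2) || p) by Riemann quadrature, up to the additive
    constant log Z (the target's normalizer), which does not affect
    the minimizer over (m, s)."""
    total = 0.0
    log_norm = -0.5 * math.log(2 * math.pi * s * s)
    for x in xs:
        log_q = log_norm - (x - m) ** 2 / (2 * s * s)
        total += math.exp(log_q) * (log_q - log_p_tilde(x)) * dx
    return total

# Quadrature grid wide enough that both densities are negligible at the ends.
n = 801
xs = [-4 + 8 * i / (n - 1) for i in range(n)]
dx = 8 / (n - 1)

# Brute-force search over the mean m and standard deviation s.
best_kl, best_m, best_s = min(
    (kl_up_to_const(m / 100, s / 100, xs, dx), m / 100, s / 100)
    for m in range(-50, 51, 5)
    for s in range(20, 101)
)
# Analytically, the optimum for this target is m = 0 (by symmetry)
# and s = 12 ** -0.25, roughly 0.537.
print(best_m, best_s)
```

Because the normalizing constant of p enters KL only as an additive constant, the minimization can be carried out with the unnormalized log-density alone; this is the same reason KL-based Gaussian (variational) approximations are computable for posterior measures known only up to a constant.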
