Open Access
Using Noise to Speed up Markov Chain Monte Carlo Estimation
Author(s) -
Brandon Franzke,
Bart Kosko
Publication year - 2015
Publication title -
Procedia Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.334
H-Index - 76
ISSN - 1877-0509
DOI - 10.1016/j.procs.2015.07.285
Subject(s) - Markov chain Monte Carlo, Gibbs sampling, computer science, Markov chain, Metropolis–Hastings algorithm, algorithm, rejection sampling, mathematics, Monte Carlo method, simulated annealing, parallel tempering, mathematical optimization, hybrid Monte Carlo, statistics, Bayesian probability, artificial intelligence, machine learning
Carefully injected noise can speed the average convergence of Markov chain Monte Carlo (MCMC) simulation estimates. This includes the MCMC special cases of the Metropolis–Hastings algorithm, Gibbs sampling, and simulated annealing. MCMC equates the solution to a computational problem with the equilibrium probability density of a reversible Markov chain. The algorithm must cycle through a long burn-in phase before it reaches equilibrium because the Markov samples are statistically correlated. The injected noise reduces this burn-in period. Simulations showed that optimal noise gave a 42% speed-up in finding the minimum potential energy of diatomic argon using a Lennard-Jones 12-6 potential. We prove that the noisy MCMC algorithm brings each Markov step closer on average to equilibrium if an inequality holds between two expectations. Gaussian or Cauchy jump probabilities reduce the inequality to a simple quadratic condition.
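The idea in the abstract can be illustrated with a small sketch. This is not the paper's algorithm, only a minimal noise-injection scheme under assumed settings: a Gaussian-jump Metropolis sampler explores a Lennard-Jones 12-6 potential in reduced units (so the true minimum sits at r = 2^(1/6) ≈ 1.122 with V = −1), and small Gaussian noise is added to each accepted state. The noise scale, inverse temperature, jump width, and step count are all illustrative assumptions, not values from the paper.

```python
import math
import random

def lj(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones 12-6 potential, V(r) = 4*eps*((s/r)^12 - (s/r)^6).

    Reduced units (epsilon = sigma = 1): minimum V = -1 at r = 2**(1/6)."""
    s6 = (sigma / r) ** 6
    return 4.0 * epsilon * (s6 * s6 - s6)

def noisy_metropolis(steps=20000, beta=5.0, jump=0.1, noise=0.02, seed=0):
    """Metropolis sampling of exp(-beta * V) with a Gaussian jump proposal,
    plus small Gaussian noise injected into each accepted state
    (a hypothetical noise scheme, for illustration only)."""
    rng = random.Random(seed)
    r = 2.0                                   # start far from the minimum
    best_r, best_v = r, lj(r)
    for _ in range(steps):
        cand = r + rng.gauss(0.0, jump)       # Gaussian jump proposal
        if cand <= 0.3:                       # avoid the hard-core singularity
            continue
        dv = lj(cand) - lj(r)
        if dv <= 0.0 or rng.random() < math.exp(-beta * dv):
            # Accept, then perturb the accepted state with injected noise.
            r = max(cand + rng.gauss(0.0, noise), 0.3)
        v = lj(r)
        if v < best_v:                        # track the lowest energy seen
            best_r, best_v = r, v
    return best_r, best_v

r_star, v_star = noisy_metropolis()
```

With these settings the chain's lowest-energy visit lands near the analytic minimum at r = 2^(1/6); a real test of the paper's claim would compare burn-in times with and without the noise injection, which this sketch does not do.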
