A limited‐memory acceleration strategy for MCMC sampling in hierarchical Bayesian calibration of hydrological models
Author(s) - George Kuczera, Dmitri Kavetski, Benjamin Renard, Mark Thyer
Publication year - 2010
Publication title - Water Resources Research
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.863
H-Index - 217
eISSN - 1944-7973
pISSN - 0043-1397
DOI - 10.1029/2009wr008985
Subject(s) - Markov chain Monte Carlo, Gibbs sampling, Bayesian probability, computer science, Bayesian inference, Metropolis–Hastings algorithm, sampling (signal processing), calibration, reversible jump Markov chain Monte Carlo, algorithm, Monte Carlo method, mathematical optimization, statistics, mathematics, artificial intelligence, filter (signal processing), computer vision
Hydrological calibration and prediction using conceptual models is affected by forcing/response data uncertainty and structural model error. The Bayesian Total Error Analysis methodology uses a hierarchical representation of individual sources of uncertainty. However, it is shown that standard multiblock “Metropolis‐within‐Gibbs” Markov chain Monte Carlo (MCMC) samplers commonly used in Bayesian hierarchical inference are exceedingly computationally expensive when applied to hydrologic models, which use recursive numerical solutions of coupled nonlinear differential equations to describe the evolution of catchment states such as soil and groundwater storages. This note develops a “limited‐memory” algorithm for accelerating multiblock MCMC sampling from the posterior distributions of such models using low‐dimensional jump distributions. The new algorithm exploits the decaying memory of hydrological systems to provide accurate tolerance‐based approximations of traditional “full‐memory” MCMC methods and is orders of magnitude more efficient than the latter.
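Below is a minimal Python sketch of the core idea described in the abstract, not the authors' implementation. It uses an illustrative single-store linear reservoir and hypothetical names (step, simulate, simulate_limited, the decay rate k, and the tolerance tol are all assumptions). When a Metropolis-within-Gibbs block perturbs a latent forcing value at one time step, a full-memory sampler must re-run the model from that step to the end of the record; the limited-memory shortcut re-runs only until the perturbation of the model state decays below a tolerance, then reuses the previously stored outputs.

```python
# Illustrative sketch only: toy model, parameter values, and tolerance are
# assumptions, not taken from Kuczera et al. (2010).
import numpy as np

def step(S, rain, k=0.3):
    """One step of a toy linear reservoir: inflow = rain, outflow = k*S."""
    S = S + rain - k * S
    return S, k * S                      # updated store, simulated runoff

def simulate(rain, S0=10.0):
    """Full-memory run: store trajectory and runoff over the whole record."""
    T = len(rain)
    S = np.empty(T + 1); S[0] = S0
    q = np.empty(T)
    for t in range(T):
        S[t + 1], q[t] = step(S[t], rain[t])
    return S, q

def simulate_limited(rain_new, S_old, q_old, t0, tol=1e-8):
    """Limited-memory run: after perturbing the forcing at step t0,
    re-simulate only until the store perturbation decays below tol,
    then reuse the previously stored outputs for the remaining steps."""
    q = q_old.copy()
    S = S_old[t0]                        # state just before the perturbed step
    for t in range(t0, len(rain_new)):
        S, q[t] = step(S, rain_new[t])
        if abs(S - S_old[t + 1]) < tol:  # memory of the perturbation has decayed
            break
    return q

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rain = rng.exponential(2.0, size=5000)
    S_old, q_old = simulate(rain)

    # Metropolis-within-Gibbs style proposal: perturb one latent rainfall value.
    t0 = 100
    rain_prop = rain.copy()
    rain_prop[t0] *= 1.2                 # proposed rainfall multiplier

    q_full = simulate(rain_prop)[1]                          # ~T - t0 model steps
    q_fast = simulate_limited(rain_prop, S_old, q_old, t0)   # a few dozen steps

    print(np.max(np.abs(q_full - q_fast)))  # difference bounded by the tolerance
```

In this toy setting the store perturbation shrinks geometrically, so the truncated re-simulation covers only a few dozen steps regardless of record length, which is the source of the order-of-magnitude speedup claimed for the full method.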