Generalization of Jeffreys divergence‐based priors for Bayesian hypothesis testing
Author(s) -
Bayarri M. J.,
García-Donato G.
Publication year - 2008
Publication title -
Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/j.1467-9868.2008.00667.x
Subject(s) - prior probability , markov chain monte carlo , divergence (statistics) , consistency (statistics) , bayes factor , bayesian probability , mathematics , generalization , bayes' theorem , model selection , econometrics , statistics
Summary. We introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence‐based (DB) priors. DB priors have simple forms and desirable properties, such as information (finite-sample) consistency, and are often similar to other existing proposals such as intrinsic priors. Moreover, in normal linear model scenarios they reproduce the Jeffreys–Zellner–Siow priors exactly. Most importantly, in challenging scenarios such as irregular models and mixture models, DB priors are well defined and very reasonable, whereas alternative proposals are not. We derive approximations to the DB priors as well as Markov chain Monte Carlo and asymptotic expressions for the associated Bayes factors.
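For orientation, the Jeffreys divergence named in the title is the symmetrized Kullback–Leibler divergence between the null and alternative sampling densities. The short LaTeX sketch below records that standard definition and, purely as an illustration, one generic way a proper prior can be built by damping a baseline prior with a decreasing power of a standardized divergence; the functional form, the standardization \bar{D}(\theta), the baseline \pi^{N}(\theta), and the exponent q are expository assumptions, not necessarily the paper's exact construction.

% Jeffreys (symmetrized Kullback--Leibler) divergence between the densities
% p(y | theta_0) under the null model and p(y | theta) under the alternative:
\[
  J(\theta,\theta_0)
    = \mathrm{KL}\!\left(p_{\theta}\,\|\,p_{\theta_0}\right)
    + \mathrm{KL}\!\left(p_{\theta_0}\,\|\,p_{\theta}\right),
  \qquad
  \mathrm{KL}(p\,\|\,q) = \int p(y)\,\log\frac{p(y)}{q(y)}\,dy .
\]
% Illustrative (assumed) divergence-based prior: a baseline prior pi^N is
% damped by a decreasing power of a per-observation standardization D-bar of
% the divergence, with q taken just large enough that the prior is proper.
\[
  \pi^{\mathrm{DB}}(\theta) \;\propto\; \bigl(1+\bar{D}(\theta)\bigr)^{-q}\,\pi^{N}(\theta).
\]

Under a construction of this kind, the associated Bayes factor is the usual ratio of marginal likelihoods with \pi^{\mathrm{DB}} placed on the alternative; this is the quantity for which the summary reports Markov chain Monte Carlo and asymptotic approximations.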