An Objective Bayesian Criterion to Determine Model Prior Probabilities
Author(s) -
Cristiano Villa,
Stephen Walker
Publication year - 2015
Publication title -
Scandinavian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/sjos.12145
Subject(s) - prior probability , model selection , bayesian probability , kullback–leibler divergence , parametric model , parametric statistics , bayesian inference , model space , mathematics , statistics , econometrics
We discuss the problem of selecting among alternative parametric models within the Bayesian framework. For model selection problems involving non‐nested models, the common objective choice of a prior on the model space is the uniform distribution; the same applies when the models are nested. It is our contention that assigning equal prior probability to each model is overly simplistic. Consequently, we introduce a novel approach to objectively determine model prior probabilities, conditional on the choice of priors for the parameters of the models. The idea is based on the notion of the worth of having each model within the selection process. At the heart of the procedure is the measure of this worth using the Kullback–Leibler divergence between densities from different models.
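To make the abstract's idea concrete, the following is a minimal illustrative sketch, not the authors' actual criterion: it assumes each candidate model is a fixed normal density (whereas the paper conditions on parameter priors), measures a model's "worth" by its Kullback–Leibler divergence to the nearest competing model, and assigns prior mass proportional to the exponential of that worth. All model names and parameter values below are hypothetical.

```python
import math

def kl_normal(mu0, s0, mu1, s1):
    # Closed-form KL( N(mu0, s0^2) || N(mu1, s1^2) )
    return math.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

# Three hypothetical candidate models, each reduced to a single density
# (mean, standard deviation); in the paper, parameters carry priors.
models = {"M1": (0.0, 1.0), "M2": (0.5, 1.0), "M3": (3.0, 2.0)}

def model_prior_probs(models):
    """Prior mass proportional to exp(worth), where worth is the KL
    divergence to the nearest competing model: a model that is hard to
    replace by any rival is worth more within the selection process."""
    worth = {}
    for k, (mu_k, s_k) in models.items():
        worth[k] = min(
            kl_normal(mu_k, s_k, mu_j, s_j)
            for j, (mu_j, s_j) in models.items()
            if j != k
        )
    z = sum(math.exp(w) for w in worth.values())
    return {k: math.exp(w) / z for k, w in worth.items()}

probs = model_prior_probs(models)
```

Under this sketch, M1 and M2 are nearly interchangeable (small mutual KL divergence), so each receives little prior mass, while M3, being far from both rivals, receives most of it; a uniform prior would ignore this asymmetry entirely.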