VARIATIONAL BAYESIAN ANALYSIS FOR HIDDEN MARKOV MODELS
Author(s) -
McGrory C. A.,
Titterington D. M.
Publication year - 2009
Publication title -
Australian and New Zealand Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 41
eISSN - 1467-842X
pISSN - 1369-1473
DOI - 10.1111/j.1467-842X.2009.00543.x
Subject(s) - deviance information criterion , deviance (statistics) , hidden markov model , mathematics , bayesian information criterion , model selection , bayesian inference , bayesian probability , feature selection , gaussian , inference , variable order bayesian network , markov model , algorithm , hidden semi markov model , mathematical optimization , markov chain , variable order markov model , machine learning , artificial intelligence , computer science , statistics , physics , quantum mechanics
Summary The variational approach to Bayesian inference enables simultaneous estimation of model parameters and model complexity; an attractive feature is that it leads to an automatic choice of model complexity. Empirical results from the analysis of hidden Markov models with Gaussian observation densities illustrate this: if the variational algorithm is initialized with a large number of hidden states, redundant states are eliminated as the method converges to a solution, effectively selecting the number of hidden states. In addition, through the use of a variational approximation, the deviance information criterion (DIC) for Bayesian model selection can be extended to the hidden Markov model framework. Calculation of the DIC provides a further tool for model selection, which can be used in conjunction with the variational approach.
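To make the DIC idea concrete, the following sketch computes the standard deviance information criterion, DIC = D(θ̄) + 2p_D with p_D = D̄ − D(θ̄), for a toy Gaussian mean model. This is an illustrative analogue only, not the paper's variational HMM extension: the data, the posterior draws of the mean `mu_samples`, and the fixed `sigma` are all made-up assumptions for demonstration.

```python
import math

def log_lik(y, mu, sigma=1.0):
    # Gaussian log-likelihood of data y given mean mu and known sigma
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (yi - mu) ** 2 / (2 * sigma ** 2) for yi in y)

def dic(y, mu_samples, sigma=1.0):
    # Deviance D(theta) = -2 log p(y | theta)
    deviances = [-2.0 * log_lik(y, mu, sigma) for mu in mu_samples]
    d_bar = sum(deviances) / len(deviances)       # posterior mean deviance
    mu_bar = sum(mu_samples) / len(mu_samples)    # posterior mean of mu
    d_hat = -2.0 * log_lik(y, mu_bar, sigma)      # deviance at posterior mean
    p_d = d_bar - d_hat                           # effective number of parameters
    return d_hat + 2.0 * p_d, p_d                 # DIC = D(theta_bar) + 2 p_D

# Hypothetical data and mock posterior draws of mu (illustration only);
# in the paper's setting these quantities come from the variational posterior.
y = [0.2, -0.1, 0.4, 0.0, 0.3]
mu_samples = [0.1, 0.2, 0.15, 0.18, 0.12]
dic_value, p_d = dic(y, mu_samples)
```

In the HMM setting of the paper, the posterior expectations needed for D̄ and D(θ̄) are approximated via the variational posterior rather than via sampling, but the criterion's structure is the same.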