A Generalized Bayes Rule for Prediction
Author(s) - José Manuel Corcuera, Federica Giummolè
Publication year - 1999
Publication title -
Scandinavian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/1467-9469.00149
Subject(s) - mathematics, statistics, econometrics, Bayes' theorem, Bayes' rule, Bayes factor, Bayesian probability
When prior knowledge about the unknown parameter is available, the Bayesian predictive density coincides with the Bayes estimator of the true density under the Kullback–Leibler divergence, but this is no longer true for other loss functions. In this paper we present a generalized Bayes rule that yields Bayes density estimators with respect to any α‐divergence, a family that includes the Kullback–Leibler divergence and the Hellinger distance. For curved exponential models, we study the asymptotic behaviour of these predictive densities. We show that, whatever prior is used, the generalized Bayes rule improves (in a non‐Bayesian sense) on the estimative density corresponding to a bias modification of the maximum likelihood estimator. This gives rise to a correspondence between choosing a prior density for the generalized Bayes rule and fixing a bias for the maximum likelihood estimator in the classical setting. A criterion for comparing and selecting prior densities is also given.
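The α‐divergence family referred to in the abstract admits several equivalent parametrizations; the sketch below assumes Amari's, which need not be the exact convention of the paper. The function names, grid‐based integration, and Gaussian example densities are illustrative choices, not the paper's construction.

```python
import numpy as np

def _trapezoid(y, x):
    """Trapezoidal-rule integral of samples y over the grid x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def alpha_divergence(p, q, x, alpha):
    """Alpha-divergence between two densities sampled on a grid x,
    in Amari's parametrization (an assumption here):

        D_alpha(p, q) = 4 / (1 - alpha^2)
                        * (1 - ∫ p^{(1-alpha)/2} q^{(1+alpha)/2} dx),  |alpha| < 1.

    The Kullback-Leibler divergences arise as the limits alpha -> -1 and
    alpha -> +1, and alpha = 0 is proportional to the squared Hellinger
    distance, matching the special cases named in the abstract.
    """
    if alpha == -1.0:   # limiting case: KL(p || q)
        return _trapezoid(p * np.log(p / q), x)
    if alpha == 1.0:    # limiting case: KL(q || p)
        return _trapezoid(q * np.log(q / p), x)
    integrand = p ** ((1.0 - alpha) / 2.0) * q ** ((1.0 + alpha) / 2.0)
    return 4.0 / (1.0 - alpha ** 2) * (1.0 - _trapezoid(integrand, x))

# Illustration with two unit-variance Gaussian densities, means 0 and 1.
x = np.linspace(-12.0, 12.0, 40001)

def gauss(x, mu):
    return np.exp(-(x - mu) ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

p, q = gauss(x, 0.0), gauss(x, 1.0)

d_kl = alpha_divergence(p, q, x, -1.0)   # KL(p || q) = 1/2 for these densities
d_hel = alpha_divergence(p, q, x, 0.0)   # 4 * (1 - exp(-1/8)) here
```

Varying α trades off how the divergence penalizes over‐ versus under‐dispersion of the predictive density relative to the truth, which is why the optimal Bayes density estimator depends on the chosen α.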
