Single‐parameter inference based on partial prior information
Author(s) -
Lambert, Diane,
Duncan, George T.
Publication year - 1986
Publication title -
Canadian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.804
H-Index - 51
eISSN - 1708-945X
pISSN - 0319-5724
DOI - 10.2307/3315187
Subject(s) - prior probability , estimator , dirichlet distribution , inference , event (particle physics) , mathematics , point estimation , dirichlet process , bayesian probability , partial derivative , posterior probability , bayesian inference , statistics , computer science , artificial intelligence , mathematical analysis , physics , quantum mechanics , boundary value problem
Partial specification of a prior distribution can be appealing to an analyst, but there is no conventional way to update a partial prior. In this paper, we show how a framework for Bayesian updating with data can be based on the Dirichlet(α) process. Within this framework, partial-information predictors generalize standard minimax predictors and have interesting multiple-point shrinkage properties. Approximations to partial-information estimators for squared error loss are defined straightforwardly, and an estimate of the mean shrinks the sample mean. The proposed updating of the partial prior is a consequence of four natural requirements when the Dirichlet parameter α is continuous. Namely, the updated partial posterior should be calculable from knowledge of only the data and the partial prior; it should be faithful to the full posterior distribution; it should assign positive probability to every observed event {X_i}; and it should not assign probability to unobserved events not included in the partial prior specification.
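
For orientation only (a sketch, not taken from the article): the shrinkage behaviour mentioned in the abstract can be illustrated with the standard Dirichlet-process identity that the posterior expected mean is a weighted average of the prior guess and the sample mean. The function name and parameters below are illustrative, assuming a base measure with total mass alpha(R) and a known prior mean.

```python
import numpy as np

def dp_posterior_mean(x, prior_mean, prior_mass):
    """Posterior expected mean under a Dirichlet-process prior (illustrative sketch).

    x          : 1-d array of observations
    prior_mean : mean of the base measure (prior guess at the mean)
    prior_mass : total mass alpha(R) of the base measure (prior weight)
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    w = prior_mass / (prior_mass + n)          # weight given to the prior guess
    # The estimate "shrinks" the sample mean toward the prior mean.
    return w * prior_mean + (1.0 - w) * x.mean()

# Example: 10 observations with true mean 2; a prior guess of 0 with mass 5
# pulls the estimate one third of the way from the sample mean toward 0.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=10)
print(dp_posterior_mean(x, prior_mean=0.0, prior_mass=5.0))
```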
