On Bayesian selection of the best normal population using the Kullback–Leibler divergence measure
Author(s) -
Thabane L.,
Safiul Haq M.
Publication year - 1999
Publication title -
Statistica Neerlandica
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.52
H-Index - 39
eISSN - 1467-9574
pISSN - 0039-0402
DOI - 10.1111/1467-9574.00116
Subject(s) - Kullback–Leibler divergence , Bayesian probability , multivariate statistics , selection , population , statistics , mathematics
In this paper, we use the Bayesian approach to study the problem of selecting the best population among k different populations π1, ..., πk (k ≥ 2) relative to some standard (or control) population π0. Here, π0 is considered to be the population with the desired characteristics. The best population is defined to be the one which is closest to the ideal population π0. The procedure uses the idea of minimizing the posterior expected value of the Kullback–Leibler (KL) divergence measure of πi from π0. The populations under consideration are assumed to be multivariate normal. An application to regression problems is also presented. Finally, a numerical example using a real data set is provided to illustrate the implementation of the selection procedure.
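The full paper is not reproduced here, so the following is only a minimal sketch of the selection rule the abstract describes: score each candidate population by the posterior expected KL divergence from the control π0 and pick the minimizer. The closed-form KL divergence between multivariate normals is standard; the divergence direction KL(πi ‖ π0), the use of Monte Carlo posterior draws, and all function names and numbers below are illustrative assumptions, not the authors' derivation (the abstract does not say whether the posterior expectation is computed analytically or by sampling).

```python
import numpy as np

def kl_mvn(mu_i, cov_i, mu_0, cov_0):
    """Closed-form KL( N(mu_i, cov_i) || N(mu_0, cov_0) ) for d-variate normals:
    0.5 * [ tr(S0^-1 Si) + (m0 - mi)' S0^-1 (m0 - mi) - d + ln(det S0 / det Si) ].
    """
    d = mu_0.shape[0]
    cov0_inv = np.linalg.inv(cov_0)
    diff = mu_0 - mu_i
    _, logdet_0 = np.linalg.slogdet(cov_0)
    _, logdet_i = np.linalg.slogdet(cov_i)
    return 0.5 * (np.trace(cov0_inv @ cov_i) + diff @ cov0_inv @ diff
                  - d + (logdet_0 - logdet_i))

def posterior_expected_kl(post_draws, mu_0, cov_0):
    """Monte Carlo estimate of E[ KL(pi_i || pi_0) | data ] from posterior
    draws (mu, Sigma) for population i (a sampling stand-in for the paper's
    posterior expectation)."""
    return np.mean([kl_mvn(m, s, mu_0, cov_0) for m, s in post_draws])

# Toy illustration with hypothetical posterior draws for two candidates:
rng = np.random.default_rng(1)
mu_0, cov_0 = np.zeros(2), np.eye(2)
draws = {
    1: [(rng.normal(0.1, 0.05, 2), np.eye(2)) for _ in range(500)],
    2: [(rng.normal(1.0, 0.05, 2), np.eye(2)) for _ in range(500)],
}
# Selection rule: the best population minimizes the posterior expected divergence.
best = min(draws, key=lambda i: posterior_expected_kl(draws[i], mu_0, cov_0))
print(best)  # population 1 sits closer to the control pi_0, so it is selected
```

The design point the abstract emphasizes is that "closeness" to the ideal population is judged through the posterior distribution of the unknown parameters, so the score accounts for estimation uncertainty rather than comparing plug-in estimates alone.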
