Distributions of the Kullback–Leibler divergence with applications
Author(s) -
Belov Dmitry I.,
Armstrong Ronald D.
Publication year - 2011
Publication title -
British Journal of Mathematical and Statistical Psychology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.157
H-Index - 51
eISSN - 2044-8317
pISSN - 0007-1102
DOI - 10.1348/000711010X522227
Subject(s) - Kullback–Leibler divergence , distribution (mathematics) , mathematics , statistics , statistical physics , econometrics , mathematical analysis , physics
The Kullback–Leibler divergence (KLD) is a widely used measure of the fit between two distributions. In general, the distribution of the KLD itself is unknown. Under reasonable assumptions, common in psychometrics, the KLD is shown to be asymptotically distributed as a scaled (non‐central) chi‐square with one degree of freedom or as a scaled (doubly non‐central) F. Applications of the KLD for detecting heterogeneous response data are discussed, with particular emphasis on test security.
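The KLD between two discrete distributions can be sketched as follows. This is a minimal illustration of the standard definition D(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ), not the authors' psychometric application; the function name and example vectors are chosen for illustration only.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Both p and q must be probability vectors of the same length, with
    q[i] > 0 wherever p[i] > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p[i] == 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Identical distributions have zero divergence.
p = [0.5, 0.3, 0.2]
print(kl_divergence(p, p))  # 0.0

# The KLD is asymmetric: D(p || q) != D(q || p) in general,
# which is why it measures fit rather than serving as a metric.
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q), kl_divergence(q, p))
```

Note the asymmetry in the last two values: unlike a distance metric, the KLD depends on which distribution is taken as the reference.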
