A correspondence principle for relative entropy minimization
Author(s) -
Donald E. Brown,
Robert L. Smith
Publication year - 1990
Publication title -
Naval Research Logistics (NRL)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.665
H-Index - 68
eISSN - 1520-6750
pISSN - 0894-069X
DOI - 10.1002/1520-6750(199004)37:2<191::aid-nav3220370202>3.0.co;2-c
Subject(s) - Kullback–Leibler divergence , mathematics , principle of maximum entropy , probability distribution , maximum entropy probability distribution , inference , minimization , entropy (arrow of time) , entropy maximization , joint probability distribution , mathematical optimization , computer science , statistics , artificial intelligence , physics , quantum mechanics
Relative entropy minimization has been proposed as an inference method for problems in which the available information takes the form of constraints on the underlying probability model. We provide a theoretical justification for this procedure through a correspondence principle. In particular, for a convex constraint set Λ, we show that as the number of trials increases, the empirical distribution of samples drawn from a discrete probability distribution p, conditioned to lie within Λ, becomes arbitrarily close with high probability to the distribution in Λ that minimizes the relative entropy with respect to p.
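The minimization the abstract refers to can be sketched numerically. The example below is a hypothetical illustration, not the paper's construction: the convex set Λ is taken to be the distributions with a prescribed mean c, the minimizer of D(q‖p) over Λ is found by generic constrained optimization, and the result is cross-checked against the known exponential-tilting form of the solution for a mean constraint (q_i ∝ p_i·exp(θx_i), with θ chosen so the mean equals c). All names and numbers here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize, brentq

# Hypothetical setup: p is a discrete distribution on outcomes x, and
# Lambda = { q : E_q[X] = c } is a convex constraint set.
x = np.array([0.0, 1.0, 2.0, 3.0])
p = np.array([0.4, 0.3, 0.2, 0.1])
c = 2.0  # target mean, strictly inside the range of the support

def rel_entropy(q):
    # D(q || p) = sum_i q_i log(q_i / p_i), with the 0 log 0 = 0 convention
    q = np.clip(q, 1e-12, None)
    return np.sum(q * np.log(q / p))

# Minimize D(q || p) subject to q being a distribution with mean c.
cons = [
    {"type": "eq", "fun": lambda q: q.sum() - 1.0},  # probabilities sum to 1
    {"type": "eq", "fun": lambda q: q @ x - c},      # mean constraint (Lambda)
]
res = minimize(rel_entropy, p, method="SLSQP",
               bounds=[(0.0, 1.0)] * len(p), constraints=cons)
q_num = res.x

# Cross-check: for a mean constraint, the minimizer is an exponential tilt
# of p; solve for the tilt parameter theta that matches the target mean.
def tilted_mean_gap(theta):
    w = p * np.exp(theta * x)
    return (w / w.sum()) @ x - c

theta = brentq(tilted_mean_gap, -20.0, 20.0)
w = p * np.exp(theta * x)
q_closed = w / w.sum()

print("numerical :", np.round(q_num, 4))
print("closed-form:", np.round(q_closed, 4))
```

The correspondence principle then says this q is also the long-run limit of the empirical distribution of i.i.d. draws from p, conditioned on the (rare) event that the empirical mean equals c, which is why the minimizer is a natural inference target.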