Use of Kullback–Leibler divergence for forgetting
Author(s) - Kárný Miroslav, Andrýsek Josef
Publication year - 2009
Publication title - International Journal of Adaptive Control and Signal Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.73
H-Index - 66
eISSN - 1099-1115
pISSN - 0890-6327
DOI - 10.1002/acs.1080
Subject(s) - Kullback–Leibler divergence , forgetting , estimator , exponential family , probability density function , parametric statistics , mathematics , mathematical optimization , statistics , computer science
The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs, and his methodological result also implies the proper order of the KLD arguments. Functional approximation of estimation and stabilized forgetting, which serve for tracking slowly varying parameters, use the reversed order. This choice has a pragmatic motivation: a recursive estimator often approximates the parametric model by a member of the exponential family (EF), since such a model maps prior pdfs from the set of conjugate pdfs (CEF) back to the CEF. Approximations based on the KLD with the reversed order of arguments preserve this property. In this paper, approximation performed within the CEF but with the proper order of the KLD arguments is advocated. It is applied to parameter tracking, and performance improvements are demonstrated. This practical result is important for adaptive systems and opens a way to improving the functional approximation. Copyright © 2008 John Wiley & Sons, Ltd.
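For orientation, the two argument orders induce different forgetting rules even when everything stays within the CEF: minimizing the weighted reversed-order KLD over the combined pdf yields the geometric mean of the two pdfs (classical exponential forgetting), while the proper order yields their arithmetic mixture, which must then be projected back onto the CEF by moment matching. The sketch below is a minimal illustration for a scalar Gaussian posterior, not code from the paper; the function names, parameter values, and the Gaussian projection step are illustrative assumptions.

```python
# Minimal sketch (not from the paper): two forgetting rules for a scalar
# Gaussian posterior N(m, v) combined with an alternative pdf N(m0, v0).
# lam is the forgetting factor in (0, 1]; all names are illustrative.

def forget_reversed_kld(m, v, m0, v0, lam):
    """Reversed-order KLD (exponential forgetting): the minimizer of
    lam*D(p||p1) + (1-lam)*D(p||p2) is the geometric mean p1^lam * p2^(1-lam).
    For Gaussians this is again Gaussian: precisions combine linearly."""
    prec = lam / v + (1.0 - lam) / v0                  # combined precision
    mean = (lam * m / v + (1.0 - lam) * m0 / v0) / prec
    return mean, 1.0 / prec

def forget_proper_kld(m, v, m0, v0, lam):
    """Proper-order KLD: the unconstrained minimizer of
    lam*D(p1||p) + (1-lam)*D(p2||p) is the mixture lam*p1 + (1-lam)*p2.
    Projecting the mixture back onto the Gaussian (CEF) family by minimizing
    D(mixture||q) reduces to matching the mixture's mean and variance."""
    mean = lam * m + (1.0 - lam) * m0
    # mixture variance via the law of total variance
    var = lam * (v + m**2) + (1.0 - lam) * (v0 + m0**2) - mean**2
    return mean, var

if __name__ == "__main__":
    m, v = 2.0, 0.5        # current posterior N(2, 0.5)
    m0, v0 = 0.0, 4.0      # flat alternative pdf N(0, 4)
    lam = 0.9              # forgetting factor
    print("reversed-order KLD:", forget_reversed_kld(m, v, m0, v0, lam))
    print("proper-order KLD:  ", forget_proper_kld(m, v, m0, v0, lam))
```

In this toy setting the proper-order rule inflates the variance more aggressively (the mixture variance includes the spread between the two means), which is consistent with the paper's point that the order of the KLD arguments materially changes the tracking behavior.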