Parameter identifiability with Kullback–Leibler information divergence criterion
Author(s) - Chen Badong, Hu Jinchun, Zhu Yu, Sun Zengqi
Publication year - 2009
Publication title - International Journal of Adaptive Control and Signal Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.73
H-Index - 66
eISSN - 1099-1115
pISSN - 0890-6327
DOI - 10.1002/acs.1078
Subject(s) - identifiability , mathematics , fisher information , divergence (linguistics) , estimation theory , consistency (knowledge bases) , information criteria , kullback–leibler divergence , statistics , mathematical optimization , model selection , discrete mathematics , philosophy , linguistics
We study the problem of parameter identifiability under the Kullback–Leibler information divergence (KLID) criterion. KLID‐identifiability is defined and related to several other notions of identifiability, such as identifiability under the Fisher information matrix criterion, identifiability under the least‐squares criterion, and identifiability under the spectral density criterion. We also establish a simple check criterion for Gaussian processes and derive an upper bound on the minimal identifiable horizon of a Markov process. Furthermore, we define asymptotic KLID‐identifiability and prove that, under certain constraints, KLID‐identifiability is a sufficient or a necessary condition for asymptotic KLID‐identifiability. The consistency of several parameter estimation methods is also discussed. Copyright © 2008 John Wiley & Sons, Ltd.
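The abstract does not reproduce the formal definitions. The following LaTeX sketch shows the standard Kullback–Leibler divergence between the output densities induced by two parameter values, a plausible form of the KLID‐identifiability condition suggested by the abstract, and the closed‐form Gaussian divergence that underlies simple check criteria for Gaussian processes; the notation ($\theta$, $p_N$, $\Theta$, $k$) is assumed here and the paper's exact definitions may differ.

% Sketch (assumption): KLID over a data horizon of length N between the
% output densities induced by parameter values \theta_1 and \theta_2.
\[
  D_{N}\!\left(\theta_1 \,\|\, \theta_2\right)
  = \int p_N\!\left(y^N \mid \theta_1\right)
    \log \frac{p_N\!\left(y^N \mid \theta_1\right)}{p_N\!\left(y^N \mid \theta_2\right)}
    \,\mathrm{d}y^N .
\]
% A parameter point \theta^{*} is then KLID-identifiable (in the sense
% suggested by the abstract) if the divergence separates it from every
% other admissible parameter in the set \Theta:
\[
  D_{N}\!\left(\theta^{*} \,\|\, \theta\right) = 0
  \;\Longrightarrow\; \theta = \theta^{*},
  \qquad \forall\, \theta \in \Theta .
\]
% For Gaussian output densities the divergence has the familiar closed form
% (k denotes the dimension of the observation vector):
\[
  D\!\left(\mathcal{N}(\mu_1,\Sigma_1) \,\|\, \mathcal{N}(\mu_2,\Sigma_2)\right)
  = \tfrac{1}{2}\!\left(
      \operatorname{tr}\!\left(\Sigma_2^{-1}\Sigma_1\right)
      + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1)
      - k + \log \frac{\det \Sigma_2}{\det \Sigma_1}
    \right).
\]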