Open Access
On information gain, Kullback-Leibler divergence, entropy production and the involution kernel
Author(s) - Artur O. Lopes, Jairo K. Mengue
Publication year - 2022
Publication title - Discrete and Continuous Dynamical Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.289
H-Index - 70
eISSN - 1553-5231
pISSN - 1078-0947
DOI - 10.3934/dcds.2022026
Subject(s) - Mathematics, Kullback-Leibler divergence, Combinatorics, Statistics
It is well known that in Information Theory and Machine Learning the Kullback-Leibler divergence, which extends the concept of Shannon entropy, plays a fundamental role. Given an a priori probability kernel $ \hat{\nu} $ and a probability $ \pi $ on the measurable space $ X\times Y $, we consider an appropriate definition of entropy of $ \pi $ relative to $ \hat{\nu} $, which is based on previous works. Using this concept of entropy, we obtain a natural definition of information gain for general measurable spaces which coincides with the mutual information given by the K-L divergence in the case where $ \hat{\nu} $ is identified with a probability $ \nu $ on $ X $. This is used to extend the notions of specific information gain and dynamical entropy production to the model of thermodynamic formalism for symbolic dynamics over a compact alphabet (TFCA model). Via the concepts of involution kernel and dual potential, one can ask whether a given potential is symmetric: the relevant information is available in the potential itself. In the affirmative case, its corresponding equilibrium state has zero entropy production.
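To make the classical case referred to in the abstract concrete: when $ X $ and $ Y $ are finite and $ \hat{\nu} $ is identified with the marginal of $ \pi $ on $ X $, the information gain reduces to the usual mutual information, i.e. the K-L divergence between the joint distribution and the product of its marginals. A minimal NumPy sketch of that discrete computation (the joint distributions below are hypothetical, chosen only for illustration):

```python
import numpy as np

def mutual_information(pi):
    """Mutual information of a discrete joint distribution pi on X x Y
    (rows indexed by X, columns by Y), computed as the Kullback-Leibler
    divergence D(pi || pi_X (x) pi_Y)."""
    px = pi.sum(axis=1, keepdims=True)   # marginal on X
    py = pi.sum(axis=0, keepdims=True)   # marginal on Y
    prod = px * py                       # product of the marginals
    mask = pi > 0                        # 0 * log 0 is taken to be 0
    return float(np.sum(pi[mask] * np.log(pi[mask] / prod[mask])))

# Independent joint: pi equals the product of its marginals, so the
# divergence (information gain) is zero.
pi_indep = np.array([[0.25, 0.25],
                     [0.25, 0.25]])
print(mutual_information(pi_indep))   # 0.0

# Correlated joint: strictly positive information gain.
pi_corr = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
print(mutual_information(pi_corr))
```

This is only the finite-alphabet special case; the point of the paper is that the kernel-based definition of entropy extends this quantity to general measurable spaces and to the TFCA model.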
