Open Access
A Note on Bound for Jensen-Shannon Divergence by Jeffreys
Author(s) - Takuya Yamano
Publication year - 2014
Language(s) - English
Resource type - Conference proceedings
DOI - 10.3390/ecea-1-b002
Subject(s) - divergence (linguistics) , upper and lower bounds , mathematics , kullback–leibler divergence , combinatorics , information theory , discrete mathematics , binary number , probability distribution , statistics , mathematical analysis , philosophy , arithmetic , linguistics
The Jensen-Shannon divergence JS(p;q) is a similarity measure between two probability distributions p and q, and it is now used across many disciplines. In this presentation, we provide a lower bound on the Jensen-Shannon divergence in terms of Jeffreys' J-divergence when p_i ≥ q_i is satisfied. In Lin's original paper, the upper bound in terms of the J-divergence was one quarter of it. Recently, a sharper bound was reported by Crooks. We discuss upper bounds given by transcendental functions of the Jeffreys divergence by comparing their values for a binary distribution.
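The quantities discussed in the abstract can be checked numerically. The sketch below (an illustration, not the paper's own code) uses the standard definitions: JS(p;q) = ½KL(p‖m) + ½KL(q‖m) with m = (p+q)/2, and Jeffreys' J(p;q) = KL(p‖q) + KL(q‖p), then verifies Lin's upper bound JS ≤ J/4 for a binary distribution; the particular values of p and q are arbitrary examples.

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence in nats; terms with p_i = 0 contribute 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen-Shannon divergence via the mixture m = (p + q) / 2
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jeffreys(p, q):
    # Jeffreys' J-divergence: symmetrized Kullback-Leibler divergence
    return kl(p, q) + kl(q, p)

# Arbitrary binary distributions chosen for illustration
p = [0.8, 0.2]
q = [0.3, 0.7]

print("JS  =", js(p, q))
print("J/4 =", jeffreys(p, q) / 4)
# Lin's bound: JS(p;q) <= J(p;q)/4
assert js(p, q) <= jeffreys(p, q) / 4
```

Sweeping q over a grid of binary distributions shows the same inequality holds throughout, which is the comparison the presentation carries out against the sharper transcendental bounds.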
