Open Access
Information Measures via Copula Functions
Author(s) - R. Mohtashami Borzadaran, Mohammad Amini
Publication year - 2010
Publication title - Journal of Statistical Research of Iran
Language(s) - English
Resource type - Journals
ISSN - 1735-1294
DOI - 10.18869/acadpub.jsri.7.1.47
Subject(s) - Hellinger distance , copula (statistics) , mathematics , Kullback–Leibler divergence , divergence (statistics) , parametric statistics , inference , differential entropy , nonparametric statistics , statistics , econometrics , joint entropy , principle of maximum entropy
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, this paper examines measures such as the Kullback–Leibler information, J-divergence, Hellinger distance, and α-divergence. Properties and results concerning distances between probability distributions are derived via copula functions, and several inequalities are obtained relating dependence and information measures.
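As a concrete illustration (not taken from the paper), the sketch below numerically evaluates two of the divergences named in the abstract for a pair of normal densities, and computes mutual information as the Kullback–Leibler divergence between a bivariate Gaussian joint density and the product of its marginals, a quantity that depends only on the underlying copula. The specific densities and the correlation ρ = 0.5 are arbitrary illustrative choices.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Two univariate normal densities (arbitrary illustrative choice).
p = stats.norm(0.0, 1.0).pdf
q = stats.norm(1.0, 1.0).pdf

# Kullback-Leibler information KL(p || q) by numerical integration.
# Closed form for N(0,1) vs N(1,1): (mu_p - mu_q)^2 / 2 = 0.5.
kl, _ = quad(lambda x: p(x) * np.log(p(x) / q(x)), -10.0, 10.0)

# Hellinger distance H(p, q) = sqrt(0.5 * int (sqrt(p) - sqrt(q))^2 dx).
# Closed form here: sqrt(1 - exp(-1/8)).
h2, _ = quad(lambda x: (np.sqrt(p(x)) - np.sqrt(q(x))) ** 2, -10.0, 10.0)
hellinger = np.sqrt(0.5 * h2)

# Mutual information of a bivariate Gaussian with correlation rho,
# computed as KL(joint || product of marginals) on a simple grid
# (Riemann approximation).  Closed form: -0.5 * log(1 - rho^2).
rho = 0.5
joint = stats.multivariate_normal(mean=[0.0, 0.0],
                                  cov=[[1.0, rho], [rho, 1.0]])
xs = np.linspace(-6.0, 6.0, 601)
X, Y = np.meshgrid(xs, xs)
pj = joint.pdf(np.dstack([X, Y]))          # joint density on the grid
pm = stats.norm.pdf(X) * stats.norm.pdf(Y) # product of the marginals
dx = xs[1] - xs[0]
mi = np.sum(pj * np.log(pj / pm)) * dx * dx

print(kl, hellinger, mi)
```

Because the copula of a joint distribution is invariant under strictly increasing transformations of the marginals, the mutual information above is exactly the (negative) entropy of the Gaussian copula with parameter ρ, which is one way such information measures connect to copula functions.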
