Analyzing bin-width effect on the computed entropy
Author(s) -
Sri Purwani,
Julita Nahar,
Carole Twining
Publication year - 2017
Publication title -
AIP Conference Proceedings
Language(s) - English
Resource type - Conference proceedings
SCImago Journal Rank - 0.177
H-Index - 75
eISSN - 1551-7616
pISSN - 0094-243X
DOI - 10.1063/1.4995123
Subject(s) - maximum entropy probability distribution , joint entropy , mathematics , entropy (arrow of time) , rényi entropy , maximum entropy spectral estimation , bin , entropy rate , joint quantum entropy , maximum entropy thermodynamics , min entropy , principle of maximum entropy , information theory , randomness , differential entropy , upper and lower bounds , gaussian , algorithm , statistics , mathematical analysis , physics , quantum mechanics
The Shannon entropy is a mathematical expression for quantifying the amount of randomness, and it can be used to measure information content. It is commonly used in objective functions. Mutual Information (MI) uses the Shannon entropy to determine the shared information content of two images. The Shannon entropy, originally derived by Shannon in the context of lossless encoding of messages, is also used to define an optimum message length in the Minimum Description Length (MDL) principle for groupwise registration. The majority of papers use histograms to compute MI, and hence the entropy. We therefore aim to analyze the effect of bin-width on the computed entropy. We first derived the Shannon entropy from the integral of the probability density function (pdf), and found that the Gaussian has maximum entropy over all possible distributions with a given variance. We also show that the entropy of the flat distribution is less than the entropy of the Gaussian distribution with the same variance. We then investigated the bin-width effe...
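The abstract's two quantitative claims can be checked directly. For a flat (uniform) distribution with variance sigma^2 the differential entropy is (1/2) ln(12 sigma^2), while for a Gaussian with the same variance it is (1/2) ln(2 pi e sigma^2); since 2 pi e is approximately 17.1 > 12, the Gaussian entropy is larger. The bin-width effect on a histogram-based entropy can likewise be illustrated with a short numerical sketch. The sketch below is not taken from the paper; NumPy, the helper name histogram_entropy, and the chosen bin-widths are illustrative assumptions. It shows that the discrete entropy of binned Gaussian samples tracks the differential entropy shifted by -ln(bin-width).

    # A minimal sketch (assumptions noted above) of how histogram bin-width
    # shifts the computed Shannon entropy of Gaussian samples.
    import numpy as np

    def histogram_entropy(samples, bin_width):
        """Shannon entropy (in nats) of a histogram estimate of the pdf."""
        lo, hi = samples.min(), samples.max()
        edges = np.arange(lo, hi + bin_width, bin_width)
        counts, _ = np.histogram(samples, bins=edges)
        p = counts / counts.sum()        # bin probabilities
        p = p[p > 0]                     # drop empty bins to avoid log(0)
        return -np.sum(p * np.log(p))    # discrete entropy of the binned data

    rng = np.random.default_rng(0)
    sigma = 1.0
    samples = rng.normal(0.0, sigma, size=100_000)

    # Differential entropy of a Gaussian: 0.5 * ln(2 * pi * e * sigma^2)
    h_gauss = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

    for w in (0.01, 0.1, 0.5, 1.0):
        # For small bin-widths the discrete entropy is close to
        # h_gauss - ln(w), so it shifts as the bin-width changes.
        print(f"bin-width {w:4.2f}: H = {histogram_entropy(samples, w):.3f}, "
              f"h(Gauss) - ln(w) = {h_gauss - np.log(w):.3f}")

Running the sketch shows the computed entropy growing by roughly ln(10) each time the bin-width shrinks by a factor of ten, which is the dependence on bin choice that the paper sets out to analyze.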