Open Access
Estimation of Information Measures for Power-Function Distribution in Presence of Outliers and Their Applications
Author(s) -
Amal S. Hassan,
El-Sayed A. El-Sherpieny,
Rokaya Elmorsy Mohamed
Publication year - 2021
Publication title -
Journal of ICT
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.217
H-Index - 10
eISSN - 2180-3862
pISSN - 1675-414X
DOI - 10.32890/jict2022.21.1.1
Subject(s) - outlier , prior probability , estimator , statistics , bayesian average , bayesian probability , mathematics , mean squared error , bayes estimator , principle of maximum entropy , gibbs sampling , monte carlo method , algorithm , computer science , bayesian statistics , bayesian inference
The measure of entropy plays an undeniably pivotal role in information theory. This article estimates the Rényi and q-entropies of the power function distribution in the presence of s outliers. The maximum likelihood estimators, as well as the Bayesian estimators under uniform and gamma priors, are derived. The proposed Bayesian estimators of the entropies are obtained under both symmetric and asymmetric loss functions. These estimators are computed empirically via Monte Carlo simulation based on Gibbs sampling. The study showed that the precision of the maximum likelihood and Bayesian estimates of both entropy measures improves as the sample size grows, and that the estimates of both entropies increase with the number of outliers. Further, the Bayesian estimates of the Rényi and q-entropies under the squared error loss function are preferable to those under the other loss functions in most cases. Finally, real data examples are analyzed to illustrate the theoretical results.
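The plug-in route described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, assuming the standard power function density f(x) = αx^(α−1)/β^α on (0, β) and the plain homogeneous-sample ML estimators (β̂ = sample maximum, α̂ = n / Σ log(β̂/xᵢ)); the paper's s-outlier likelihood and the Bayesian/Gibbs machinery are not reproduced here. The closed forms for the Rényi entropy of order δ and the (Tsallis-type) q-entropy follow from integrating f^δ over (0, β).

```python
import math
import random

def power_function_sample(n, alpha, beta, seed=0):
    """Draw n values from the power function distribution with
    CDF F(x) = (x / beta)**alpha on (0, beta), via inverse-CDF sampling."""
    rng = random.Random(seed)
    return [beta * rng.random() ** (1.0 / alpha) for _ in range(n)]

def power_function_mle(x):
    """ML estimates for a homogeneous (no-outlier) sample:
    beta_hat = max(x);  alpha_hat = n / sum(log(beta_hat / x_i))."""
    beta_hat = max(x)
    alpha_hat = len(x) / sum(math.log(beta_hat / xi) for xi in x)
    return alpha_hat, beta_hat

def renyi_entropy(alpha, beta, delta):
    """Closed-form Renyi entropy of order delta (delta > 0, delta != 1):
    H = (1/(1-delta)) * log(alpha**delta * beta**(1-delta) / (delta*(alpha-1)+1))."""
    return (delta * math.log(alpha) + (1.0 - delta) * math.log(beta)
            - math.log(delta * (alpha - 1.0) + 1.0)) / (1.0 - delta)

def q_entropy(alpha, beta, q):
    """Closed-form q-entropy (Tsallis form), q > 0, q != 1:
    H_q = (1 - alpha**q * beta**(1-q) / (q*(alpha-1)+1)) / (q - 1)."""
    return (1.0 - alpha ** q * beta ** (1.0 - q)
            / (q * (alpha - 1.0) + 1.0)) / (q - 1.0)

# Plug-in ML estimation of the entropies from simulated data.
data = power_function_sample(5000, alpha=2.0, beta=3.0)
a_hat, b_hat = power_function_mle(data)
h_renyi = renyi_entropy(a_hat, b_hat, delta=0.5)
h_q = q_entropy(a_hat, b_hat, q=2.0)
```

As a sanity check, α = 1 reduces the model to the uniform distribution on (0, β), for which the Rényi entropy equals log β at every order.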
