Estimation of the log‐normal mean
Author(s) - Zhou XiaoHua
Publication year - 1998
Publication title - Statistics in Medicine
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.996
H-Index - 183
eISSN - 1097-0258
pISSN - 0277-6715
DOI - 10.1002/(sici)1097-0258(19981015)17:19<2251::aid-sim925>3.0.co;2-w
Subject(s) - mean squared error , estimator , minimum variance unbiased estimator , efficient estimator , mathematics , statistics , bias of an estimator , consistent estimator , minimum mean square error , stein's unbiased risk estimate , trimmed estimator , invariant estimator
The most commonly used estimator for a log‐normal mean is the sample mean. In this paper, we show that this estimator can have a large mean square error, even for large samples. We then study three alternative estimators: (i) a uniformly minimum variance unbiased (UMVU) estimator; (ii) a maximum likelihood (ML) estimator; (iii) a conditionally minimal mean square error (MSE) estimator. We find that the conditionally minimal MSE estimator has the smallest mean square error among the four estimators considered here, regardless of the sample size and the skewness of the log‐normal population. However, for large samples (n ⩾ 200), the UMVU estimator, the ML estimator, and the conditionally minimal MSE estimator have very similar mean square errors. Since the ML estimator is the easiest to compute among these three estimators, for large samples we recommend the use of the ML estimator. For small to moderate samples, we recommend the use of the conditionally minimal MSE estimator. © 1998 John Wiley & Sons, Ltd.
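
As a brief illustration of two of the estimators named in the abstract (this is a sketch, not code from the paper), the snippet below contrasts the sample mean with the ML estimator of a log-normal mean, exp(μ̂ + σ̂²/2), where μ̂ and σ̂² are the maximum likelihood estimates of the mean and variance of log X. The function name and the simulated data are purely illustrative.

```python
import numpy as np

def lognormal_mean_estimators(x):
    """Return (sample mean, ML estimator) of a log-normal mean.

    Assumes x contains positive observations from a log-normal population.
    """
    x = np.asarray(x, dtype=float)
    log_x = np.log(x)
    mu_hat = log_x.mean()
    sigma2_hat = log_x.var()  # 1/n variance of log(x), i.e. the MLE of sigma^2
    sample_mean = x.mean()
    ml_mean = np.exp(mu_hat + sigma2_hat / 2.0)  # E[X] = exp(mu + sigma^2 / 2)
    return sample_mean, ml_mean

# Illustrative usage with simulated data (true mean = exp(0 + 1/2) ≈ 1.649)
rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=50)
print(lognormal_mean_estimators(x))
```

For skewed log-normal data and small n, the two values can differ noticeably; the paper's comparison of their mean square errors (together with the UMVU and conditionally minimal MSE estimators) motivates its recommendations.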
