JAMES‐STEIN RULE ESTIMATORS IN LINEAR REGRESSION MODELS WITH MULTIVARIATE‐t DISTRIBUTED ERROR
Author(s) - Singh, Radhey S.
Publication year - 1991
Publication title -
australian journal of statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 41
eISSN - 1467-842X
pISSN - 0004-9581
DOI - 10.1111/j.1467-842x.1991.tb00422.x
Subject(s) - estimator , mathematics , statistics , multivariate statistics , mean squared error , variance function , minimum variance unbiased estimator , covariance matrix , multivariate normal distribution
Summary - This paper considers estimation of β in the regression model y = Xβ + u, where the error components in u have a jointly multivariate Student-t distribution. A family of James-Stein type estimators (characterised by nonstochastic scalars) is presented. Sufficient conditions involving only X are given under which these estimators are better, with respect to risk under a general quadratic loss function, than the usual minimum variance unbiased estimator (MVUE) of β. Approximate expressions for the bias, the risk, the mean square error matrix and the variance-covariance matrix of the estimators in this family are obtained. A necessary and sufficient condition for the dominance of this family over the MVUE is also given.
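The record does not reproduce the paper's estimator family or its dominance conditions. As a rough illustration only, the sketch below implements the classical Stein-rule shrinkage of the least-squares (MVUE under normality) estimate, characterised by a nonstochastic scalar c; the exact form used in the paper, the admissible range of c, and the conditions on X may differ. The simulated Student-t errors are purely an assumed setup for the demo.

```python
import numpy as np

def stein_rule_estimator(X, y, c):
    """Shrink the OLS estimate of beta toward zero (generic Stein-rule form).

    beta_SR = [1 - c * e'e / (b' X'X b)] * b,
    where b is the OLS estimate, e the residual vector, and c a
    nonstochastic scalar characterising the estimator.
    """
    b = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS / MVUE estimate
    e = y - X @ b                                     # residual vector
    shrink = 1.0 - c * (e @ e) / (b @ (X.T @ X) @ b)  # shrinkage factor
    return shrink * b

# Usage on simulated data with heavy-tailed (Student-t) errors -- assumed values
rng = np.random.default_rng(0)
n, p = 50, 4
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, -0.3, 0.0])
u = rng.standard_t(df=5, size=n)                      # heavy-tailed error components
y = X @ beta + u
print(stein_rule_estimator(X, y, c=0.1))
```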
