Theory and Methods: On Minimum Distance Estimation Using Kolmogorov‐Lévy Type Metrics
Author(s) - Kozek Andrzej S.
Publication year - 1998
Publication title - Australian and New Zealand Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 41
eISSN - 1467-842X
pISSN - 1369-1473
DOI - 10.1111/1467-842x.00036
Subject(s) - mathematics , kolmogorov–smirnov test , estimator , independent and identically distributed random variables , combinatorics , type (biology) , infimum and supremum , random variable , differentiable function
Let $X_1, \ldots, X_n$ be independent identically distributed random variables with a common continuous (cumulative) distribution function (d.f.) $F$, and let $\hat F_n$ be the empirical d.f. (e.d.f.) based on $X_1, \ldots, X_n$. Let $G$ be a smooth d.f. and $G_\theta = G(\cdot - \theta)$ its translation through $\theta \in \mathbb{R}$. Using a Kolmogorov-Lévy type metric $\rho_\alpha$ defined on the space of d.f.s on $\mathbb{R}$, the paper derives both null and non-null limiting distributions of $\sqrt{n}\,[\rho_\alpha(\hat F_n, G_{\hat\theta_n}) - \rho_\alpha(F, G_\theta)]$, of $\sqrt{n}\,(\hat\theta_n - \theta)$ and of $\sqrt{n}\,\rho_\alpha(G_{\hat\theta_n}, G_\theta)$, where $\hat\theta_n$ and $\theta$ are the minimum $\rho_\alpha$-distance parameters for $\hat F_n$ and $F$ from $G$, respectively. These distributions are known explicitly in important particular cases; together with complementary Monte Carlo simulations, they clarify the behaviour of estimation by minimum distance methods with supremum type metrics. We advocate the use of the minimum distance method with supremum type metrics under non-null models. The resulting functionals are Hadamard differentiable and efficient. For small scale parameters the minimum distance functionals are close to the medians of the parent distributions. Optimal small scale models yield minimum distance estimators whose asymptotic variances are competitive with those of the best known robust estimators.
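The abstract describes the estimator only in words; the sketch below illustrates how a minimum $\rho_\alpha$-distance location estimate can be computed in practice. It assumes the usual Kolmogorov-Lévy family $\rho_\alpha(F, G) = \inf\{\varepsilon > 0 : G(x - \alpha\varepsilon) - \varepsilon \le F(x) \le G(x + \alpha\varepsilon) + \varepsilon \ \text{for all } x\}$, which reduces to the Kolmogorov metric at $\alpha = 0$ and the Lévy metric at $\alpha = 1$; the function names, the bisection and grid-search details, and the normal-location example are illustrative choices, not the paper's implementation.

```python
# Minimal sketch (not the paper's code): minimum rho_alpha-distance location
# estimation, assuming the Kolmogorov-Levy family
#   rho_alpha(F, G) = inf{eps > 0 : G(x - a*eps) - eps <= F(x) <= G(x + a*eps) + eps}.
import numpy as np
from scipy.stats import norm

def rho_alpha(sample, G, alpha, tol=1e-8):
    """Kolmogorov-Levy type distance between the e.d.f. of `sample` and the d.f. G."""
    x = np.sort(sample)
    n = x.size
    F_right = np.arange(1, n + 1) / n   # e.d.f. value at/right of each order statistic
    F_left = np.arange(0, n) / n        # e.d.f. value just left of each order statistic

    def violated(eps):
        # The sandwich condition only needs to be checked at the e.d.f. jump points.
        upper_ok = np.all(F_right <= G(x + alpha * eps) + eps + tol)
        lower_ok = np.all(F_left >= G(x - alpha * eps) - eps - tol)
        return not (upper_ok and lower_ok)

    lo, hi = 0.0, 1.0                   # eps = 1 always satisfies the condition
    for _ in range(60):                 # bisection for the infimum eps
        mid = 0.5 * (lo + hi)
        if violated(mid):
            lo = mid
        else:
            hi = mid
    return hi

def min_distance_estimate(sample, G, alpha, grid_size=401):
    """Minimum rho_alpha-distance location estimate theta_hat_n (crude grid search)."""
    span = np.max(sample) - np.min(sample)
    grid = np.linspace(np.min(sample) - 0.5 * span,
                       np.max(sample) + 0.5 * span, grid_size)
    dists = [rho_alpha(sample, lambda t, th=th: G(t - th), alpha) for th in grid]
    return grid[int(np.argmin(dists))]

rng = np.random.default_rng(0)
data = rng.standard_normal(200) + 1.5   # location model with true theta = 1.5
for a in (0.0, 1.0):                    # Kolmogorov (alpha = 0) and Levy (alpha = 1) cases
    print(a, min_distance_estimate(data, norm.cdf, alpha=a))
```

A grid search is used for the outer minimisation because the distance is a non-smooth function of the location parameter; any more refined optimiser would be an implementation detail, not part of the method described in the abstract.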