Least tail‐trimmed squares for infinite variance autoregressions
Author(s) -
Jonathan B. Hill
Publication year - 2013
Publication title -
Journal of Time Series Analysis
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.576
H-Index - 54
eISSN - 1467-9892
pISSN - 0143-9782
DOI - 10.1111/jtsa.12005
Subject(s) - mathematics , statistics , estimator , rate of convergence , trimming , robustness , asymptotic distribution , covariance
We develop a robust least squares estimator for autoregressions with possibly heavy-tailed errors. Robustness to heavy tails is ensured by negligibly trimming the squared error according to extreme values of the error and regressors. Tail-trimming ensures asymptotic normality and super-√n convergence, with a rate comparable to the highest achieved amongst M-estimators for stationary data. Moreover, tail-trimming ensures robustness to heavy tails in both small and large samples. By comparison, existing robust estimators are not as robust in small samples, have a slower rate of convergence when the variance is infinite, or are not asymptotically normal. We present a consistent estimator of the covariance matrix and treat classical inference without knowledge of the rate of convergence. A simulation study demonstrates the sharpness and approximate normality of the estimator, and we apply the estimator to financial returns data. Finally, tail-trimming can be easily extended beyond least squares estimation for a linear stationary AR model. We discuss extensions to quasi-maximum likelihood for GARCH, weighted least squares for a possibly non-stationary random coefficient autoregression, and empirical likelihood for robust confidence region estimation, in each case for models with possibly heavy-tailed errors.
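To make the trimming idea concrete, here is a minimal sketch in Python, not the paper's estimator: for an AR(1), observations whose residual or regressor falls among the k most extreme in absolute value are dropped before least squares is refit. The fixed 5% trimming fraction, the iteration count, and the function name `tail_trimmed_ls_ar1` are illustrative assumptions; the paper requires negligible trimming, with the trimmed fraction shrinking as the sample grows.

```python
import numpy as np

def tail_trimmed_ls_ar1(y, k_frac=0.05, n_iter=10):
    """Sketch of tail-trimmed least squares for an AR(1).

    Illustrative only: drops the k most extreme residuals and regressors
    (k = k_frac * n, a hypothetical fixed fraction) and refits OLS.
    """
    x, yy = y[:-1], y[1:]          # regressor and response
    n = len(yy)
    k = max(1, int(k_frac * n))
    rho = np.sum(x * yy) / np.sum(x * x)   # initial plain OLS estimate
    for _ in range(n_iter):
        resid = yy - rho * x
        # thresholds: k-th largest absolute residual and regressor
        err_cut = np.sort(np.abs(resid))[-k]
        reg_cut = np.sort(np.abs(x))[-k]
        # keep observations that are extreme in neither error nor regressor
        keep = (np.abs(resid) < err_cut) & (np.abs(x) < reg_cut)
        rho = np.sum(x[keep] * yy[keep]) / np.sum(x[keep] ** 2)
    return rho

# Heavy-tailed example: Student-t errors with 1.5 degrees of freedom
# (infinite variance), true autoregressive coefficient 0.5.
rng = np.random.default_rng(0)
n = 5000
e = rng.standard_t(df=1.5, size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + e[t]
rho_hat = tail_trimmed_ls_ar1(y)
```

Even with infinite-variance errors, the trimmed refit typically lands near the true coefficient; the paper's contribution is showing that a negligibly trimmed version is asymptotically normal with an explicit, fast rate.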
