Asymptotic Theory of Outlier Detection Algorithms for Linear Time Series Regression Models
Author(s) - Søren Johansen, Bent Nielsen
Publication year - 2016
Publication title - Scandinavian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/sjos.12174
Subject(s) - outlier, mathematics, anomaly detection, estimator, robust regression, asymptotic analysis, algorithm, robust statistics, statistics, Poisson distribution, time series
Outlier detection algorithms are intimately connected with robust statistics that down‐weight some observations to zero. We define a number of outlier detection algorithms related to the Huber‐skip and least trimmed squares estimators, including the one‐step Huber‐skip estimator and the forward search. Next, we review the recently developed asymptotic theory of these algorithms. Finally, we analyse the gauge, the fraction of wrongly detected outliers, for a number of outlier detection algorithms and establish both an asymptotic normal theory and a Poisson theory for the gauge.
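
The one‐step Huber‐skip estimator and the gauge lend themselves to a compact illustration. Below is a minimal sketch, not the authors' code: it assumes a Python/NumPy setting, a normalised‐MAD scale estimate, and a fixed normal cut‐off c, all of which are illustrative choices. Under the null of no outliers, every flagged observation is wrongly detected, so the flagged fraction in a clean simulated regression gives an empirical gauge that should sit near the nominal level implied by the cut‐off.

```python
# Minimal sketch of a one-step Huber-skip step and an empirical gauge.
# The function name, scale estimate, and cut-off c are illustrative
# assumptions, not the paper's exact specification.
import numpy as np

rng = np.random.default_rng(0)

def one_step_huber_skip(X, y, beta_init, c=2.576):
    """One Huber-skip iteration: flag observations whose absolute
    residual exceeds c times a robust scale estimate (zero weight),
    then re-fit OLS on the retained observations."""
    resid = y - X @ beta_init
    sigma = np.median(np.abs(resid)) / 0.6745   # normalised MAD scale
    keep = np.abs(resid) <= c * sigma           # "skip": drop flagged points
    beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
    return beta, keep

# Clean simulated regression: with no true outliers, every flagged
# observation is a false detection, so the flagged share is the gauge.
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

beta0 = np.linalg.lstsq(X, y, rcond=None)[0]    # full-sample OLS initial estimator
beta1, keep = one_step_huber_skip(X, y, beta0)

gauge = 1.0 - keep.mean()
print(f"empirical gauge: {gauge:.4f} (nominal ~ 0.01 for c = 2.576)")
```

With c set to the 0.995 normal quantile, about 1% of clean observations fall outside the cut‐off, so the empirical gauge fluctuates around 0.01; the paper's asymptotic normal and Poisson results describe exactly this fluctuation as n grows.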
