Open Access
STAD: Self-supervised Transformer for Anomaly Detection in Multi-Variate Time Series Data
Author(s) -
Saba Arshad,
Minho Ha,
Tae-Hyoung Park
Publication year - 2025
Publication title -
IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3616597
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
Anomaly detection in multivariate time series has gained significant attention in the past few years. The rarity of anomalies, considerable data volatility, the absence of anomaly labels, and the need for real-time inference in modern applications make it a challenging problem. A robust training mechanism is needed that can accurately detect anomalous points in datasets dominated by normal data. Moreover, multivariate data recorded by multiple sensors deviates significantly from normal values when a defect occurs, so it is crucial to detect which specific sensor’s data are abnormal for system analysis and correction. To address these issues, we present STAD, a self-supervised transformer for anomaly detection that enables robust learning from unlabeled data by synthesizing pseudo-anomalies during training. The research contributions are manifold. First, a synthetic anomaly injection method is developed for anomaly generation and self-supervised training on critical anomalous behaviors. Second, a multi-head context attention module is designed and embedded in the Transformer architecture to map local and global associations, effectively distinguishing rare anomalies. Lastly, we propose an attentive class activation token mapping mechanism that facilitates anomaly attribution by tracing the influence of individual sensor inputs through the Transformer model, offering transparent and interpretable insights. Extensive experiments on eight real-world datasets demonstrate that STAD consistently outperforms state-of-the-art unsupervised and semi-supervised approaches, achieving superior F1 scores and interpretability while maintaining computational efficiency suitable for real-time deployment.
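The abstract does not specify how the synthetic anomaly injection works internally. As a rough illustration of the general idea (not the authors' exact scheme), the sketch below injects scaled point spikes into random sensors of an unlabeled multivariate series and emits the corresponding pseudo-labels, which a self-supervised detector could then train against; the function name and parameters are hypothetical.

```python
import numpy as np

def inject_anomalies(x, n_anomalies=3, spike_scale=5.0, rng=None):
    """Generic sketch of pseudo-anomaly injection (NOT the STAD paper's
    exact method): add large point spikes to random (time, sensor) cells
    of `x`, shape (time, sensors), and return (corrupted, point labels)."""
    rng = np.random.default_rng(rng)
    x = x.copy()
    labels = np.zeros(len(x), dtype=int)
    std = x.std(axis=0) + 1e-8              # per-sensor scale for the spike
    for _ in range(n_anomalies):
        t = rng.integers(len(x))            # anomalous time step
        s = rng.integers(x.shape[1])        # affected sensor
        x[t, s] += spike_scale * std[s] * rng.choice([-1.0, 1.0])
        labels[t] = 1                       # pseudo-label for training
    return x, labels

# Usage: corrupt a toy 2-sensor series; (x_aug, y) would feed the trainer.
clean = np.random.default_rng(0).normal(size=(200, 2))
x_aug, y = inject_anomalies(clean, n_anomalies=5, rng=1)
```

Real injection schemes typically mix several anomaly types (spikes, level shifts, trend changes) rather than spikes alone; this sketch shows only the simplest case.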
