Robust Time Series Contrastive Representation Learning Via Explicit Seasonal-Trend Disentanglement
Author(s) - Fuyu Li, Bo Jin
Publication year - 2025
Publication title - IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3621317
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
We address universal self-supervised representation learning for time series under label scarcity and distribution shifts. A key challenge is that prevalent encode-then-decompose pipelines only implicitly separate seasonal and trend signals, which mixes noise across components and degrades robustness, especially with sudden jumps or stochastic movements. In this work, we propose an explicit, decomposition-aware contrastive framework: original time series are split via a Fourier transform into seasonal (high-frequency) and trend (low-frequency) components. Seasonality is modeled by a frequency-enhanced attention encoder with amplitude-induced augmentations; trend is captured by a tree-structured causal convolution encoder with random perturbations to handle non-stationary drift. The two views are trained contrastively and then fused with lightweight fine-tuning, delivering state-of-the-art accuracy, improved robustness to jumps, and interpretable attributions.
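The abstract describes splitting each series with a Fourier transform into a low-frequency trend and a high-frequency seasonal component before encoding each view separately. The following is a minimal sketch of that explicit frequency-domain split; the cutoff hyperparameter, function name, and example series are illustrative assumptions and not taken from the paper.

```python
# Hedged sketch of an explicit Fourier-based seasonal-trend split.
# `fourier_decompose` and `cutoff` are assumed names/settings for illustration.
import numpy as np

def fourier_decompose(x: np.ndarray, cutoff: int = 5):
    """Split a 1-D series into trend (low-frequency) and seasonal
    (high-frequency) parts by masking rFFT coefficients.

    x      : input time series, shape (T,)
    cutoff : number of lowest-frequency bins (including DC) assigned
             to the trend; an assumed hyperparameter.
    """
    spec = np.fft.rfft(x)

    low = np.zeros_like(spec)
    low[:cutoff] = spec[:cutoff]   # low-frequency bins -> trend
    high = spec - low              # remaining bins     -> seasonal

    trend = np.fft.irfft(low, n=len(x))
    seasonal = np.fft.irfft(high, n=len(x))
    return seasonal, trend

# Example: slow drift plus a fast oscillation with noise.
t = np.arange(256)
x = 0.02 * t + np.sin(2 * np.pi * t / 12) + 0.1 * np.random.randn(256)
seasonal, trend = fourier_decompose(x, cutoff=5)
assert np.allclose(seasonal + trend, x, atol=1e-8)  # the split is lossless
```

Because the two masks partition the spectrum, the components sum back to the original series exactly; each component would then be fed to its own encoder (frequency-enhanced attention for the seasonal view, tree-structured causal convolution for the trend view) in the framework described above.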