Traffic transformer: Capturing the continuity and periodicity of time series for traffic forecasting
Author(s) - Ling Cai, Krzysztof Janowicz, Gengchen Mai, Bo Yan, Rui Zhu
Publication year - 2020
Publication title - Transactions in GIS
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.721
H-Index - 63
eISSN - 1467-9671
pISSN - 1361-1682
DOI - 10.1111/tgis.12644
Subject(s) - computer science, recurrent neural network, leverage (statistics), artificial intelligence, deep learning, convolutional neural network, dependency (uml), transformer, machine learning, feature learning, time series, margin (machine learning), long short term memory, graph, sequence learning, artificial neural network, theoretical computer science, engineering, voltage, electrical engineering
Traffic forecasting is a challenging problem due to the complexity of jointly modeling spatio‐temporal dependencies at different scales. Recently, several hybrid deep learning models have been developed to capture such dependencies. These approaches typically utilize convolutional neural networks or graph neural networks (GNNs) to model spatial dependency and leverage recurrent neural networks (RNNs) to learn temporal dependency. However, RNNs are only able to capture sequential information in the time series, while being incapable of modeling their periodicity (e.g., weekly patterns). Moreover, RNNs are difficult to parallelize, making training and prediction less efficient. In this work we propose a novel deep learning architecture called Traffic Transformer to capture the continuity and periodicity of time series and to model spatial dependency. Our work takes inspiration from Google’s Transformer framework for machine translation. We conduct extensive experiments on two real‐world traffic data sets, and the results demonstrate that our model outperforms baseline models by a substantial margin.
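The abstract's core argument is that self-attention, unlike an RNN's step-by-step recurrence, relates any two time steps directly (so computation parallelizes) and, with a suitable positional encoding, can align observations one period apart (e.g., the same hour a week earlier). The sketch below is a minimal illustration of that idea only, not the authors' implementation: the weekly period of 168 hours, the single attention head, and all function names are assumptions made for this example.

```python
# Illustrative sketch (assumptions, not the paper's architecture):
# self-attention over a traffic series with a period-aware positional encoding.
import numpy as np

def positional_encoding(seq_len, d_model, period):
    """Sinusoidal encoding with wavelengths spread up to roughly one
    'period', so steps a full period apart receive similar phases."""
    pos = np.arange(seq_len)[:, None]             # (seq_len, 1) time indices
    i = np.arange(d_model // 2)[None, :]          # (1, d_model/2) channels
    angles = pos / (period ** (2 * i / d_model))  # geometric frequency ladder
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

def self_attention(x):
    """Single-head scaled dot-product self-attention. Every time step
    attends to every other step in one matrix product, which is why it
    parallelizes where an RNN's sequential recurrence cannot."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                  # (T, T) pairwise scores
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ x                             # context vector per step

# Toy usage: two weeks of hourly readings from one sensor (168 hours/week).
T, d_model, week = 336, 16, 168
series = np.random.randn(T, 1)                     # raw traffic speeds, (T, 1)
x = np.tile(series, (1, d_model)) + positional_encoding(T, d_model, week)
out = self_attention(x)                            # (T, d_model) features
print(out.shape)                                   # -> (336, 16)
```

In this toy setup the encoding gives hours that sit one week apart similar positional phases, so their attention scores rise and the model can mix information across weekly repetitions as well as adjacent hours, which is the continuity-plus-periodicity behavior the abstract describes.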
