Open Access
Spatial‐temporal attention wavenet: A deep learning framework for traffic prediction considering spatial‐temporal dependencies
Author(s) - Tian Chenyu, Chan Wai Kin Victor
Publication year - 2021
Publication title - IET Intelligent Transport Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.579
H-Index - 45
eISSN - 1751-9578
pISSN - 1751-956X
DOI - 10.1049/itr2.12044
Subject(s) - computer science , dependency (uml) , graph , data mining , artificial intelligence , deep learning , embedding , task (project management) , machine learning , theoretical computer science , engineering , systems engineering
Traffic prediction on road networks is a crucial task for intelligent transport system applications, but it is highly challenging due to the complexity of traffic systems. Existing approaches mostly capture static spatial dependencies and rely on prior knowledge of the graph structure. However, spatial dependencies can be dynamic, and the physical road structure may not reflect the genuine relationships between roads. To better capture complex spatial‐temporal dependencies and forecast traffic conditions on road networks, a multi‐step prediction model named Spatial‐Temporal Attention Wavenet (STAWnet) is proposed. Temporal convolution is applied to handle long time sequences, while the dynamic spatial dependencies between nodes are captured by a self‐attention network. Unlike existing models, STAWnet requires no prior knowledge of the graph: it instead learns a node embedding directly from the data. These components are integrated into an end‐to‐end framework. Experimental results on three public traffic prediction datasets (METR‐LA, PEMS‐BAY, and PEMS07) demonstrate the effectiveness of the model. In particular, for 1 h ahead prediction, STAWnet outperforms state‐of‐the‐art methods without any prior knowledge of the network.
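The abstract describes three building blocks: a temporal convolution for long sequences, node-level self-attention for dynamic spatial dependencies, and a self-learned node embedding in place of a predefined graph. The sketch below is a minimal PyTorch illustration of how such a spatial‐temporal block could be assembled; the class names, tensor layout, and hyperparameters (embedding size, kernel size) are assumptions for illustration only, not the authors' implementation.

```python
# Hedged sketch of an STAWnet-style block (assumption: not the published code).
# It combines (1) a gated dilated temporal convolution (WaveNet-style),
# (2) self-attention over nodes for dynamic spatial dependencies, and
# (3) a self-learned node embedding, so no adjacency matrix is needed.
import torch
import torch.nn as nn


class GatedTemporalConv(nn.Module):
    """Dilated convolution along the time axis with a gating mechanism."""
    def __init__(self, channels, kernel_size=2, dilation=1):
        super().__init__()
        self.filter = nn.Conv2d(channels, channels, (1, kernel_size), dilation=(1, dilation))
        self.gate = nn.Conv2d(channels, channels, (1, kernel_size), dilation=(1, dilation))

    def forward(self, x):           # x: (batch, channels, nodes, time)
        return torch.tanh(self.filter(x)) * torch.sigmoid(self.gate(x))


class SpatialSelfAttention(nn.Module):
    """Attention over nodes; queries/keys mix features with a learned node embedding."""
    def __init__(self, channels, num_nodes, embed_dim=16):
        super().__init__()
        # Self-learned node embedding replaces prior knowledge of the graph.
        self.node_embed = nn.Parameter(torch.randn(num_nodes, embed_dim))
        self.q = nn.Linear(channels + embed_dim, channels)
        self.k = nn.Linear(channels + embed_dim, channels)
        self.v = nn.Linear(channels, channels)

    def forward(self, x):           # x: (batch, channels, nodes, time)
        b, c, n, t = x.shape
        h = x.permute(0, 3, 2, 1)   # (batch, time, nodes, channels)
        e = self.node_embed.expand(b, t, n, -1)
        q = self.q(torch.cat([h, e], dim=-1))
        k = self.k(torch.cat([h, e], dim=-1))
        v = self.v(h)
        attn = torch.softmax(q @ k.transpose(-1, -2) / c ** 0.5, dim=-1)  # (b, t, n, n)
        out = attn @ v              # dynamic spatial aggregation per time step
        return out.permute(0, 3, 2, 1)  # back to (batch, channels, nodes, time)


class STAWBlock(nn.Module):
    """One spatial-temporal block: gated temporal conv followed by node attention."""
    def __init__(self, channels, num_nodes, dilation=1):
        super().__init__()
        self.tcn = GatedTemporalConv(channels, dilation=dilation)
        self.attn = SpatialSelfAttention(channels, num_nodes)

    def forward(self, x):
        x = self.tcn(x)
        return x + self.attn(x)     # residual connection


if __name__ == "__main__":
    # Toy shapes: 8 samples, 32 channels, 207 sensors (METR-LA size), 12 time steps.
    x = torch.randn(8, 32, 207, 12)
    block = STAWBlock(channels=32, num_nodes=207, dilation=1)
    print(block(x).shape)           # (8, 32, 207, 11) after the kernel-2 convolution
```

In a full model, several such blocks with increasing dilation would be stacked so the receptive field covers the whole input window, followed by an output layer producing the multi-step forecast; those details are omitted here.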
