Open Access
Social graph convolutional LSTM for pedestrian trajectory prediction
Author(s) - Zhou Yutao, Wu Huayi, Cheng Hongquan, Qi Kunlun, Hu Kai, Kang Chaogui, Zheng Jie
Publication year - 2021
Publication title - IET Intelligent Transport Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.579
H-Index - 45
eISSN - 1751-9578
pISSN - 1751-956X
DOI - 10.1049/itr2.12033
Subject(s) - pedestrian, computer science, graph, pairwise comparison, artificial intelligence, trajectory, convolutional neural network, machine learning, theoretical computer science, engineering, transport engineering, physics, astronomy
Understanding the movement of pedestrians and predicting their future trajectories is important in intelligent transportation systems, because accurate pedestrian trajectory prediction improves the level of autonomous driving technology and reduces traffic accidents. The authors address this problem with a social graph convolutional long short‐term memory neural network architecture that considers the movement information of each pedestrian and its interactions with neighbours. Specifically, the authors use a graph to model the pedestrian walking state, where nodes denote pedestrian movement information and edges represent interactions between pairwise pedestrians. An end‐to‐end architecture combining a sequence‐to‐sequence model with a graph convolutional network is used to learn movement features and interaction features. To capture how interactions influence different pedestrians, an emotion gate is introduced to refine the learned features and filter out useless information. A companion loss function is further proposed to improve the network's ability to capture ‘walking in groups’ behaviour. Through experiments on two public datasets (ETH and UCY), the authors show that their method outperforms previous methods.
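The abstract describes a recurrent model in which each pedestrian is a graph node whose features are mixed with its neighbours' features by a graph convolution before a per-pedestrian LSTM update. The following is a minimal PyTorch sketch of that idea, not the authors' released code: the class name, layer sizes, inverse-distance adjacency, and the symmetric GCN normalisation are all assumptions, and the emotion gate and companion loss from the paper are omitted.

```python
# Minimal sketch of a social graph convolutional LSTM step (assumed design,
# not the paper's implementation). Node features are pedestrian movement
# states; the adjacency matrix encodes pairwise interaction weights.
import torch
import torch.nn as nn


class SocialGraphConvLSTM(nn.Module):
    def __init__(self, in_dim=2, embed_dim=32, hidden_dim=64, out_dim=2):
        super().__init__()
        self.embed = nn.Linear(in_dim, embed_dim)       # embed (x, y) offsets
        self.gc = nn.Linear(embed_dim, embed_dim)       # graph-convolution weights
        self.lstm = nn.LSTMCell(embed_dim, hidden_dim)  # per-pedestrian recurrence
        self.decode = nn.Linear(hidden_dim, out_dim)    # predict next-step offset

    def forward(self, traj, adj):
        # traj: (T, N, 2) observed offsets of N pedestrians over T time steps
        # adj:  (N, N) interaction weights between pairwise pedestrians
        T, N, _ = traj.shape
        # Symmetric normalisation D^{-1/2} (A + I) D^{-1/2}, a standard GCN
        # choice and an assumption here.
        a_hat = adj + torch.eye(N)
        d_inv_sqrt = a_hat.sum(-1).clamp(min=1e-6).pow(-0.5)
        a_norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)

        h = torch.zeros(N, self.lstm.hidden_size)
        c = torch.zeros(N, self.lstm.hidden_size)
        for t in range(T):
            x = torch.relu(self.embed(traj[t]))   # movement features per node
            x = torch.relu(self.gc(a_norm @ x))   # mix neighbour (interaction) features
            h, c = self.lstm(x, (h, c))           # update each pedestrian's hidden state
        return self.decode(h)                     # (N, 2) predicted next offsets


# Toy usage: 8 observed steps for 3 pedestrians, fully connected interactions.
model = SocialGraphConvLSTM()
obs = torch.randn(8, 3, 2)
adj = torch.ones(3, 3)
pred = model(obs, adj)  # shape (3, 2)
```

In the paper, this prediction stage sits inside a sequence-to-sequence setup and the learned features are further refined by the emotion gate and trained with the companion loss; this sketch only illustrates the graph-convolution-plus-LSTM core.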
