Encoding Crowd Interaction with Deep Neural Network for Pedestrian Trajectory Prediction

Cited by: 172
|
Authors
Xu, Yanyu [1]
Piao, Zhixin [1]
Gao, Shenghua [1]
Affiliations
[1] ShanghaiTech Univ, Shanghai, Peoples R China
Keywords
BEHAVIORS; MODEL;
DOI
10.1109/CVPR.2018.00553
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pedestrian trajectory prediction is a challenging task because of the complex nature of humans. In this paper, we tackle the problem within a deep learning framework by considering motion information of each pedestrian and its interaction with the crowd. Specifically, motivated by the residual learning in deep learning, we propose to predict displacement between neighboring frames for each pedestrian sequentially. To predict such displacement, we design a crowd interaction deep neural network (CIDNN) which considers the different importance of different pedestrians for the displacement prediction of a target pedestrian. Specifically, we use an LSTM to model motion information for all pedestrians and use a multi-layer perceptron to map the location of each pedestrian to a high dimensional feature space where the inner product between features is used as a measurement for the spatial affinity between two pedestrians. Then we weight the motion features of all pedestrians based on their spatial affinity to the target pedestrian for location displacement prediction. Extensive experiments on publicly available datasets validate the effectiveness of our method for trajectory prediction.
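The core computation the abstract describes — embed each pedestrian's location with a multi-layer perceptron, use inner products between embeddings as spatial affinity, and combine the crowd's motion features by that affinity to predict a per-frame displacement — can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the LSTM motion features are replaced by random placeholders, the weight matrices are untrained, and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_embed(locations, W1, W2):
    # Map each pedestrian's (x, y) location into a high-dimensional
    # feature space with a small two-layer perceptron (ReLU hidden layer).
    h = np.maximum(locations @ W1, 0.0)
    return h @ W2

def predict_displacements(locations, motion_feats, W1, W2, W_out):
    """Attention-weighted displacement prediction in the spirit of CIDNN.

    locations:    (N, 2)  current positions of N pedestrians
    motion_feats: (N, D)  per-pedestrian motion features (produced by an
                          LSTM in the paper; random placeholders here)
    Returns (N, 2) predicted displacements to the next frame.
    """
    emb = mlp_embed(locations, W1, W2)              # (N, E) location embeddings
    affinity = emb @ emb.T                          # inner products = spatial affinity
    weights = np.exp(affinity - affinity.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over the crowd, per target
    context = weights @ motion_feats                # crowd-weighted motion features
    return context @ W_out                          # linear map to (dx, dy)

# Toy dimensions: 5 pedestrians, 8-d motion features, 16-d embedding.
N, D, E = 5, 8, 16
locs = rng.normal(size=(N, 2))
motion = rng.normal(size=(N, D))
W1 = rng.normal(size=(2, 32)) * 0.1
W2 = rng.normal(size=(32, E)) * 0.1
W_out = rng.normal(size=(D, 2)) * 0.1

disp = predict_displacements(locs, motion, W1, W2, W_out)
next_locs = locs + disp  # residual-style update: predict displacement, not absolute position
print(disp.shape)        # prints (5, 2)
```

Note the residual-style update at the end: following the abstract, the network predicts the displacement between neighboring frames rather than the next absolute position, and the new location is obtained by adding that displacement to the current one.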
Pages: 5275-5284
Page count: 10
Related Papers
50 records in total
  • [1] CIRAN: extracting crowd interaction with residual attention network for pedestrian trajectory prediction
    Liu, Shang; Chen, Xiaoyu; Chen, Hao
    International Journal of Machine Learning and Cybernetics, 2022, 13 (09): 2649-2662
  • [2] A deep neural network approach for pedestrian trajectory prediction considering flow heterogeneity
    Esfahani, Hossein Nasr; Song, Ziqi; Christensen, Keith
    Transportmetrica A: Transport Science, 2023, 19 (01)
  • [3] StarNet: Pedestrian Trajectory Prediction using Deep Neural Network in Star Topology
    Zhu, Yanliang; Qian, Deheng; Ren, Dongchun; Xia, Huaxia
    2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019: 8075-8080
  • [4] Pedestrian Trajectory Prediction Based on Deep Convolutional LSTM Network
    Song, Xiao; Chen, Kai; Li, Xu; Sun, Jinghan; Hou, Baocun; Cui, Yong; Zhang, Baochang; Xiong, Gang; Wang, Zilie
    IEEE Transactions on Intelligent Transportation Systems, 2021, 22 (06): 3285-3302
  • [5] Graph Partition Convolution Neural Network for Pedestrian Trajectory Prediction
    Wang, Ruiyang; Li, Ming; Zhang, Pin; Wen, Fan
    2021 IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI 2021), 2021: 457-462
  • [6] Pedestrian Trajectory Prediction via Spatial Interaction Transformer Network
    Su, Tong; Meng, Yu; Xu, Yan
    2021 IEEE Intelligent Vehicles Symposium Workshops (IV Workshops), 2021: 154-159
  • [7] Probabilistic Crowd GAN: Multimodal Pedestrian Trajectory Prediction Using a Graph Vehicle-Pedestrian Attention Network
    Eiffert, Stuart; Li, Kunming; Shan, Mao; Worrall, Stewart; Sukkarieh, Salah; Nebot, Eduardo
    IEEE Robotics and Automation Letters, 2020, 5 (04): 5026-5033
  • [8] Fast trajectory extraction and pedestrian dynamics analysis using deep neural network
    Yi, Ruolong; Du, Mingyu; Song, Weiguo; Zhang, Jun
    Physica A: Statistical Mechanics and its Applications, 2024, 638
  • [9] Spatial-Temporal Dual Graph Neural Network for Pedestrian Trajectory Prediction
    Zou, Yuming; Piao, Xinglin; Zhang, Yong; Hu, Yongli
    39th Youth Academic Annual Conference of Chinese Association of Automation (YAC 2024), 2024: 1212-1217