STAGP: Spatio-Temporal Adaptive Graph Pooling Network for Pedestrian Trajectory Prediction

Cited by: 4
Authors
Liu, Zhening [1 ]
He, Li [1 ]
Yuan, Liang [2 ]
Lv, Kai [1 ]
Zhong, Runhao [1 ]
Chen, Yaohua [1 ]
Affiliations
[1] Xinjiang Univ, Intelligent Mfg Modern Ind Coll, Sch Mech Engn, Urumqi 830017, Peoples R China
[2] Beijing Univ Chem Technol, Sch Informat Sci & Technol, Beijing 100029, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Pedestrians; Trajectory; Adaptation models; Feature extraction; Video surveillance; Transformers; Stars; Deep learning methods; human and humanoid motion analysis and synthesis; social HRI; ATTENTION; MODEL; LSTM;
DOI
10.1109/LRA.2023.3346806
CLC number
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
Predicting how pedestrians will move in the future is crucial for robot navigation, autonomous driving, and video surveillance, yet the complex interactions among pedestrians make their future trajectories difficult to predict. Previous studies have primarily focused on modeling the interaction features of all pedestrians in the scene; however, this approach often introduces many irrelevant interactions and overlooks time-dependent characteristics. To address these issues, we propose a spatio-temporal adaptive graph pooling network (STAGP) for pedestrian trajectory prediction. STAGP adopts adaptive graph pooling to explicitly model interactions between pedestrians, removing redundant interactions and establishing directed interactions. In addition, we utilize spatio-temporal attention to extract the temporal features of pedestrian interactions. To predict future trajectories, we use a time-extrapolator convolutional neural network (TXP-CNN). STAGP was evaluated on the ETH and UCY datasets; compared with state-of-the-art methods, the experimental results indicate that it is competitive in terms of the ADE and FDE metrics.
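The adaptive graph pooling idea described in the abstract — scoring pairwise pedestrian interactions and keeping only the strongest ones, which naturally yields a directed (asymmetric) interaction graph — can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function name, the bilinear scoring form `X W Xᵀ`, and the top-k pruning rule are all assumptions for exposition.

```python
import numpy as np

def adaptive_graph_pooling(X, W, keep_ratio=0.5):
    """Hypothetical sketch of adaptive graph pooling.

    X: (N, d) array of per-pedestrian features (e.g. embedded positions).
    W: (d, d) learnable scoring matrix (here passed in fixed).
    Returns a pruned (N, N) adjacency matrix: each pedestrian keeps
    only its k strongest interactions, so A is generally asymmetric,
    i.e. the interaction graph is directed.
    """
    N = X.shape[0]
    scores = X @ W @ X.T                 # (N, N) pairwise interaction scores
    np.fill_diagonal(scores, -np.inf)    # exclude self-interactions
    k = max(1, int(keep_ratio * (N - 1)))
    A = np.zeros((N, N))
    for i in range(N):
        strongest = np.argsort(scores[i])[-k:]  # k highest-scoring neighbours of i
        A[i, strongest] = 1.0            # keep edge i -> j, drop the rest
    return A

# Toy usage: four pedestrians with 2-D features, identity scoring matrix.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
A = adaptive_graph_pooling(X, np.eye(2), keep_ratio=0.5)
```

Because each row independently selects its own strongest neighbours, pedestrian i may attend to j while j ignores i, which is what makes the pruned interactions directed rather than symmetric.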
Pages: 2001 - 2007 (7 pages)
Related papers (50 total)
  • [41] A spatio-temporal grammar graph attention network with adaptive edge information for traffic flow prediction
    Zhang, Zhao
    Jiao, Xiaohong
    APPLIED INTELLIGENCE, 2023, 53: 28787 - 28803
  • [42] Flow prediction via adaptive dynamic graph with spatio-temporal correlations
    Zhang, Hui
    Ding, Kun
    Xie, Jietao
    Xiao, Weidong
    Xie, Yuxiang
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 261
  • [43] Research on traffic flow prediction based on adaptive spatio-temporal perceptual graph neural network for traffic prediction
    Liang, Qian
    Yin, Xiang
    Xia, Chengliang
    Chen, Ye
    ACM INTERNATIONAL CONFERENCE PROCEEDING SERIES: 1101 - 1105
  • [44] Traffic Prediction Model Based on Spatio-temporal Graph Attention Network
    Chen, Jing
    Wang, Linkai
    Wang, Wei
    Song, Ruizhuo
    2022 4TH INTERNATIONAL CONFERENCE ON CONTROL AND ROBOTICS, ICCR, 2022, : 428 - 432
  • [45] A Spatio-Temporal Graph Neural Network Approach for Traffic Flow Prediction
    Li, Yanbing
    Zhao, Wei
    Fan, Huilong
    MATHEMATICS, 2022, 10 (10)
  • [46] Deep spatio-temporal graph convolutional network for traffic accident prediction
    Yu, Le
    Du, Bowen
    Hu, Xiao
    Sun, Leilei
    Han, Liangzhe
    Lv, Weifeng
    NEUROCOMPUTING, 2021, 423: 135 - 147
  • [47] A spatio-temporal graph neural network for fall prediction with inertial sensors
    Wang, Shu
    Li, Xiaohu
    Liao, Guorui
    Liu, Jiawei
    Liao, Changbo
    Liu, Ming
    Liao, Jun
    Liu, Li
    KNOWLEDGE-BASED SYSTEMS, 2024, 293
  • [48] Spatial-Temporal Dual Graph Neural Network for Pedestrian Trajectory Prediction
    Zou, Yuming
    Piao, Xinglin
    Zhang, Yong
    Hu, Yongli
    39TH YOUTH ACADEMIC ANNUAL CONFERENCE OF CHINESE ASSOCIATION OF AUTOMATION, YAC 2024, 2024, : 1212 - 1217
  • [49] Trajectory Parsing by Cluster Sampling in Spatio-temporal Graph
    Liu, Xiaobai
    Lin, Liang
    Zhu, Song-Chun
    Jin, Hai
    CVPR: 2009 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, VOLS 1-4, 2009, : 739 - +
  • [50] Sparse Transformer Network With Spatial-Temporal Graph for Pedestrian Trajectory Prediction
    Gao, Long
    Gu, Xiang
    Chen, Feng
    Wang, Jin
    IEEE ACCESS, 2024, 12 : 144725 - 144737