ST-GAT: A Spatio-Temporal Graph Attention Network for Accurate Traffic Speed Prediction

Cited by: 7
Authors
Song, Junho [1 ]
Son, Jiwon [1 ]
Seo, Dong-hyuk [1 ]
Han, Kyungsik [1 ]
Kim, Namhyuk [2 ]
Kim, Sang-Wook [1 ]
Affiliations
[1] Hanyang Univ, Seoul, South Korea
[2] Hyundai Motor Co, Seoul, South Korea
Funding
National Research Foundation, Singapore;
Keywords
Traffic speed prediction; Attention network; Spatio-temporal data;
DOI
10.1145/3511808.3557705
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline Classification Code
0812
Abstract
Spatio-temporal models that combine GNNs (graph neural networks) and RNNs (recurrent neural networks) have shown state-of-the-art accuracy in traffic speed prediction. However, we find that they consider the spatial and temporal dependencies between speeds separately in the two dimensions (i.e., space and time) and are therefore unable to exploit the joint dependencies of speeds in space and time. In this paper, with evidence from a preliminary analysis, we point out the importance of considering the individual dependency between every pair of speeds from all possible points in space and time for accurate traffic speed prediction. We then propose an Individual Spatio-Temporal graph (IST-graph), which represents the Individual Spatio-Temporal dependencies (IST-dependencies) very effectively, and a Spatio-Temporal Graph ATtention network (ST-GAT), a novel model that predicts future traffic speeds based on the IST-graph and the attention mechanism. The results of our extensive evaluation on five real-world datasets demonstrate (1) the effectiveness of the IST-graph in modeling traffic speed data, (2) the superiority of ST-GAT over five state-of-the-art models (i.e., 2-33% gains in prediction accuracy), and (3) the robustness of ST-GAT even in abnormal traffic situations.
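The record contains no code, so purely as an illustration of the idea the abstract describes, the following is a minimal NumPy sketch of single-head graph attention applied to a toy graph whose nodes are (road segment, time step) pairs, so that attention can weigh any space-time point against any other. The connectivity rule, sizes, and all names here are assumptions for the sketch, not the authors' implementation.

```python
import numpy as np

def masked_softmax(x, mask):
    # Softmax restricted to connected node pairs.
    x = np.where(mask, x, -np.inf)
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def gat_layer(h, adj, W, a):
    """Single-head graph attention in the standard GAT style.
    h: (N, F) node features; adj: (N, N) boolean adjacency with self-loops;
    W: (F, F_out) projection; a: (2*F_out,) attention vector."""
    z = h @ W                                    # project node features
    f_out = z.shape[1]
    # e_ij = LeakyReLU(a^T [z_i || z_j]), computed via broadcasting
    e = (z @ a[:f_out])[:, None] + (z @ a[f_out:])[None, :]
    e = np.where(e > 0, e, 0.2 * e)              # LeakyReLU, slope 0.2
    alpha = masked_softmax(e, adj)               # attention coefficients
    return alpha @ z                             # aggregate neighbours

# Toy space-time graph: 2 road segments x 3 time steps = 6 nodes.
# Here we simply connect node pairs sharing a segment or a time step
# (plus self-loops); the paper builds its IST-graph differently.
rng = np.random.default_rng(0)
N, F = 6, 4
h = rng.normal(size=(N, F))
seg = np.array([0, 0, 0, 1, 1, 1])
t = np.array([0, 1, 2, 0, 1, 2])
adj = (seg[:, None] == seg[None, :]) | (t[:, None] == t[None, :])
W = rng.normal(size=(F, F))
a = rng.normal(size=(2 * F,))
out = gat_layer(h, adj, W, a)
print(out.shape)  # prints (6, 4)
```

Because every node is a space-time point, a single attention step can in principle weight a distant segment at an earlier time directly, rather than composing separate spatial (GNN) and temporal (RNN) passes.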
Pages: 4500 - 4504
Page count: 5
Related Papers
50 records in total
  • [1] Traffic Prediction Model Based on Spatio-temporal Graph Attention Network
    Chen, Jing
    Wang, Linkai
    Wang, Wei
    Song, Ruizhuo
    2022 4TH INTERNATIONAL CONFERENCE ON CONTROL AND ROBOTICS, ICCR, 2022, : 428 - 432
  • [2] Spatio-temporal graph attention networks for traffic prediction
    Ma, Chuang
    Yan, Li
    Xu, Guangxia
    TRANSPORTATION LETTERS-THE INTERNATIONAL JOURNAL OF TRANSPORTATION RESEARCH, 2023, 16 (09): 978 - 988
  • [3] ST-MGAT: Spatio-temporal multi-head graph attention network for Traffic prediction
    Wang, Bowen
    Wang, Jingsheng
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2022, 603
  • [4] ST-MAN: Spatio-Temporal Multimodal Attention Network for Traffic Prediction
    He, Ruozhou
    Li, Liting
    Hua, Bei
    Tong, Jianjun
    Tan, Chang
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT II, KSEM 2023, 2023, 14118 : 137 - 152
  • [5] STGATP: A Spatio-Temporal Graph Attention Network for Long-Term Traffic Prediction
    Zhu, Mengting
    Zhu, Xianqiang
    Zhu, Cheng
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT III, 2021, 12893 : 255 - 266
  • [6] A mobility aware network traffic prediction model based on dynamic graph attention spatio-temporal network
    Jin, Zilong
    Qian, Jun
    Kong, Zhixiang
    Pan, Chengsheng
    COMPUTER NETWORKS, 2023, 235
  • [7] Adaptive spatio-temporal graph convolutional network with attention mechanism for mobile edge network traffic prediction
    Sha, Ning
    Wu, Xiaochun
    Wen, Jinpeng
    Li, Jinglei
    Li, Chuanhuang
    CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2024, 27 (09): : 13257 - 13272
  • [8] Dynamic Spatio-temporal traffic flow prediction based on multi fusion graph attention network
    Cheng, Manru
    Jiang, Guo-Ping
    Song, Yurong
    Yang, Chen
    2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022, : 7285 - 7291
  • [9] Spatio-temporal causal graph attention network for traffic flow prediction in intelligent transportation systems
    Zhao, Wei
    Zhang, Shiqi
    Wang, Bei
    Zhou, Bing
    PeerJ Computer Science, 2023, 9
  • [10] Spatio-Temporal Graph Attention Convolution Network for Traffic Flow Forecasting
    Liu, Kun
    Zhu, Yifan
    Wang, Xiao
    Ji, Hongya
    Huang, Chengfei
    TRANSPORTATION RESEARCH RECORD, 2024, 2678 (09) : 136 - 149