GDFormer: A Graph Diffusing Attention based approach for Traffic Flow Prediction

Cited by: 9
Authors
Su, Jie [1 ,2 ]
Jin, Zhongfu [2 ]
Ren, Jie [3 ]
Yang, Jiandang [1 ]
Liu, Yong [1 ]
Affiliations
[1] Zhejiang Univ, Dept Control, Hangzhou 310027, Peoples R China
[2] Zhejiang Commun Investment Grp Co Ltd, Intelligent Transportat Inst, Hangzhou 310000, Peoples R China
[3] Zhejiang Univ, Huzhou Inst, Huzhou 313000, Peoples R China
Keywords
Graph neural network; Diffusion process; Attention mechanism; Traffic flow prediction
DOI
10.1016/j.patrec.2022.03.005
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose a novel traffic flow prediction approach called Graph Diffusing transFormer (GDFormer). GDFormer adopts a transformer architecture composed of an encoder sequence and a decoder sequence, both of which are built from the newly designed Graph Diffusing Attention (GDA) module and its auxiliaries. The GDA module uses query-key-value attention to learn the diffusion parameters for each diffusion step and dynamically updates the adjacency transition matrix, which reflects the dynamically changing traffic flow between the traffic monitors. To verify the effectiveness of our approach, we conduct extensive experiments on two real-world data sets. Compared with the benchmarks, our approach achieves state-of-the-art performance. Ablation experiments illustrate the effectiveness of the key components of the model. For ease of reproducibility, the code, the processed real-world data sets, and the evaluation results are available at https://github.com/dublinsky/GDFormer . (c) 2022 Elsevier B.V. All rights reserved.
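To make the mechanism described in the abstract more concrete, below is a minimal PyTorch sketch of a graph-diffusing attention layer: query-key attention produces a data-dependent transition matrix, which is blended with the static adjacency transition and applied over several diffusion steps with learned per-step weights. The class name, the blending rule, the per-step gating, and all parameter names are illustrative assumptions, not the authors' implementation; the actual GDA module is available at the repository linked above.

```python
# Minimal sketch of a graph-diffusing attention layer in the spirit of the
# GDA module described in the abstract. Names, shapes, and the exact way the
# per-step weights and the adjacency transition are combined are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphDiffusingAttentionSketch(nn.Module):
    def __init__(self, d_model: int, num_steps: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # One learnable scalar gate per diffusion step (assumed design).
        self.step_gates = nn.Parameter(torch.zeros(num_steps))
        self.num_steps = num_steps

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (batch, num_nodes, d_model) node features at one time step
        # adj: (num_nodes, num_nodes) row-normalized adjacency transition
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Query-key attention yields a data-dependent transition update.
        scores = torch.matmul(q, k.transpose(-1, -2)) / (q.size(-1) ** 0.5)
        dyn_transition = F.softmax(scores, dim=-1)            # (batch, N, N)
        # Blend the static adjacency with the attention-derived transition
        # (assumed rule for the "dynamically updated" adjacency transition).
        transition = 0.5 * adj.unsqueeze(0) + 0.5 * dyn_transition
        # Multi-step diffusion: propagate features num_steps times, weighting
        # each step by a learned, softmax-normalized gate.
        out, state = torch.zeros_like(v), v
        gates = torch.softmax(self.step_gates, dim=0)
        for step in range(self.num_steps):
            state = torch.matmul(transition, state)
            out = out + gates[step] * state
        return out
```

In a transformer-style encoder or decoder such as the one the abstract describes, a layer of this form would take the place of standard self-attention, so that each diffusion step propagates node features along the blended transition matrix.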
Pages: 126-132
Number of pages: 7
Related Papers
50 records in total (first 10 listed)
  • [1] A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention
    Tang, Cong
    Sun, Jingru
    Sun, Yichuang
    Peng, Mu
    Gan, Nianfei
    [J]. IEEE ACCESS, 2020, 8 : 153731 - 153741
  • [2] Traffic flow matrix-based graph neural network with attention mechanism for traffic flow prediction
    Chen, Jian
    Zheng, Li
    Hu, Yuzhu
    Wang, Wei
    Zhang, Hongxing
    Hu, Xiping
    [J]. INFORMATION FUSION, 2024, 104
  • [3] Road traffic flow prediction based on dynamic spatiotemporal graph attention network
    Chen, Yuguang
    Huang, Jintao
    Xu, Hongbin
    Guo, Jincheng
    Su, Linyong
    [J]. SCIENTIFIC REPORTS, 2023, 13 (01)
  • [4] Road Network Traffic Flow Prediction Method Based on Graph Attention Networks
    Wang, Junqiang
    Yang, Shuqiang
    Gao, Ya
    Wang, Jun
    Alfarraj, Osama
    [J]. JOURNAL OF CIRCUITS SYSTEMS AND COMPUTERS, 2024,
  • [5] Spatiotemporal Graph Attention Networks for Urban Traffic Flow Prediction
    Zhao, Yuanpeng
    Xu, Yepeng
    He, Xitao
    Zhang, Dengyin
    [J]. 2022 IEEE 33RD ANNUAL INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR AND MOBILE RADIO COMMUNICATIONS (IEEE PIMRC), 2022, : 340 - 345
  • [6] GECRAN: Graph embedding based convolutional recurrent attention network for traffic flow prediction
    Yan, Jianqiang
    Zhang, Lin
    Gao, Yuan
    Qu, Boting
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 256
  • [7] Attention-based Bicomponent Synchronous Graph Convolutional Network for traffic flow prediction
    Shen, Cheng
    Han, Kai
    Bi, Tianyuan
    [J]. 2021 17TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING (MSN 2021), 2021, : 778 - 785
  • [8] Graph Attention LSTM: A Spatiotemporal Approach for Traffic Flow Forecasting
    Zhang, Tianqi
    Guo, Ge
    [J]. IEEE INTELLIGENT TRANSPORTATION SYSTEMS MAGAZINE, 2022, 14 (02) : 190 - 196
  • [9] Attention based spatiotemporal graph attention networks for traffic flow forecasting
    Wang, Yi
    Jing, Changfeng
    Xu, Shishuo
    Guo, Tao
    [J]. INFORMATION SCIENCES, 2022, 607 : 869 - 883