GMAN: A Graph Multi-Attention Network for Traffic Prediction

Cited: 0
Authors
Zheng, Chuanpan [1 ,2 ,3 ]
Fan, Xiaoliang [1 ,2 ,3 ]
Wang, Cheng [1 ,2 ,3 ]
Qi, Jianzhong [4 ]
Affiliations
[1] Xiamen Univ, Fujian Key Lab Sensing & Comp Smart Cities, Xiamen, Peoples R China
[2] Xiamen Univ, Digital Fujian Inst Urban Traff Big Data Res, Xiamen, Peoples R China
[3] Xiamen Univ, Sch Informat, Xiamen, Peoples R China
[4] Univ Melbourne, Sch Comp & Informat Syst, Melbourne, Vic, Australia
Keywords
DOI
Not available
CLC Classification Number
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Long-term traffic prediction is highly challenging due to the complexity of traffic systems and the constantly changing nature of many impacting factors. In this paper, we focus on the spatio-temporal factors and propose a graph multi-attention network (GMAN) to predict traffic conditions for time steps ahead at different locations on a road network graph. GMAN adopts an encoder-decoder architecture, where both the encoder and the decoder consist of multiple spatio-temporal attention blocks to model the impact of spatio-temporal factors on traffic conditions. The encoder encodes the input traffic features, and the decoder predicts the output sequence. Between the encoder and the decoder, a transform attention layer converts the encoded traffic features into representations of future time steps, which serve as the input to the decoder. This transform attention mechanism models direct relationships between historical and future time steps, which helps alleviate error propagation across prediction time steps. Experimental results on two real-world traffic prediction tasks (i.e., traffic volume prediction and traffic speed prediction) demonstrate the superiority of GMAN. In particular, for 1-hour-ahead prediction, GMAN outperforms state-of-the-art methods by up to 4% in terms of MAE.
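The transform attention layer described above can be sketched as scaled dot-product attention in which queries come from embeddings of the future time steps and keys/values come from the encoded historical features, so each future step attends directly to every historical step. The shapes, function name, and random projection matrices below are illustrative assumptions for a minimal sketch, not the paper's exact formulation (which uses learned projections and multi-head attention per graph node).

```python
import numpy as np

def transform_attention(enc_hist, future_emb, seed=0):
    """Sketch of transform attention: future-step queries attend to history.

    enc_hist:   (P, d) encoder output over P historical time steps
    future_emb: (Q, d) embeddings of the Q future time steps to predict
    Returns:    (Q, d) representations fed to the decoder
    """
    d = enc_hist.shape[1]
    rng = np.random.default_rng(seed)
    # Stand-ins for learned projection matrices (random here, for illustration)
    Wq = rng.standard_normal((d, d)) / np.sqrt(d)
    Wk = rng.standard_normal((d, d)) / np.sqrt(d)
    Wv = rng.standard_normal((d, d)) / np.sqrt(d)

    Q = future_emb @ Wq   # queries derived from future time steps
    K = enc_hist @ Wk     # keys derived from historical time steps
    V = enc_hist @ Wv     # values derived from historical time steps

    # Direct history-to-future links: each future step scores all P past steps
    scores = Q @ K.T / np.sqrt(d)                        # shape (Q, P)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)        # softmax over history
    return weights @ V                                   # shape (Q, d)
```

Because every future step draws directly on the full encoded history rather than on previously predicted steps, errors made at one prediction horizon do not feed into the next, which is the error-propagation benefit the abstract refers to.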
Pages: 1234-1241 (8 pages)