EGFormer: An Enhanced Transformer Model with Efficient Attention Mechanism for Traffic Flow Forecasting

Cited: 3
|
Authors
Yang, Zhihui [1 ]
Zhang, Qingyong [1 ]
Chang, Wanfeng [1 ]
Xiao, Peng [1 ]
Li, Minglong [1 ]
Affiliations
[1] Wuhan Univ Technol, Sch Automat, Wuhan 430070, Peoples R China
Source
VEHICLES | 2024, Vol. 6, Issue 01
Keywords
traffic flow forecasting; Transformer; Multi-Head Efficient Self-Attention mechanism; Generative Decoding mechanism; PREDICTION; VOLUME;
DOI
10.3390/vehicles6010005
Chinese Library Classification (CLC)
TH [machinery and instrument industry];
Discipline Classification Code
0802;
Abstract
Due to the regular influence of human activities, traffic flow data usually exhibit significant periodicity, which provides a foundation for further research on traffic flow data. However, the temporal dependencies in traffic flow data are often obscured by entangled temporal regularities, making it difficult for general models to accurately capture the intrinsic functional relationships within the data. In recent years, a wide range of methods based on statistics, machine learning, and deep learning have been proposed to address these problems in traffic flow forecasting. In this paper, the Transformer is improved in two respects: (1) an Efficient Attention mechanism is proposed that reduces the time and memory complexity of the Scaled Dot Product Attention; (2) a Generative Decoding mechanism replaces the Dynamic Decoding operation, which accelerates the inference speed of the model. The resulting model is named EGFormer. Extensive experiments and comparative analysis show that EGFormer performs better on the traffic flow forecasting task, achieving higher prediction accuracy and shorter running time than traditional models.
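The abstract does not give the exact formulation of the proposed attention, but as a rough illustration of how an efficient attention variant can reduce the quadratic cost of Scaled Dot Product Attention, the following hedged PyTorch sketch contrasts standard attention with a linear-complexity variant in the spirit of Shen et al.'s Efficient Attention (softmax applied separately to queries and keys so the small context matrix is computed first). The tensor shapes, function names, and sizes are illustrative assumptions, not the authors' EGFormer code.

```python
# Hedged sketch: linear-complexity "efficient attention" vs. standard scaled
# dot-product attention. Shapes, names, and sizes are illustrative assumptions;
# they are not taken from the EGFormer paper.
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # Standard attention: O(n^2) time and memory in the sequence length n.
    # q, k, v: (batch, n, d)
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5   # (batch, n, n)
    return F.softmax(scores, dim=-1) @ v                    # (batch, n, d)

def efficient_attention(q, k, v):
    # Linear-complexity variant (in the spirit of Shen et al.): softmax is
    # applied to q over features and to k over positions, so the small
    # (d x d) context matrix k^T v is formed first and the (n x n) score
    # matrix is never materialised -> O(n * d^2) time, O(d^2) extra memory.
    q = F.softmax(q, dim=-1)                 # normalise queries over features
    k = F.softmax(k, dim=-2)                 # normalise keys over positions
    context = k.transpose(-2, -1) @ v        # (batch, d, d)
    return q @ context                       # (batch, n, d)

if __name__ == "__main__":
    batch, n, d = 2, 1024, 64                # hypothetical sizes
    q, k, v = (torch.randn(batch, n, d) for _ in range(3))
    out_std = scaled_dot_product_attention(q, k, v)
    out_eff = efficient_attention(q, k, v)
    print(out_std.shape, out_eff.shape)      # both: torch.Size([2, 1024, 64])
```

The sketch covers only the attention part. The generative decoding point in the abstract refers to emitting the whole forecast horizon in a single decoder forward pass rather than step-by-step dynamic decoding, which is what shortens inference time.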
Pages: 120-139
Number of pages: 20
Related Papers
50 records in total
  • [1] Traffic Flow Forecasting Based on Transformer with Diffusion Graph Attention Network
    Zhang, Hong
    Wang, Hongyan
    Chen, Linlong
    Zhao, Tianxin
    Kan, Sunan
    INTERNATIONAL JOURNAL OF AUTOMOTIVE TECHNOLOGY, 2024, 25 (03) : 455 - 468
  • [2] DGTNet: Dynamic graph attention transformer network for traffic flow forecasting
    Chen, Jing
    Li, Wuzhi
    Chen, Shuixuan
    Zhang, Guowei
    ENGINEERING RESEARCH EXPRESS, 2024, 6 (04)
  • [3] Routeformer: Transformer utilizing routing mechanism for traffic flow forecasting
    Qi, Jun
    Fan, Hong
    NEUROCOMPUTING, 2025, 633
  • [4] A combined traffic flow forecasting model based on graph convolutional network and attention mechanism
    Zhang, Hong
    Chen, Linlong
    Cao, Jie
    Zhang, Xijun
    Kan, Sunan
    INTERNATIONAL JOURNAL OF MODERN PHYSICS C, 2021, 32 (12)
  • [5] Graph enhanced spatial-temporal transformer for traffic flow forecasting
    Kong, Weishan
    Ju, Yanni
    Zhang, Shiyuan
    Wang, Jun
    Huang, Liwei
    Qu, Hong
    APPLIED SOFT COMPUTING, 2025, 170
  • [6] Flow Transformer: A Novel Anonymity Network Traffic Classifier with Attention Mechanism
    Zhao, Ruijie
    Huang, Yiteng
    Deng, Xianwen
    Xue, Zhi
    Li, Jiabin
    Huang, Zijing
    Wang, Yijun
    2021 17TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING (MSN 2021), 2021, : 223 - 230
  • [7] Attention-based spatial-temporal graph transformer for traffic flow forecasting
    Zhang, Qingyong
    Chang, Wanfeng
    Li, Changwu
    Yin, Conghui
    Su, Yixin
    Xiao, Peng
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (29): 21827 - 21839
  • [8] Traffic Flow Forecasting Model for Improved Spatio-Temporal Transformer
    Gao, Rong
    Wan, Yiliang
    Shao, Xiongkai
    Wu, Xinyun
    Computer Engineering and Applications, 2023, 59 (07) : 250 - 260