Attention-based Conv-LSTM and Bi-LSTM networks for large-scale traffic speed prediction

Cited by: 28
Authors
Hu, Xiaojian [1 ,2 ,3 ,4 ]
Liu, Tong [1 ]
Hao, Xiatong [1 ]
Lin, Chenxi [1 ]
Affiliations
[1] Southeast Univ, Jiangsu Key Lab Urban ITS, Southeast Univ Rd 2, Nanjing 211189, Peoples R China
[2] Southeast Univ, Jiangsu Prov Collaborat Innovat Ctr Modern Urban, Nanjing, Peoples R China
[3] Southeast Univ, Natl Demonstrat Ctr Expt Rd & Traff Engn Educ, Nanjing 211189, Peoples R China
[4] Southeast Univ, Sch Transportat, Southeast Univ Rd 2, Nanjing 211189, Peoples R China
Source
JOURNAL OF SUPERCOMPUTING | 2022, Vol. 78, Issue 10
Keywords
Traffic speed prediction; Conv-LSTM; Bi-LSTM; Attention mechanism; Spatiotemporal; Periodic; NEURAL-NETWORK; FLOW; MODEL; VOLUME;
DOI
10.1007/s11227-022-04386-7
CLC Classification Number
TP3 [Computing Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Timely and accurate traffic speed prediction is increasingly important for urban traffic management and for helping travelers make informed decisions. However, existing approaches have difficulty extracting features from large-scale traffic data. This study proposes a hybrid deep learning method named AB-ConvLSTM for large-scale traffic speed prediction. The proposed model consists of a convolutional long short-term memory (Conv-LSTM) module, an attention mechanism module, and two bidirectional LSTM (Bi-LSTM) modules. Conv-LSTM networks are used to extract the spatiotemporal features of traffic speed data. In addition, the attention mechanism module is introduced to enhance the performance of Conv-LSTM by automatically capturing the importance of different historical periods to the final prediction and assigning corresponding weights. Moreover, two Bi-LSTM networks are designed to extract daily and weekly periodic features and capture variation tendencies from forward and backward traffic data. Experiments carried out on urban road networks show that the proposed model consistently outperforms the competing models.
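The architecture summarized in the abstract can be sketched in PyTorch. This is a hypothetical minimal layout, not the paper's exact design: the 1-D ConvLSTM cell, the single-layer attention scorer, the hidden sizes, and the final fusion layer are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """Minimal 1-D Conv-LSTM cell: all four gates from one convolution."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.conv = nn.Conv1d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        gates = self.conv(torch.cat([x, h], dim=1))
        i, f, g, o = gates.chunk(4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)          # update cell state
        h = o * torch.tanh(c)                  # update hidden state
        return h, c

class ABConvLSTM(nn.Module):
    """Sketch of the AB-ConvLSTM layout: Conv-LSTM over recent speeds,
    temporal attention over its hidden states, and two Bi-LSTMs for
    daily/weekly periodic inputs; the three outputs are fused linearly."""
    def __init__(self, n_links, hid=16):
        super().__init__()
        self.cell = ConvLSTMCell(1, hid)
        self.attn = nn.Linear(hid * n_links, 1)
        self.daily = nn.LSTM(n_links, hid, batch_first=True, bidirectional=True)
        self.weekly = nn.LSTM(n_links, hid, batch_first=True, bidirectional=True)
        self.out = nn.Linear(hid * n_links + 4 * hid, n_links)

    def forward(self, recent, daily, weekly):
        # recent, daily, weekly: (batch, timesteps, n_links) speed matrices
        B, T, N = recent.shape
        h = recent.new_zeros(B, self.cell.hid_ch, N)
        c = torch.zeros_like(h)
        hs = []
        for t in range(T):
            h, c = self.cell(recent[:, t].unsqueeze(1), (h, c))
            hs.append(h.flatten(1))
        H = torch.stack(hs, dim=1)                           # (B, T, hid*N)
        w = torch.softmax(self.attn(H).squeeze(-1), dim=1)   # per-step weights
        ctx = (w.unsqueeze(-1) * H).sum(dim=1)               # attention context
        d = self.daily(daily)[0][:, -1]                      # daily Bi-LSTM
        wk = self.weekly(weekly)[0][:, -1]                   # weekly Bi-LSTM
        return self.out(torch.cat([ctx, d, wk], dim=1))      # (B, n_links)
```

The attention step mirrors the abstract's description: each historical time step's hidden state receives a learned weight, and the weighted sum feeds the prediction alongside the two periodic branches.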
Pages: 12686-12709
Page count: 24
Related Papers
50 records in total
  • [31] Graph Attention Networks Adjusted Bi-LSTM for Video Summarization
    Zhong, Rui
    Wang, Rui
    Zou, Yang
    Hong, Zhiqiang
    Hu, Min
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2021, 28 : 663 - 667
  • [32] A watershed water quality prediction model based on attention mechanism and Bi-LSTM
    Zhang, Qiang
    Wang, Ruiqi
    Qi, Ying
    Wen, Fei
    [J]. ENVIRONMENTAL SCIENCE AND POLLUTION RESEARCH, 2022, 29 (50) : 75664 - 75680
  • [34] Efficient and accurate TEC modeling and prediction approach with random forest and Bi-LSTM for large-scale region
    Jiang, Zixin
    Zhang, Zhetao
    He, Xiufeng
    Li, Yuan
    Yuan, Haijun
    [J]. ADVANCES IN SPACE RESEARCH, 2024, 73 (01) : 650 - 662
  • [35] Distributed Fine-Grained Traffic Speed Prediction for Large-Scale Transportation Networks Based on Automatic LSTM Customization and Sharing
    Lee, Ming-Chang
    Lin, Jia-Chun
    Gran, Ernst Gunnar
    [J]. EURO-PAR 2020: PARALLEL PROCESSING, 2020, 12247 : 234 - 247
  • [36] An Improved Generating Energy Prediction Method Based on Bi-LSTM and Attention Mechanism
    He, Bo
    Ma, Runze
    Zhang, Wenwei
    Zhu, Jun
    Zhang, Xingyuan
    [J]. ELECTRONICS, 2022, 11 (12)
  • [37] Time series data recovery in SHM of large-scale bridges: Leveraging GAN and Bi-LSTM networks
    Tien, Thanh Bui
    Quang, Tuyen Vu
    Ngoc, Lan Nguyen
    Ngoc, Hoa Tran
    [J]. STRUCTURES, 2024, 63
  • [38] Enhancer-LSTMAtt: A Bi-LSTM and Attention-Based Deep Learning Method for Enhancer Recognition
    Huang, Guohua
    Luo, Wei
    Zhang, Guiyang
    Zheng, Peijie
    Yao, Yuhua
    Lyu, Jianyi
    Liu, Yuewu
    Wei, Dong-Qing
    [J]. BIOMOLECULES, 2022, 12 (07)
  • [39] EA-LSTM: Evolutionary attention-based LSTM for time series prediction
    Li, Youru
    Zhu, Zhenfeng
    Kong, Deqiang
    Han, Hua
    Zhao, Yao
    [J]. KNOWLEDGE-BASED SYSTEMS, 2019, 181
  • [40] Entity Relationship Extraction Based on Bi-LSTM and Attention Mechanism
    Wei, Ming
    Xu, Zhipeng
    Hu, Jiwei
    [J]. PROCEEDINGS OF 2021 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND INFORMATION SYSTEMS (ICAIIS '21), 2021,