Multi-type parameter prediction of traffic flow based on Time-space attention graph convolutional network

Cited: 0
Authors
Zhang G. [1 ]
Wang H. [2 ]
Yin Y. [2 ]
Affiliations
[1] Baidu Inc, Autonomous Driving Unit (ADU), Beijing
[2] Inner Mongolia Agricultural University, College of Energy and Transportation Engineering, Hohhot
Keywords
Convolutional Neural Network; Prediction; Spatiotemporal attention graph; Traffic flow;
DOI
10.46300/9106.2021.15.97
Abstract
Graph convolutional neural networks are increasingly used in traffic flow parameter prediction tasks by virtue of their excellent non-Euclidean spatial feature extraction capabilities. However, most graph convolutional neural networks are designed to predict only one type of traffic flow parameter, so a given network may be effective only for a specific parameter of a specific travel mode. To improve the universality of graph convolutional neural networks, we embed time features and a spatio-temporal attention layer into the network and propose a spatio-temporal attention graph convolutional neural network. Experiments on passenger flow data (Hangzhou Metro) and vehicle speed data (California highways), covering two different travel modes, verify that the proposed network can predict passenger flow and vehicle speed simultaneously. Moreover, the proposed model has the smallest error distribution range, and its predictions are more accurate overall. © 2021, North Atlantic University Union NAUN. All rights reserved.
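The abstract describes combining a spatio-temporal attention layer with graph convolution. The paper itself does not give code here, so the following is only a minimal NumPy sketch of one plausible layer of this kind: temporal attention reweights time steps, spatial attention modulates a degree-normalized adjacency, and a standard graph convolution follows. The dot-product attention scoring, the tensor shapes, and the function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def normalize_adj(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in standard GCNs.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def st_attention_gcn_layer(X, A, W):
    # X: (T, N, F) = time steps x graph nodes x features per node
    # A: (N, N) adjacency matrix, W: (F, O) learnable weights
    T, N, F = X.shape
    # Temporal attention: dot-product scores between time steps.
    E_t = softmax(np.einsum('tnf,snf->ts', X, X) / np.sqrt(N * F))
    X_t = np.einsum('ts,snf->tnf', E_t, X)
    # Spatial attention: node-node scores modulate the normalized adjacency.
    S = softmax(np.einsum('tnf,tmf->nm', X_t, X_t) / np.sqrt(T * F))
    A_mod = normalize_adj(A) * S
    # Graph convolution over the attention-modulated adjacency, with ReLU.
    return np.maximum(0.0, np.einsum('nm,tmf,fo->tno', A_mod, X_t, W))

# Toy usage: 4 time steps, 5 nodes, 3 input features, 2 output features.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5, 3))
A = np.ones((5, 5)) - np.eye(5)          # fully connected toy graph
W = rng.standard_normal((3, 2))
H = st_attention_gcn_layer(X, A, W)       # shape (4, 5, 2)
```

In a trained model the attention scores would come from learned projections rather than raw dot products, but the flow of tensors (temporal reweighting, then spatially attended graph convolution) matches what the abstract outlines.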
Pages: 902-912 (10 pages)
Related papers (50 total)
  • [1] Traffic Flow Prediction Based on Multi-type Characteristic Hybrid Graph Neural Network
    Wang, Yuhang
    Gao, Hui
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT V, 2023, 14258 : 486 - 497
  • [2] Short-term passenger flow prediction for urban rail transit based on Time-space attention graph convolutional network
    Zhang, Guoxing
    Liu, Wei
    Zheng, Hao
    Ma, Tianyi
    2020 5TH INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE, COMPUTER TECHNOLOGY AND TRANSPORTATION (ISCTT 2020), 2020, : 548 - 553
  • [3] Traffic Flow Prediction Model Based on Attention Spatiotemporal Graph Convolutional Network
    Sun, HongXian
    2023 3rd International Symposium on Computer Technology and Information Science, ISCTIS 2023, 2023, : 148 - 153
  • [4] AFTGAN: prediction of multi-type PPI based on attention free transformer and graph attention network
    Kang, Yanlei
    Elofsson, Arne
    Jiang, Yunliang
    Huang, Weihong
    Yu, Minzhe
    Li, Zhong
    BIOINFORMATICS, 2023, 39 (02)
  • [5] Gated Recurrent Graph Convolutional Attention Network for Traffic Flow Prediction
    Feng, Xiaoyuan
    Chen, Yue
    Li, Hongbo
    Ma, Tian
    Ren, Yilong
    SUSTAINABILITY, 2023, 15 (09)
  • [6] MFAGCN: Multi-Feature Based Attention Graph Convolutional Network for Traffic Prediction
    Li, Haoran
    Li, Jianbo
    Lv, Zhiqiang
    Xu, Zhihao
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2021, PT I, 2021, 12937 : 227 - 239
  • [7] Attention-based Bicomponent Synchronous Graph Convolutional Network for traffic flow prediction
    Shen, Cheng
    Han, Kai
    Bi, Tianyuan
    2021 17TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING (MSN 2021), 2021, : 778 - 785
  • [8] GECRAN: Graph embedding based convolutional recurrent attention network for traffic flow prediction
    Yan, Jianqiang
    Zhang, Lin
    Gao, Yuan
    Qu, Boting
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 256
  • [9] Residual attention enhanced Time-varying Multi-Factor Graph Convolutional Network for traffic flow prediction
    Bao, Yinxin
    Shen, Qinqin
    Cao, Yang
    Ding, Weiping
    Shi, Quan
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 133
  • [10] Traffic Prediction using Time-Space Diagram: A Convolutional Neural Network Approach
    Hosseini, Mohammadreza Khajeh
    Talebpour, Alireza
    TRANSPORTATION RESEARCH RECORD, 2019, 2673 (07) : 425 - 435