Weight estimation model for trucks integrating multi-head attention mechanism

Cited by: 0
Authors
Gu, Ming-Chen [1 ,2 ]
Xiong, Hui-Yuan [1 ,2 ]
Liu, Zeng-Jun [1 ,2 ]
Luo, Qing-Yu [3 ]
Liu, Hong [1 ,2 ]
Affiliations
[1] Transport Planning and Research Institute, Ministry of Transport, Beijing 100028, China
[2] Laboratory for Traffic and Transport Planning Digitalization, Beijing 100028, China
[3] College of Transportation, Jilin University, Changchun 130022, China
Keywords
Trucks;
DOI
10.13229/j.cnki.jdxbgxb.20230297
Abstract
To improve the convenience and accuracy of real-time truck load estimation and to support real-time load monitoring across large-scale low-grade highway networks, this paper proposes a weight estimation model for trucks (Mix-MAN) that integrates the multi-head attention mechanism and exploits the interaction between the dynamic and static information of trucks. First, multi-head attention is introduced to enhance the network's ability to extract kinematic time-series features; second, a stacked auto-encoder is used to capture the static features of trucks; finally, a feature fusion structure is designed to combine the dynamic and static features, establish the nonlinear mapping between the input features and the estimated weight, and produce the final weight estimate. Experimental results show that, compared with a MAN model that ignores static truck information, Mix-MAN reduces the mean absolute error by 6%, the root mean square error by 5%, and the mean absolute percentage error by 0.5%. The model can provide technical support for highway cargo transport supervision and road maintenance. © 2024 Editorial Board of Jilin University. All rights reserved.
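
The abstract describes a three-part architecture: multi-head attention over kinematic time series, a stacked auto-encoder for static truck attributes, and a feature fusion structure that maps the combined features to a weight estimate. The PyTorch sketch below illustrates that general structure only; the class name MixMANSketch, all layer sizes, the input dimensions, the mean pooling step, and the fusion head are illustrative assumptions, not the paper's actual Mix-MAN specification.

# Minimal sketch of the architecture described in the abstract (assumptions noted above).
import torch
import torch.nn as nn

class MixMANSketch(nn.Module):
    def __init__(self, dyn_dim=6, static_dim=8, d_model=64, n_heads=4):
        super().__init__()
        # Project kinematic features (e.g. speed, acceleration) to the model dimension.
        self.dyn_proj = nn.Linear(dyn_dim, d_model)
        # Multi-head self-attention over the time dimension.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Stacked auto-encoder for static truck attributes; the encoder output
        # serves as the static feature embedding.
        self.static_encoder = nn.Sequential(
            nn.Linear(static_dim, 32), nn.ReLU(),
            nn.Linear(32, 16), nn.ReLU(),
        )
        self.static_decoder = nn.Sequential(
            nn.Linear(16, 32), nn.ReLU(),
            nn.Linear(32, static_dim),
        )
        # Fusion head: concatenate pooled dynamic and static features and
        # regress a single weight value.
        self.fusion = nn.Sequential(
            nn.Linear(d_model + 16, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, dyn_seq, static_feat):
        # dyn_seq: (batch, time, dyn_dim); static_feat: (batch, static_dim)
        x = self.dyn_proj(dyn_seq)
        attn_out, _ = self.attn(x, x, x)          # temporal feature extraction
        dyn_vec = attn_out.mean(dim=1)            # simple mean pooling (assumption)
        static_vec = self.static_encoder(static_feat)
        recon = self.static_decoder(static_vec)   # reconstruction target for AE pretraining
        weight = self.fusion(torch.cat([dyn_vec, static_vec], dim=-1))
        return weight.squeeze(-1), recon

# Example shapes: a batch of 2 trucks, 50 time steps of kinematic data.
model = MixMANSketch()
weight_pred, recon = model(torch.randn(2, 50, 6), torch.randn(2, 8))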
Pages: 2771-2780