MAGRU: Multi-layer Attention with GRU for Logistics Warehousing Demand Prediction

Cited by: 1
|
Authors
Tian, Ran [1 ]
Wang, Bo [1 ]
Wang, Chu [1 ]
Affiliations
[1] Northwest Normal Univ, Coll Comp Sci & Engn, Lanzhou 730070, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Gated Recurrent Unit; Attention mechanism; Commodities demand forecast; Time series; Neural Network; MODEL;
DOI
10.3837/tiis.2024.03.001
CLC Number
TP [Automation technology; computer technology];
Discipline Code
0812;
Abstract
Warehousing demand prediction is an essential part of the supply chain, providing a fundamental basis for product manufacturing, replenishment, warehouse planning, etc. Existing forecasting methods struggle to produce accurate forecasts because warehouse demand is affected by external factors such as holidays and seasons, some influences, such as consumer psychology and producer reputation, are difficult to quantify, and the data can fluctuate widely or show no obvious trend or cycle. We introduce a new model for warehouse demand prediction called MAGRU (Multi-layer Attention with GRU). In the model, we first perform an embedding operation on the input sequence to quantify the external influences; we then implement an encoder using a GRU and an attention mechanism, where the hidden states of the GRU capture the essential time-series features. In the decoder, we apply attention again to select the key hidden states among all time slices as the input to the GRU network. Experimental results show that this model is more accurate than RNN, LSTM, GRU, Prophet, XGBoost, and DARNN. Evaluated with mean absolute error (MAE), root mean square error (RMSE), and symmetric mean absolute percentage error (SMAPE), MAGRU's MAE, RMSE, and SMAPE decreased by 7.65%, 10.03%, and 8.87%, respectively, relative to GRU-LSTM, the current best model for this type of problem.
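The decoder step described in the abstract, selecting key hidden states among all time slices via attention, can be sketched as follows. This is a minimal illustration of generic dot-product attention over encoder hidden states, assuming illustrative dimensions and a dot-product scoring function; the paper's exact scoring and layer layout are not specified here.

```python
import numpy as np

def attend(hidden_states, query):
    """Weight encoder hidden states by relevance to a decoder query.

    hidden_states: (T, d) array, one GRU hidden state per time slice.
    query: (d,) array, the current decoder state.
    Returns the context vector (d,) and the attention weights (T,).
    Dot-product scoring is an illustrative assumption, not the
    paper's exact design.
    """
    scores = hidden_states @ query            # (T,) alignment scores
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()                  # weights over time slices sum to 1
    context = weights @ hidden_states         # (d,) weighted sum of states
    return context, weights

# Toy example (hypothetical numbers): three time slices, 2-d states.
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
q = np.array([1.0, 0.0])
context, weights = attend(H, q)
```

The context vector, rather than the raw final hidden state, is what would be fed into the decoder GRU, so time slices whose states align with the query dominate the input.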
Pages: 528-550
Number of pages: 23
Related Papers
50 in total
  • [21] Machine remaining life prediction based on multi-layer self-attention and temporal convolution network
    Shang, Zhiwu
    Zhang, Baoren
    Li, Wanxiang
    Qian, Shiqi
    Zhang, Jie
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2022, 8 (02) : 1409 - 1424
  • [23] An Effective Multi-Layer Attention Network for SAR Ship Detection
    Suo, Zhiling
    Zhao, Yongbo
    Hu, Yili
    [J]. JOURNAL OF MARINE SCIENCE AND ENGINEERING, 2023, 11 (05)
  • [24] A dynamic multi-layer semantics perceptron without attention mechanism
    Liu, Xiao-Yan
    Tang, Huan-Ling
    Wang, Yu-Lin
    Dou, Quan-Sheng
    Lu, Ming-Yu
    [J]. Kongzhi yu Juece/Control and Decision, 2024, 39 (02): : 588 - 594
  • [25] Multi-layer convolutional features with channel attention for object tracking
    Yan L.
    Dong M.
    [J]. Huazhong Keji Daxue Xuebao (Ziran Kexue Ban)/Journal of Huazhong University of Science and Technology (Natural Science Edition), 2019, 47 (09): : 90 - 94
  • [26] Attention-based Multi-layer Chinese Word Embedding
    Ma, Bing
    Sun, Haifeng
    Wang, Jingyu
    Qi, Qi
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2019, : 2895 - 2902
  • [27] Multi-Layer Perceptron Model for Air Quality Prediction
    Abdullah, S.
    Ismail, M.
    Ahmed, A. N.
    [J]. MALAYSIAN JOURNAL OF MATHEMATICAL SCIENCES, 2019, 13 : 85 - 95
  • [28] Selective Tensorized Multi-layer LSTM for Orbit Prediction
    Shin, Youjin
    Park, Eun-Ju
    Woo, Simon S.
    Jung, Okchul
    Chung, Daewon
    [J]. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 3495 - 3504
  • [29] Prediction of zenith tropospheric delay by multi-layer perceptron
    Katsougiannopoulos, S.
    Pikridas, C.
    [J]. JOURNAL OF APPLIED GEODESY, 2009, 3 (04) : 223 - 229
  • [30] Serialized Multi-Layer Multi-Head Attention for Neural Speaker Embedding
    Zhu, Hongning
    Lee, Kong Aik
    Li, Haizhou
    [J]. INTERSPEECH 2021, 2021, : 106 - 110