Triple-Stage Attention-Based Multiple Parallel Connection Hybrid Neural Network Model for Conditional Time Series Forecasting

Cited by: 5
Authors
Cheng, Yepeng [1 ]
Morimoto, Yasuhiko [1 ]
Affiliations
[1] Hiroshima Univ, Grad Sch Engn, Dept Informat Engn, Higashihiroshima 7398527, Japan
Keywords
Time series analysis; Forecasting; Recurrent neural networks; Feature extraction; Convolutional neural networks; Logic gates; Decoding; Encoder-decoder; hybrid neural networks; time series prediction; triple-stage attention;
DOI
10.1109/ACCESS.2021.3059861
Chinese Library Classification
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
The attention-based SeriesNet (A-SeriesNet) combines an augmented attention residual learning module-based convolutional neural network (augmented ARLM-CNN) subnetwork with a hidden state attention module-based recurrent neural network (HSAM-RNN) subnetwork for accurate conditional time series prediction. However, the augmented ARLM-CNN subnetwork struggles to extract latent features from multi-condition series: forecasting accuracy decreases as the feature dimension of the multi-condition series grows. The HSAM-RNN subnetwork of A-SeriesNet suffers from the same problem. The dual-stage attention recurrent neural network (DA-RNN) showed that an attention-based encoder-decoder framework deals with this problem effectively. This paper applies the DA-RNN to the HSAM-RNN subnetwork of A-SeriesNet and presents the triple-stage attention-based recurrent neural network (TA-RNN) subnetwork. Furthermore, it introduces a CNN-based encoder-decoder structure, the dual attention residual learning module-based convolutional neural network (DARLM-CNN) subnetwork, to improve the augmented ARLM-CNN subnetwork of A-SeriesNet. Finally, it presents the triple-stage attention-based SeriesNet (TA-SeriesNet), which connects the proposed subnetworks in parallel by concatenation instead of the element-wise multiplication used in A-SeriesNet, reducing the dependence of the forecasting results on any single subnetwork. Experimental results show that TA-SeriesNet outperforms other deep learning models on forecasting accuracy metrics for high feature dimensional time series datasets.
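The fusion change described above can be illustrated with a minimal sketch. This is not the authors' code; the array sizes, the random stand-in features, and the final linear readout are all illustrative assumptions. It only shows why concatenation decouples the two parallel subnetworks: element-wise multiplication lets a weak response in one subnetwork suppress the combined output, whereas concatenation lets a downstream layer weight each subnetwork's features independently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two subnetwork outputs (not real model code):
cnn_features = rng.normal(size=4)   # e.g. a DARLM-CNN-style feature vector
rnn_features = rng.normal(size=4)   # e.g. a TA-RNN-style feature vector

# A-SeriesNet-style fusion: element-wise multiplication.
# Output stays 4-dimensional; a near-zero entry in either vector
# zeroes out that position of the fused result.
fused_mul = cnn_features * rnn_features

# TA-SeriesNet-style fusion: concatenation.
# Output becomes 8-dimensional; a learned readout (random here,
# purely for illustration) can weight each subnetwork on its own.
fused_cat = np.concatenate([cnn_features, rnn_features])
readout = rng.normal(size=fused_cat.shape[0])
forecast = float(fused_cat @ readout)

print(fused_mul.shape, fused_cat.shape)
```

In a trained model the readout weights would be learned, so the network itself decides how much each subnetwork contributes to the forecast.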
Pages: 29165-29179 (15 pages)