Attention-Based SeriesNet: An Attention-Based Hybrid Neural Network Model for Conditional Time Series Forecasting

Cited by: 4
Authors:
Cheng, Yepeng [1]
Liu, Zuren [1]
Morimoto, Yasuhiko [1]
Affiliations:
[1] Hiroshima Univ, Grad Sch Engn, Dept Informat Engn, Higashihiroshima 7398527, Japan
Keywords:
attention; convolutional neural network; recurrent neural network
DOI:
10.3390/info11060305
CLC Classification: TP [Automation Technology, Computer Technology]
Subject Classification: 0812
Abstract:
Traditional time series forecasting techniques cannot extract sufficiently expressive features from sequence data, so their accuracy is limited. SeriesNet, a deep learning architecture, is a more advanced method: it adopts a hybrid of neural networks, a dilated causal convolutional neural network (DC-CNN) and a long short-term memory recurrent neural network (LSTM-RNN), to learn multi-range and multi-level features from multi-conditional time series with higher accuracy. However, it does not employ attention mechanisms to learn temporal features. Moreover, its conditioning method is not tailored to the CNN and RNN branches, and the number of parameters in each layer is very large. This paper proposes a conditioning method for the two types of neural networks and, to reduce the parameter count, replaces the LSTM with a gated recurrent unit (GRU) network and the DC-CNN with dilated depthwise separable temporal convolutional networks (DDSTCNs). Furthermore, this paper presents a lightweight RNN-based hidden state attention module (HSAM) combined with the proposed CNN-based convolutional block attention module (CBAM) for time series forecasting. Experimental results show that our model outperforms other models in both forecasting accuracy and computational efficiency.
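Two of the parameter-reducing ideas named in the abstract, depthwise separable dilated causal convolutions and attention over recurrent hidden states, can be illustrated in a few lines of code. The PyTorch sketch below is a reconstruction under stated assumptions, not the authors' implementation: the class names, tensor shapes, and hyperparameters (channels, kernel_size, hidden_size) are illustrative choices, and the scoring rule in HiddenStateAttention is only one plausible reading of a hidden-state attention module.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedDepthwiseSeparableConv(nn.Module):
    # Causal dilated depthwise separable 1-D convolution: a depthwise
    # convolution (groups == channels) followed by a 1x1 pointwise
    # convolution. Left-padding the input keeps the layer causal, so
    # the output at step t never sees inputs after step t.
    def __init__(self, channels, kernel_size=2, dilation=1):
        super().__init__()
        self.left_pad = (kernel_size - 1) * dilation
        self.depthwise = nn.Conv1d(channels, channels, kernel_size,
                                   dilation=dilation, groups=channels)
        self.pointwise = nn.Conv1d(channels, channels, kernel_size=1)

    def forward(self, x):
        # x: (batch, channels, time); pad only on the past side
        return self.pointwise(self.depthwise(F.pad(x, (self.left_pad, 0))))

class HiddenStateAttention(nn.Module):
    # Hypothetical hidden-state attention: score each time step's GRU
    # hidden state, softmax over time, return the weighted sum.
    def __init__(self, hidden_size):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)

    def forward(self, states):
        # states: (batch, time, hidden)
        weights = torch.softmax(self.score(states), dim=1)  # (batch, time, 1)
        return (weights * states).sum(dim=1)                # (batch, hidden)

# Toy usage: a batch of 8 univariate series, 64 steps each.
gru = nn.GRU(input_size=1, hidden_size=32, batch_first=True)
attn = HiddenStateAttention(32)
states, _ = gru(torch.randn(8, 64, 1))   # states: (8, 64, 32)
context = attn(states)                   # context: (8, 32)

The factorization is where the savings come from: a depthwise kernel costs roughly kernel_size * channels weights plus channels^2 for the pointwise step, versus kernel_size * channels^2 for a standard convolution, which matches the abstract's stated motivation for swapping the DC-CNN for DDSTCNs.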
Pages: 15