Attention-Based SeriesNet: An Attention-Based Hybrid Neural Network Model for Conditional Time Series Forecasting

Cited by: 4
Authors
Cheng, Yepeng [1 ]
Liu, Zuren [1 ]
Morimoto, Yasuhiko [1 ]
Affiliations
[1] Hiroshima Univ, Grad Sch Engn, Dept Informat Engn, Higashihiroshima 7398527, Japan
Keywords
attention; convolutional neural network; recurrent neural network
DOI
10.3390/info11060305
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Traditional time series forecasting techniques cannot extract sufficiently rich features from sequence data, so their accuracy is limited. The deep learning architecture SeriesNet is a more advanced method: it combines hybrid neural networks, a dilated causal convolutional neural network (DC-CNN) and a long short-term memory recurrent neural network (LSTM-RNN), to learn multi-range and multi-level features from multi-conditional time series with higher accuracy. However, SeriesNet does not use attention mechanisms to learn temporal features. In addition, its conditioning method for the CNN and RNN branches is not clearly specified, and the number of parameters in each layer is large. This paper proposes a conditioning method for both types of neural networks and, to reduce the number of parameters, replaces the LSTM and DC-CNN with a gated recurrent unit network (GRU) and dilated depthwise separable temporal convolutional networks (DDSTCNs), respectively. Furthermore, it presents a lightweight RNN-based hidden state attention module (HSAM) combined with the proposed CNN-based convolutional block attention module (CBAM) for time series forecasting. Experimental results show that our model outperforms other models in both forecasting accuracy and computational efficiency.
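To make the parameter-saving idea in the abstract concrete, below is a minimal sketch (not the authors' released code) of a dilated depthwise separable causal 1D convolution block of the kind a DDSTCN stacks in place of a standard dilated causal convolution. The framework choice (PyTorch), class name, and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedDepthwiseSeparableCausalConv1d(nn.Module):
    """One causal dilated depthwise separable 1D convolution block (illustrative sketch)."""

    def __init__(self, channels: int, kernel_size: int = 2, dilation: int = 1):
        super().__init__()
        # Pad on the left only so the output at time t never sees inputs after t (causality).
        self.left_pad = (kernel_size - 1) * dilation
        # Depthwise stage: one filter per channel (groups=channels).
        self.depthwise = nn.Conv1d(channels, channels, kernel_size,
                                   dilation=dilation, groups=channels)
        # Pointwise 1x1 stage mixes channels; splitting the convolution this way
        # needs far fewer parameters than a single standard dilated convolution.
        self.pointwise = nn.Conv1d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, channels, time).
        x = F.pad(x, (self.left_pad, 0))
        return self.pointwise(self.depthwise(x))

# Stacking blocks with dilations 1, 2, 4, ... grows the receptive field
# exponentially, which is how dilated causal stacks cover long histories.
x = torch.randn(8, 32, 128)                    # batch=8, channels=32, length=128
block = DilatedDepthwiseSeparableCausalConv1d(32, kernel_size=2, dilation=4)
print(block(x).shape)                          # -> torch.Size([8, 32, 128])
```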
Pages: 15
Related Papers
50 entries in total
  • [1] Triple-Stage Attention-Based Multiple Parallel Connection Hybrid Neural Network Model for Conditional Time Series Forecasting
    Cheng, Yepeng
    Morimoto, Yasuhiko
    [J]. IEEE ACCESS, 2021, 9 : 29165 - 29179
  • [2] Multivariate Time Series Classification With An Attention-Based Multivariate Convolutional Neural Network
    Tripathi, Achyut Mani
    Baruah, Rashmi Dutta
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [3] AHNN: An Attention-Based Hybrid Neural Network for Sentence Modeling
    Zhang, Xiaomin
    Huang, Li
    Qu, Hong
    [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2017, 2018, 10619 : 731 - 740
  • [4] A Unifying Framework of Attention-Based Neural Load Forecasting
    Xiong, Jing
    Zhang, Yu
    [J]. IEEE ACCESS, 2023, 11 : 51606 - 51616
  • [5] An Attention-Based Convolutional Neural Network for Intrusion Detection Model
    Wang, Zhen
    Ghaleb, Fuad A. A.
    [J]. IEEE ACCESS, 2023, 11 : 43116 - 43127
  • [6] Attention-based deep survival model for time series data
    Li, Xingyu
    Krivtsov, Vasiliy
    Arora, Karunesh
    [J]. RELIABILITY ENGINEERING & SYSTEM SAFETY, 2022, 217
  • [7] A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction
    Qin, Yao
    Song, Dongjin
    Cheng, Haifeng
    Cheng, Wei
    Jiang, Guofei
    Cottrell, Garrison W.
    [J]. PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2627 - 2633
  • [8] An Attention-Based Forecasting Network for Intelligent Services in Manufacturing
    Zhou, Xinyi
    Gao, Xiaofeng
    [J]. SERVICE-ORIENTED COMPUTING (ICSOC 2021), 2021, 13121 : 900 - 914
  • [9] Product quality time series prediction with attention-based convolutional recurrent neural network
    Shi, Yiguan
    Chen, Yong
    Zhang, Longjie
    [J]. APPLIED INTELLIGENCE, 2024, 54 (21) : 10763 - 10779
  • [10] AHRNN: Attention-Based Hybrid Robust Neural Network for emotion recognition
    Xu, Ke
    Liu, Bin
    Tao, Jianhua
    Lv, Zhao
    Fan, Cunhang
    Song, Leichao
    [J]. COGNITIVE COMPUTATION AND SYSTEMS, 2022, 4 (01) : 85 - 95