Attention-Based SeriesNet: An Attention-Based Hybrid Neural Network Model for Conditional Time Series Forecasting

Cited by: 4
Authors
Cheng, Yepeng [1 ]
Liu, Zuren [1 ]
Morimoto, Yasuhiko [1 ]
Affiliations
[1] Hiroshima Univ, Grad Sch Engn, Dept Informat Engn, Higashihiroshima 7398527, Japan
Keywords
attention; convolutional neural network; recurrent neural network
DOI
10.3390/info11060305
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Traditional time series forecasting techniques cannot extract sufficiently expressive features from sequence data, so their accuracy is limited. SeriesNet, an advanced deep learning architecture, adopts hybrid neural networks, namely a dilated causal convolutional neural network (DC-CNN) and a long short-term memory recurrent neural network (LSTM-RNN), to learn multi-range and multi-level features from multi-conditional time series with higher accuracy. However, SeriesNet does not use attention mechanisms to learn temporal features. In addition, its conditioning method for the CNN and RNN branches is not explicitly specified, and each layer contains a large number of parameters. This paper proposes a conditioning method for both types of neural networks and replaces the LSTM and DC-CNN with a gated recurrent unit (GRU) network and dilated depthwise separable temporal convolutional networks (DDSTCNs), respectively, to reduce the number of parameters. Furthermore, this paper presents a lightweight RNN-based hidden state attention module (HSAM) combined with the proposed CNN-based convolutional block attention module (CBAM) for time series forecasting. Experimental results show that our model is superior to other models in terms of both forecasting accuracy and computational efficiency.
Pages: 15
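To make the architecture described in the abstract more concrete, the following is a minimal, hypothetical PyTorch sketch of the three ingredients it names: a dilated depthwise separable temporal convolution (DDSTCN) block, a CBAM-style channel attention gate, and a GRU-based hidden state attention module (HSAM). All class names, layer sizes, and the residual and pooling choices are assumptions made for illustration; they are not taken from the paper's implementation.

    # Illustrative sketch only (assumed layout, not the authors' released code).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DDSTCNBlock(nn.Module):
        """Dilated depthwise separable causal convolution block (assumed layout)."""
        def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
            super().__init__()
            self.pad = (kernel_size - 1) * dilation  # left padding keeps the conv causal
            self.depthwise = nn.Conv1d(channels, channels, kernel_size,
                                       dilation=dilation, groups=channels)
            self.pointwise = nn.Conv1d(channels, channels, kernel_size=1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, channels, time)
            y = F.pad(x, (self.pad, 0))                      # causal (left-only) padding
            y = F.relu(self.pointwise(self.depthwise(y)))    # depthwise then pointwise conv
            return x + y                                     # residual connection (assumed)

    class ChannelAttention(nn.Module):
        """CBAM-style channel attention for 1-D series (temporal branch omitted)."""
        def __init__(self, channels: int, reduction: int = 4):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(),
                nn.Linear(channels // reduction, channels),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            avg = x.mean(dim=-1)                             # global average over time
            mx = x.max(dim=-1).values                        # global max over time
            gate = torch.sigmoid(self.mlp(avg) + self.mlp(mx)).unsqueeze(-1)
            return x * gate                                  # re-weight channels

    class HSAM(nn.Module):
        """Hidden state attention over GRU outputs (hypothetical sketch)."""
        def __init__(self, input_size: int, hidden_size: int):
            super().__init__()
            self.gru = nn.GRU(input_size, hidden_size, batch_first=True)
            self.score = nn.Linear(hidden_size, 1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time, features)
            states, _ = self.gru(x)                          # hidden states at every step
            weights = torch.softmax(self.score(states), dim=1)
            return (weights * states).sum(dim=1)             # attention-pooled context vector

    if __name__ == "__main__":
        series = torch.randn(8, 16, 64)                      # (batch, channels, time)
        attended = ChannelAttention(16)(DDSTCNBlock(16, dilation=2)(series))
        context = HSAM(input_size=16, hidden_size=32)(attended.transpose(1, 2))
        print(context.shape)                                 # torch.Size([8, 32])

In this reading, the depthwise plus pointwise factorization and the GRU are what cut the parameter count relative to DC-CNN and LSTM, while the channel gate and the hidden state attention supply the CBAM and HSAM roles described in the abstract; the paper itself should be consulted for the exact conditioning scheme and layer configuration.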