Temporal self-attention-based Conv-LSTM network for multivariate time series prediction

Cited by: 44
Authors
Fu, En [1 ]
Zhang, Yinong [2 ]
Yang, Fan [3 ]
Wang, Shuying [2 ]
Affiliations
[1] Beijing Union Univ, Beijing Key Lab Informat Serv Engn, Beijing 100101, Peoples R China
[2] Beijing Union Univ, Coll Urban Rail Transit & Logist, Beijing 100101, Peoples R China
[3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Keywords
Self-attention mechanism; Long short-term memory; Multivariate time series; Prediction;
DOI
10.1016/j.neucom.2022.06.014
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Time series play an important role in many fields, such as industrial control, automated monitoring, and weather forecasting. Because real-world problems often involve more than one variable, and these variables are related to each other, the multivariate time series (MTS) was introduced. Accurately predicting MTS from historical observations remains very challenging. Therefore, a new time series prediction model is proposed based on the temporal self-attention mechanism, convolutional neural network, and long short-term memory (Conv-LSTM). When the standard attention mechanism for time series is combined with a recurrent neural network (RNN), it depends heavily on the hidden state of the RNN. In particular, at the first time step, an initial hidden state (typically 0) must be artificially introduced to calculate the attention weight of that step, which adds noise to the attention-weight calculation. To address this problem and increase the flexibility of the attention layer, a new self-attention mechanism, called temporal self-attention, is designed to extract the temporal dependence of the MTS. In this attention mechanism, long short-term memory (LSTM) is adopted as a sequence encoder to calculate the query, key, and value, obtaining a more complete temporal dependence than standard self-attention. Owing to the flexibility of this structure, the DA-Conv-LSTM model, a state-of-the-art attention-based method for MTS prediction, was improved. Our improved model was compared with six baseline models on multiple datasets (SML2010 and NASDAQ100) and applied to satellite state prediction (our private dataset). Experiments demonstrated the effectiveness of our temporal self-attention, and our improved model achieved the best short-term prediction performance. (c) 2022 Elsevier B.V. All rights reserved.
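The abstract describes computing the query, key, and value of a self-attention layer from hidden states produced by an LSTM encoder, then weighting time steps by scaled dot-product scores. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the LSTM encoder is replaced by a placeholder hidden-state matrix `H`, and the weight matrices `Wq`, `Wk`, `Wv` are illustrative names.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def temporal_self_attention(H, Wq, Wk, Wv):
    """Scaled dot-product self-attention over encoded time steps.

    H: (T, d) hidden states, standing in for LSTM encoder outputs.
    Returns the attended sequence (T, d) and the (T, T) weight matrix.
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    d_k = Q.shape[-1]
    scores = (Q @ K.T) / np.sqrt(d_k)   # (T, T) pairwise temporal scores
    A = softmax(scores, axis=-1)        # each row sums to 1
    return A @ V, A

rng = np.random.default_rng(0)
T, d = 10, 8
H = rng.normal(size=(T, d))             # placeholder for LSTM-encoded sequence
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, A = temporal_self_attention(H, Wq, Wk, Wv)
```

Because every query attends over all T encoded steps, no artificial initial hidden state enters the weight calculation, which is the noise source the abstract attributes to RNN-coupled attention.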
Pages: 162-173
Number of pages: 12
Related Papers
50 records
  • [1] A dual-stage attention-based Conv-LSTM network for spatio-temporal correlation and multivariate time series prediction
    Xiao, Yuteng
    Yin, Hongsheng
    Zhang, Yudong
    Qi, Honggang
    Zhang, Yundong
    Liu, Zhaoyang
    [J]. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2021, 36 (05) : 2036 - 2057
  • [2] TCLN: A Transformer-based Conv-LSTM network for multivariate time series forecasting
    Shusen Ma
    Tianhao Zhang
    Yun-Bo Zhao
    Yu Kang
    Peng Bai
    [J]. Applied Intelligence, 2023, 53 : 28401 - 28417
  • [3] TCLN: A Transformer-based Conv-LSTM network for multivariate time series forecasting
    Ma, Shusen
    Zhang, Tianhao
    Zhao, Yun-Bo
    Kang, Yu
    Bai, Peng
    [J]. APPLIED INTELLIGENCE, 2023, 53 (23) : 28401 - 28417
  • [4] Spatiotemporal Self-Attention-Based LSTNet for Multivariate Time Series Prediction
    Wang, Dezheng
    Chen, Congyan
    [J]. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2023, 2023
  • [5] Interpreting Conv-LSTM for Spatio-Temporal Soil Moisture Prediction in China
    Huang, Feini
    Zhang, Yongkun
    Zhang, Ye
    Wei, Shangguan
    Li, Qingliang
    Li, Lu
    Jiang, Shijie
    [J]. AGRICULTURE-BASEL, 2023, 13 (05):
  • [6] Attention-based Conv-LSTM and Bi-LSTM networks for large-scale traffic speed prediction
    Xiaojian Hu
    Tong Liu
    Xiatong Hao
    Chenxi Lin
    [J]. The Journal of Supercomputing, 2022, 78 : 12686 - 12709
  • [7] Attention-based Conv-LSTM and Bi-LSTM networks for large-scale traffic speed prediction
    Hu, Xiaojian
    Liu, Tong
    Hao, Xiatong
    Lin, Chenxi
    [J]. JOURNAL OF SUPERCOMPUTING, 2022, 78 (10): : 12686 - 12709
  • [8] SAITS: Self-attention-based imputation for time series
    Du, Wenjie
    Cote, David
    Liu, Yan
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2023, 219
  • [9] Self-Attention-Based Multivariate Anomaly Detection for CPS Time Series Data with Adversarial Autoencoders
    Li, Qiwen
    Yan, Tijin
    Yuan, Huanhuan
    Xia, Yuanqing
    [J]. 2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022, : 4251 - 4256
  • [10] A dual-stage attention-based Bi-LSTM network for multivariate time series prediction
    Cheng, Qi
    Chen, Yixin
    Xiao, Yuteng
    Yin, Hongsheng
    Liu, Weidong
    [J]. JOURNAL OF SUPERCOMPUTING, 2022, 78 (14): : 16214 - 16235