Temporal self-attention-based Conv-LSTM network for multivariate time series prediction

Cited by: 44
Authors
Fu, En [1 ]
Zhang, Yinong [2 ]
Yang, Fan [3 ]
Wang, Shuying [2 ]
Affiliations
[1] Beijing Union Univ, Beijing Key Lab Informat Serv Engn, Beijing 100101, Peoples R China
[2] Beijing Union Univ, Coll Urban Rail Transit & Logist, Beijing 100101, Peoples R China
[3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Keywords
Self-attention mechanism; Long short-term memory; Multivariate time series; Prediction;
DOI
10.1016/j.neucom.2022.06.014
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Time series play an important role in many fields, such as industrial control, automated monitoring, and weather forecasting. Because real-world problems often involve more than one variable, and the variables are related to each other, the multivariate time series (MTS) is introduced. Accurately predicting MTS from historical observations remains very challenging. Therefore, a new time series prediction model is proposed, based on a temporal self-attention mechanism, a convolutional neural network, and long short-term memory (Conv-LSTM). When the standard attention mechanism for time series is combined with a recurrent neural network (RNN), it depends heavily on the hidden state of the RNN. In particular, at the first time step an initial hidden state (typically 0) must be introduced artificially to compute that step's attention weight, which adds noise to the attention-weight calculation. To address this problem and increase the flexibility of the attention layer, a new self-attention mechanism, called temporal self-attention, is designed to extract the temporal dependence of the MTS. In this mechanism, a long short-term memory (LSTM) network is adopted as the sequence encoder to compute the query, key, and value, capturing a more complete temporal dependence than standard self-attention. Owing to the flexibility of this structure, the DA-Conv-LSTM model, a state-of-the-art attention-based method for MTS prediction, is improved. The improved model is compared with six baseline models on multiple datasets (SML2010 and NASDAQ100) and applied to satellite state prediction (our private dataset). Experiments demonstrate the effectiveness of the temporal self-attention, and the improved model achieves the best short-term prediction performance. (c) 2022 Elsevier B.V. All rights reserved.
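The core idea in the abstract, using an encoder's hidden states to supply the query, key, and value for scaled dot-product attention over time steps, can be sketched as follows. This is an illustrative NumPy sketch, not the authors' code: the matrix `H` stands in for the LSTM encoder's hidden states, and the function names, shapes, and weight initialization are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def temporal_self_attention(H, Wq, Wk, Wv):
    """Scaled dot-product self-attention over encoder states.

    H: (T, d) hidden states of a sequence encoder (here, a stand-in
       for LSTM outputs); Wq, Wk, Wv: (d, d) projection matrices.
    Returns the (T, d) context and the (T, T) temporal weight matrix.
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d), axis=-1)  # each row sums to 1
    return A @ V, A

T, d = 8, 16  # sequence length and hidden size (illustrative values)
H = rng.standard_normal((T, d))  # stand-in for LSTM hidden states
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
context, A = temporal_self_attention(H, Wq, Wk, Wv)
```

Because every query/key/value is derived from the full encoded sequence, no artificial initial hidden state enters the weight computation at the first step, which is the motivation the abstract gives for this design.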
Pages: 162 - 173
Page count: 12
Related papers
(50 in total)
  • [21] Dual-branch cross-dimensional self-attention-based imputation model for multivariate time series
    Fang, Le
    Xiang, Wei
    Zhou, Yuan
    Fang, Juan
    Chi, Lianhua
    Ge, Zongyuan
    KNOWLEDGE-BASED SYSTEMS, 2023, 279
  • [22] Hydrological Time Series Prediction Model Based on Attention-LSTM Neural Network
    Li, Yiran
    Yang, Juan
    PROCEEDINGS OF THE 2019 2ND INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND MACHINE INTELLIGENCE (MLMI 2019), 2019, : 21 - 25
  • [23] Temporal Tensor Transformation Network for Multivariate Time Series Prediction
    Ong, Yuya Jeremy
    Qiao, Mu
    Jadav, Divyesh
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 1594 - 1603
  • [24] Multi-scale temporal features extraction based graph convolutional network with attention for multivariate time series prediction
    Chen, Yawen
    Ding, Fengqian
    Zhai, Linbo
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 200
  • [25] Deep Learning with Spatial Attention-Based CONV-LSTM for SOC Estimation of Lithium-Ion Batteries
    Tian, Huixin
    Chen, Jianhua
    PROCESSES, 2022, 10 (11)
  • [26] A Multivariate Temporal Convolutional Attention Network for Time-Series Forecasting
    Wan, Renzhuo
    Tian, Chengde
    Zhang, Wei
    Deng, Wendi
    Yang, Fan
    ELECTRONICS, 2022, 11 (10)
  • [27] Hierarchical multimodal self-attention-based graph neural network for DTI prediction
    Bian, Jilong
    Lu, Hao
    Dong, Guanghui
    Wang, Guohua
    BRIEFINGS IN BIOINFORMATICS, 2024, 25 (04)
  • [28] Parallel spatio-temporal attention-based TCN for multivariate time series prediction
    Fan, Jin
    Zhang, Ke
    Huang, Yipan
    Zhu, Yifei
    Chen, Baiping
    NEURAL COMPUTING &amp; APPLICATIONS, 2023, 35 (18): 13109 - 13118
  • [30] EA-LSTM: Evolutionary attention-based LSTM for time series prediction
    Li, Youru
    Zhu, Zhenfeng
    Kong, Deqiang
    Han, Hua
    Zhao, Yao
    KNOWLEDGE-BASED SYSTEMS, 2019, 181