Self-attention-based time-variant neural networks for multi-step time series forecasting

Cited by: 0
Authors
Changxia Gao
Ning Zhang
Youru Li
Feng Bian
Huaiyu Wan
Institutions
[1] Beijing Jiaotong University, School of Computer and Information Technology
[2] China Engineering Research Center of Network Management Technology for High Speed Railway of MOE, College of Information Science and Technology
[3] Beijing Normal University
[4] Beijing Key Laboratory of Traffic Data Analysis and Mining
[5] Beijing Key Laboratory of Advanced Information Science and Network Technology
Source
Neural Computing and Applications
Keywords
Multi-step time series forecasting; Self-attention; Time variant; Recent changes in data; Different scales changes in data
DOI
Not available
CLC Number
Subject Classification Code
Abstract
Time series forecasting is ubiquitous in scientific and industrial domains. Powered by recurrent, convolutional, and self-attention mechanisms, deep learning has shown high efficacy in time series forecasting. However, existing forecasting methods suffer from several limitations: recurrent neural networks are hampered by the vanishing-gradient problem, convolutional neural networks require more parameters, and self-attention struggles to capture local dependencies. Moreover, all of these methods assume time invariance (stationarity), because they share parameters by repeating a fixed architecture with fixed weights over time or space. To address these issues, this paper proposes a novel time-variant framework named Self-Attention-based Time-Variant Neural Networks (SATVNN). Its time-variant structure captures the dynamic changes of time series at different scales more accurately, and its self-attention blocks better capture the dynamic changes of recent data with the help of a Gaussian distribution, a Laplace distribution, and a novel Cauchy distribution, respectively. SATVNN clearly outperforms classical time series prediction methods and state-of-the-art deep learning models on a number of widely used real-world datasets.
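The abstract names three distance priors (Gaussian, Laplace, Cauchy) that help the self-attention blocks focus on recent changes, but gives no implementation details. The general idea of biasing self-attention toward nearby time steps can be sketched as follows; the function names, the additive log-prior form, and the exact parameterization are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def locality_bias(n, kind="gaussian", scale=4.0):
    """Log-prior over pairwise time distances |i - j|, added to attention scores.

    'gaussian', 'laplace', and 'cauchy' mirror the three priors named in the
    abstract; the parameterization here is an assumption for illustration.
    """
    d = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
    if kind == "gaussian":
        return -(d ** 2) / (2 * scale ** 2)
    if kind == "laplace":
        return -d / scale
    if kind == "cauchy":
        return -np.log1p((d / scale) ** 2)
    raise ValueError(kind)

def biased_self_attention(x, kind="gaussian"):
    """Single-head self-attention with a distance-based locality prior."""
    n, dim = x.shape
    rng = np.random.default_rng(0)
    Wq, Wk, Wv = (rng.standard_normal((dim, dim)) / np.sqrt(dim) for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Scaled dot-product scores plus the distance-based log-prior.
    scores = q @ k.T / np.sqrt(dim) + locality_bias(n, kind)
    # Numerically stable softmax over each row.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Toy univariate window embedded into 8 dimensions.
x = np.sin(np.linspace(0, 6, 32))[:, None] * np.ones((1, 8))
out, w = biased_self_attention(x, kind="cauchy")
print(out.shape, w.shape)  # (32, 8) (32, 32)
```

One reason a Cauchy prior is interesting here: its heavy tail decays much more slowly than the Gaussian's, so under this additive-bias formulation distant time steps retain some attention weight instead of being suppressed almost entirely.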
Pages: 8737 - 8754
Number of pages: 17
Related Papers
50 items in total
  • [1] Self-attention-based time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Bian, Feng
    Wan, Huaiyu
    [J]. NEURAL COMPUTING & APPLICATIONS, 2022, 34 (11): 8737 - 8754
  • [2] Multi-scale adaptive attention-based time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Lin, Yan
    Wan, Huaiyu
    [J]. APPLIED INTELLIGENCE, 2023, 53 (23): 28974 - 28993
  • [3] Adversarial self-attentive time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Lin, Yan
    Wan, Huaiyu
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2023, 231
  • [4] Robustness of LSTM neural networks for multi-step forecasting of chaotic time series
    Sangiorgio, Matteo
    Dercole, Fabio
    [J]. CHAOS SOLITONS & FRACTALS, 2020, 139
  • [5] A multi-step forecasting method of time series
    Zhou, JB
    Wang, YK
    Yang, GY
    Zhao, YL
    [J]. 13TH CONFERENCE ON PROBABILITY AND STATISTICS IN THE ATMOSPHERIC SCIENCES, 1996: 361 - 362
  • [6] SAITS: Self-attention-based imputation for time series
    Du, Wenjie
    Cote, David
    Liu, Yan
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2023, 219
  • [7] Multi-step forecasting of multivariate time series using multi-attention collaborative network
    He, Xiaoyu
    Shi, Suixiang
    Geng, Xiulin
    Yu, Jie
    Xu, Lingyu
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2023, 211
  • [8] Attention-Based Models for Multivariate Time Series Forecasting: Multi-step Solar Irradiation Prediction
    Sakib, Sadman
    Mahadi, Mahin K.
    Abir, Samiur R.
    Moon, Al-Muzadded
    Shafiullah, Ahmad
    Ali, Sanjida
    Faisal, Fahim
    Nishat, Mirza M.
    [J]. HELIYON, 2024, 10 (06)
  • [9] Multi-step Learning Rule for Recurrent Neural Models: An Application to Time Series Forecasting
    Galván, Inés M.
    Isasi, Pedro
    [J]. NEURAL PROCESSING LETTERS, 2001, 13: 115 - 133