Adversarial self-attentive time-variant neural networks for multi-step time series forecasting

Citations: 6
Authors
Gao, Changxia [1 ]
Zhang, Ning [1 ]
Li, Youru [2 ]
Lin, Yan [3 ]
Wan, Huaiyu [3 ]
Affiliations
[1] Beijing Jiaotong Univ, China Engn Res Ctr, Sch Comp & Informat Technol, Network Management Technol High Speed Railway MOE, Beijing 100044, Peoples R China
[2] Beijing Jiaotong Univ, Sch Comp & Informat Technol, China & Beijing Key Lab Adv Informat Sci & Network, Beijing 100044, Peoples R China
[3] Beijing Jiaotong Univ, Sch Comp & Informat Technol, Beijing Key Lab Traff Data Anal & Min, Beijing 100044, Peoples R China
Keywords
Time series forecasting; Dynamic modeling; Short-term correlations; Long-term forecasting;
DOI
10.1016/j.eswa.2023.120722
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Accurate forecasting of time series mitigates the uncertainty of future outlooks and helps reduce errors in decision-making. Despite years of research, accurate time series forecasting still faces several challenges, including the difficulty of dynamic modeling, the problem of capturing short-term correlations, and the conundrum of long-term forecasting. This paper offers an Adversarial Truncated Cauchy Self-Attentive Time Variant Neural Network (ASATVN) for multi-step-ahead time series forecasting. Specifically, the proposed model builds on Generative Adversarial Networks, in which the generator is composed of a novel time-variant model. The time-variant model learns dynamic time-series changes through its time-variant architecture and employs a newly proposed Truncated Cauchy Self-Attention block to better capture local sequential dependencies. For the discriminator, two self-attentive discriminators are presented to regularize predictions with fidelity and continuity, which is beneficial for predicting sequences over longer time horizons. Our proposed ASATVN model outperforms state-of-the-art predictive models on eleven real-world benchmark datasets, demonstrating its effectiveness.
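The abstract's core idea of a "Truncated Cauchy Self-Attention" can be pictured as standard scaled dot-product self-attention whose weights are reweighted by a Cauchy kernel over relative positions and zeroed outside a local window, which biases the model toward short-term correlations. The paper's exact formulation is not reproduced here; the NumPy sketch below (function name, `gamma`, and `window` parameters are assumptions for illustration) shows one plausible instantiation of that combination.

```python
import numpy as np

def truncated_cauchy_self_attention(X, Wq, Wk, Wv, gamma=1.0, window=5):
    """Illustrative sketch (not the paper's exact block): scaled dot-product
    self-attention reweighted by a truncated Cauchy kernel over |i - j|."""
    T, _ = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    dk = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(dk)                       # (T, T) similarities
    A = np.exp(scores - scores.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)                   # standard softmax
    # Cauchy kernel over relative distance, zeroed beyond the local window.
    dist = np.abs(np.arange(T)[:, None] - np.arange(T)[None, :])
    kernel = 1.0 / (1.0 + (dist / gamma) ** 2)           # heavy-tailed decay
    kernel[dist > window] = 0.0                          # truncation step
    A = A * kernel
    A /= A.sum(axis=-1, keepdims=True)                   # renormalize rows
    return A @ V, A

rng = np.random.default_rng(0)
T, d = 12, 8
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out, A = truncated_cauchy_self_attention(X, Wq, Wk, Wv, gamma=2.0, window=3)
```

Under this sketch, each output step attends only to positions within `window` of itself, with nearer positions receiving heavier weight, matching the abstract's claim that the block captures local sequential dependencies.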
Pages: 14
Related Papers (50 total)
  • [1] Gao, Changxia; Zhang, Ning; Li, Youru; Bian, Feng; Wan, Huaiyu. Self-attention-based time-variant neural networks for multi-step time series forecasting. NEURAL COMPUTING & APPLICATIONS, 2022, 34(11): 8737-8754
  • [2] Gao, Changxia; Zhang, Ning; Li, Youru; Lin, Yan; Wan, Huaiyu. Multi-scale adaptive attention-based time-variant neural networks for multi-step time series forecasting. APPLIED INTELLIGENCE, 2023, 53(23): 28974-28993
  • [3] Sangiorgio, Matteo; Dercole, Fabio. Robustness of LSTM neural networks for multi-step forecasting of chaotic time series. CHAOS SOLITONS & FRACTALS, 2020, 139
  • [4] Zhou, JB; Wang, YK; Yang, GY; Zhao, YL. A multi-step forecasting method of time series. 13TH CONFERENCE ON PROBABILITY AND STATISTICS IN THE ATMOSPHERIC SCIENCES, 1996: 361-362
  • [5] Su, Yaxi; Cui, Chaoran; Qu, Hao. Self-Attentive Moving Average for Time Series Prediction. APPLIED SCIENCES-BASEL, 2022, 12(07)
  • [6] Zhao, Yi; Shen, Yanyan; Zhu, Yanmin; Yao, Junjie. Forecasting Wavelet Transformed Time Series with Attentive Neural Networks. 2018 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2018: 1452-1457
  • [7] Galván, IM; Isasi, P. Multi-step learning rule for recurrent neural models: An application to time series forecasting. NEURAL PROCESSING LETTERS, 2001, 13(02): 115-133