An Adaptive Learning Time Series Forecasting Model Based on Decoder Framework

Times Cited: 0
Authors
Hao, Jianlong [1 ]
Sun, Qiwei [1 ]
Affiliations
[1] Shanxi Univ Finance & Econ, Sch Informat, Taiyuan 030006, Peoples R China
Keywords
time series forecasting; Transformer; decoder-only; concept drift; low-rank decomposition
DOI
10.3390/math13030490
Chinese Library Classification
O1 [Mathematics]
Discipline Code
0701; 070101
Abstract
Time series forecasting is a fundamental technique for analyzing dynamic changes in temporal data and predicting future trends across many domains. Effective modeling nevertheless faces challenges arising from complex factors such as accurately capturing relationships among temporally distant data points and accommodating rapid shifts in data distributions over time. While Transformer-based models have recently demonstrated remarkable capabilities in handling long-range dependencies, directly applying them to the evolving distributions of temporal data remains a challenging task. To tackle these issues, this paper presents an innovative sequence-to-sequence adaptive learning approach centered on a decoder framework for temporal modeling tasks. An end-to-end deep learning architecture built on a Transformer decoding framework is introduced, which adaptively discerns the interdependencies within temporal data. Experiments carried out on multiple datasets indicate that the decoder-based time series adaptive learning model achieves an overall reduction of 2.6% in MSE (Mean Squared Error) loss and 1.8% in MAE (Mean Absolute Error) loss compared with the most advanced Transformer-based time series forecasting model.
Pages: 10
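
The record gives only the high-level design: an end-to-end, decoder-only Transformer forecaster evaluated with MSE and MAE. As a rough illustration of that kind of architecture (not the authors' implementation; the layer sizes, learned positional embedding, univariate input, and forecasting head below are assumptions, and the paper's adaptive-learning and low-rank-decomposition components are omitted), a minimal PyTorch sketch follows:

```python
# Minimal sketch of a decoder-only Transformer forecaster.
# Illustrative only: hyperparameters and structure are assumptions,
# not the model described in the paper above.
import torch
import torch.nn as nn


class DecoderOnlyForecaster(nn.Module):
    """Causal (decoder-only) Transformer mapping a past window of a
    univariate series to a forecast of the next `horizon` steps."""

    def __init__(self, d_model=64, n_heads=4, n_layers=2,
                 horizon=24, max_len=512):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)            # embed each scalar step
        self.pos_emb = nn.Parameter(torch.zeros(1, max_len, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True, norm_first=True)
        # An encoder stack run with a causal mask behaves as a decoder-only model.
        self.backbone = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, horizon)            # forecast from last token

    def forward(self, x):                                  # x: (batch, seq_len, 1)
        h = self.input_proj(x) + self.pos_emb[:, : x.size(1)]
        mask = nn.Transformer.generate_square_subsequent_mask(x.size(1))
        h = self.backbone(h, mask=mask)
        return self.head(h[:, -1])                         # (batch, horizon)


if __name__ == "__main__":
    model = DecoderOnlyForecaster()
    window = torch.randn(8, 96, 1)                         # 8 series, 96 past steps
    forecast = model(window)                               # -> (8, 24)
    target = torch.randn(8, 24)
    mse = nn.functional.mse_loss(forecast, target)         # metrics reported in the paper
    mae = nn.functional.l1_loss(forecast, target)
    print(forecast.shape, mse.item(), mae.item())
```

The causal attention mask is what makes the stacked layers behave as a decoder-only model; the forecast is read off the final position of the input window and scored with the same MSE/MAE metrics the abstract reports.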