Foreformer: an enhanced transformer-based framework for multivariate time series forecasting

Cited by: 14
Authors
Yang, Ye [1 ]
Lu, Jiangang [1 ,2 ]
Affiliations
[1] Zhejiang Univ, Coll Control Sci & Engn, State Key Lab Ind Control Technol, Hangzhou 310027, Peoples R China
[2] Zhejiang Lab, Hangzhou 311121, Peoples R China
Keywords
Multivariate time series forecasting; Attention mechanism; Deep learning; Multi-resolution; Static covariate; Transformer; CONVOLUTIONAL NETWORKS;
DOI
10.1007/s10489-022-04100-3
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multivariate time series forecasting (MTSF) has been studied extensively for years, with ubiquitous applications in finance, traffic, the environment, and beyond. Recent investigations have demonstrated the potential of the Transformer to improve forecasting performance. The Transformer, however, has limitations that prevent it from being applied directly to MTSF: insufficient extraction of temporal patterns at different time scales, extraction of irrelevant information in self-attention, and no targeted processing of static covariates. Motivated by the above, an enhanced Transformer-based framework for MTSF, named Foreformer, is proposed with three distinctive characteristics: (i) a multi-temporal-resolution module that deeply captures temporal patterns at different scales, (ii) an explicit sparse attention mechanism that forces the model to prioritize the most contributive components, and (iii) a static-covariate processing module for nonlinear processing of static covariates. Extensive experiments on three real-world datasets demonstrate that Foreformer outperforms existing methodologies, making it a reliable approach for MTSF tasks.
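The abstract's "explicit sparse attention" refers to restricting each query to its most contributive keys. As a rough illustration only, the sketch below implements the generic top-k sparse attention idea in NumPy (scores below each query's k-th largest are masked out before the softmax); the function name, shapes, and the choice of top-k masking are assumptions for illustration, not the authors' exact formulation in Foreformer.

```python
import numpy as np

def topk_sparse_attention(Q, K, V, k):
    """Generic top-k sparse attention sketch (not Foreformer's exact form).

    Each query attends only to its k highest-scoring keys; all other
    scores are masked to -inf so they receive zero softmax weight."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                      # (n_q, n_k) scaled dot products
    # k-th largest score per query row, used as the keep/mask threshold
    kth = np.sort(scores, axis=-1)[:, -k][:, None]
    masked = np.where(scores >= kth, scores, -np.inf)  # drop non-top-k entries
    # numerically stable softmax over the surviving scores
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))   # 4 queries, model dim 8
K = rng.standard_normal((6, 8))   # 6 keys
V = rng.standard_normal((6, 8))
out, w = topk_sparse_attention(Q, K, V, k=2)
# each query row ends up with exactly 2 nonzero attention weights
```

The masking step is what makes the sparsity "explicit": irrelevant key positions contribute exactly zero to the output, rather than a small but nonzero softmax weight as in dense attention.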
Pages: 12521-12540
Page count: 20
Related papers
50 records total
  • [41] How Features Benefit: Parallel Series Embedding for Multivariate Time Series Forecasting with Transformer
    Feng, Xuande
    Lyu, Zonglin
    2022 IEEE 34TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, ICTAI, 2022, : 967 - 975
  • [42] AGCNT: Adaptive Graph Convolutional Network for Transformer-based Long Sequence Time-Series Forecasting
    Su, Hongyang
    Wang, Xiaolong
    Qin, Yang
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 3439 - 3442
  • [43] Memory-based Transformer with shorter window and longer horizon for multivariate time series forecasting
    Liu, Yang
    Wang, Zheng
    Yu, Xinyang
    Chen, Xin
    Sun, Meijun
    PATTERN RECOGNITION LETTERS, 2022, 160 : 26 - 33
  • [44] TraM: Enhancing User Sleep Prediction with Transformer-based Multivariate Time Series Modeling and Machine Learning Ensembles
    Kim, Jinjae
    Ma, Minjeong
    Choi, Eunjee
    Cho, Keunhee
    Lee, Changwoo
arXiv preprint
  • [45] Forecasting multivariate time series
    Athanasopoulos, George
    Vahid, Farshid
    INTERNATIONAL JOURNAL OF FORECASTING, 2015, 31 (03) : 680 - 681
  • [46] Multi-Scale Transformer Pyramid Networks for Multivariate Time Series Forecasting
    Zhang, Yifan
    Wu, Rui
    Dascalu, Sergiu M.
    Harris, Frederick C.
    IEEE Access, 2024, 12 : 14731 - 14741
  • [47] Heterogeneous Graph Transformer Auto-Encoder for multivariate time series forecasting
    Ye, Hongjiang
    Sun, Ying
    Gao, Yu
    Xu, Feiyi
    Qi, Jin
    COMPUTERS & ELECTRICAL ENGINEERING, 2025, 122
  • [48] Spatial-Temporal Convolutional Transformer Network for Multivariate Time Series Forecasting
    Huang, Lei
    Mao, Feng
    Zhang, Kai
    Li, Zhiheng
    SENSORS, 2022, 22 (03)
  • [50] Considering Nonstationary within Multivariate Time Series with Variational Hierarchical Transformer for Forecasting
    Wang, Muyao
    Chen, Wenchao
    Chen, Bo
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 14, 2024, : 15563 - 15570