DTAFORMER: Directional Time Attention Transformer For Long-Term Series Forecasting

Cited by: 0
Authors
Chang, Jiang [1 ]
Yue, Luhui [1 ]
Liu, Qingshan [2 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Nanjing 210044, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Nanjing 210023, Peoples R China
Keywords
Time series forecasting; Directional time attention; Causal inference;
DOI
10.1007/978-981-97-8505-6_12
CLC number (Chinese Library Classification)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper introduces the Directional Time Attention Transformer (DTAformer) for long-term time series forecasting, addressing the inherent limitation of traditional Transformer-based models in capturing sequential order. By establishing a causal graph, we identify confounding relationships that lead time series models to capture spurious temporal-direction information. Directional Time Attention, the key component of the model, leverages the front-door adjustment to remove the confounder from the causal relationship, ensuring accurate modeling of temporal direction in time series. We further analyze the impact of different patching methods and loss functions on prediction performance. The model is evaluated on nine benchmark datasets, and the results demonstrate its superiority over state-of-the-art methods.
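For reference, the front-door adjustment mentioned in the abstract is the standard causal-inference identity below; the record does not specify how DTAformer instantiates it for attention, so this is only the generic formula. For a treatment X, an outcome Y, and an observed mediator M that carries the entire effect of X on Y, the causal effect is

\[
P\bigl(y \mid \mathrm{do}(x)\bigr) \;=\; \sum_{m} P(m \mid x) \sum_{x'} P(y \mid x', m)\, P(x'),
\]

which identifies the effect of X on Y even when X and Y share an unobserved confounder, by routing the effect through the observed mediator M.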
Pages: 162-180
Page count: 19