AGCNT: Adaptive Graph Convolutional Network for Transformer-based Long Sequence Time-Series Forecasting

Cited by: 2
Authors
Su, Hongyang [1]
Wang, Xiaolong [1]
Qin, Yang [1]
Affiliations
[1] Harbin Institute of Technology, Shenzhen, People's Republic of China
Keywords
long sequence time-series forecasting; transformer; adaptive graph convolution; probsparse graph self-attention
DOI
10.1145/3459637.3482054
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Long sequence time-series forecasting (LSTF) plays an important role in a variety of real-world applications, such as electricity forecasting, weather forecasting, and traffic flow forecasting. Transformer-based models have achieved outstanding results on LSTF tasks, reducing model complexity while maintaining stable prediction accuracy. Nevertheless, two issues still limit their performance on LSTF tasks: (i) the potential correlation between sequences is not considered; (ii) the inherent encoder-decoder structure is difficult to extend once it has been optimized for complexity. To address these two problems, we propose a transformer-based model, named AGCNT, which is efficient and captures the correlation between sequences in multivariate LSTF tasks without causing a memory bottleneck. Specifically, AGCNT has several characteristics: (i) a probsparse adaptive graph self-attention, which maps long sequences into a low-dimensional dense graph structure with adaptive graph generation and captures the relationships between sequences with an adaptive graph convolution; (ii) a stacked encoder with distilling probsparse graph self-attention, which integrates the graph attention mechanism, retains the dominant attention of the cascaded layers, and preserves the correlation between sparse queries from long sequences; (iii) a stacked decoder with generative inference, which generates all predicted values in one forward operation and thereby improves the inference speed of long-term predictions. Experimental results on four large-scale datasets demonstrate that AGCNT outperforms state-of-the-art baselines.
Pages: 3439-3442
Page count: 4
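
The abstract's first characteristic, adaptive graph generation followed by adaptive graph convolution, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the learnable source/target node embeddings, the softmax(ReLU(E1·E2ᵀ)) adjacency construction, and all names and shapes are assumptions in the style of earlier adaptive-graph forecasting work, intended only to show how a dense graph over series can be learned end-to-end and used to mix information across variables.

```python
# A minimal sketch (assumed, not the paper's code) of adaptive graph
# generation plus graph convolution over a set of time series.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraphConv(nn.Module):
    def __init__(self, num_series: int, d_model: int, d_embed: int = 16):
        super().__init__()
        # Learnable source/target embeddings, one row per series
        # (hypothetical parameterization; the paper's form may differ).
        self.e1 = nn.Parameter(torch.randn(num_series, d_embed))
        self.e2 = nn.Parameter(torch.randn(num_series, d_embed))
        self.proj = nn.Linear(d_model, d_model)

    def adjacency(self) -> torch.Tensor:
        # Dense adaptive adjacency: softmax(ReLU(E1 @ E2^T)) row-normalizes
        # the learned pairwise affinities between series.
        return F.softmax(F.relu(self.e1 @ self.e2.t()), dim=-1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_series, d_model). Propagate features along the
        # learned graph, then apply a shared linear transform.
        a = self.adjacency()
        return self.proj(torch.einsum("nm,bmd->bnd", a, x))


if __name__ == "__main__":
    layer = AdaptiveGraphConv(num_series=7, d_model=32)
    out = layer(torch.randn(4, 7, 32))  # e.g. a 7-variate input batch
    print(out.shape)  # torch.Size([4, 7, 32])
```

Row-normalizing the affinities with a softmax keeps the propagation step well scaled; in a model like AGCNT such a module would feed the encoder's probsparse graph self-attention, though the exact placement is not specified in the abstract.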