GTformer: Graph-Based Temporal-Order-Aware Transformer for Long-Term Series Forecasting

Cited by: 2
Authors
Liang, Aobo [1 ]
Chai, Xiaolin [1 ]
Sun, Yan [1 ]
Guizani, Mohsen [2 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Natl Pilot Software Engn Sch, Beijing 100876, Peoples R China
[2] Mohamed Bin Zayed Univ Artificial Intelligence, Machine Learning Dept, Abu Dhabi, U Arab Emirates
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 19
Funding
National Natural Science Foundation of China
Keywords
Time series analysis; Transformers; Predictive models; Forecasting; Data models; Task analysis; Internet of Things; Interseries dependencies; long-term time series forecasting; multivariate time series (MTS); strict temporal order; transformer; INTERNET; NETWORK;
DOI
10.1109/JIOT.2024.3419768
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In Internet of Things (IoT) production environments, sensors of varying quality generate large amounts of multivariate time series (MTS) data. Long-term prediction of the time series generated by IoT devices provides longer foresight and enables resource scheduling or fault alarms to be executed in advance, improving the efficiency of system operation and ensuring system security. In recent years, deep learning models such as Transformers have achieved advanced performance on multivariate long-term time series forecasting (MLTSF) tasks. However, much previous research has either overlooked interseries dependencies or ignored the need to model the strict temporal order of MTS data. In this article, we introduce GTformer, a graph-based temporal-order-aware Transformer model. We propose an adaptive graph learning method designed specifically for MTS data to capture both uni-directional and bi-directional relations. In addition, we generate positional encodings sequentially to emphasize the strict temporal order of the time series. With these two components, our model better captures both the interseries and intraseries dependencies of MTS data. We conducted extensive experiments on eight real-world datasets, and the results show that our model achieves better predictions than state-of-the-art methods.
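The abstract mentions an adaptive graph learning component that captures both uni-directional and bi-directional interseries relations, but this record does not give GTformer's actual formulation. As a hedged illustration only, the sketch below shows one common adaptive-graph pattern from the MTS literature (an MTGNN-style antisymmetric score for uni-directional edges, plus a symmetric variant for bi-directional ones); all names and dimensions are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

num_series = 4   # number of variables in the MTS (illustrative)
emb_dim = 8      # embedding dimension per series (illustrative)

# Two learnable embedding tables, one per "role" of a series
# (source vs. target); here initialized randomly for the sketch.
E1 = rng.standard_normal((num_series, emb_dim))
E2 = rng.standard_normal((num_series, emb_dim))

def adaptive_adjacency(E1, E2):
    """MTGNN-style adaptive graph: the antisymmetric score
    E1 @ E2.T - E2 @ E1.T followed by ReLU keeps at most one
    direction of each pairwise relation (uni-directional)."""
    scores = np.tanh(E1 @ E2.T - E2 @ E1.T)
    return np.maximum(scores, 0.0)  # ReLU

A_uni = adaptive_adjacency(E1, E2)

# Antisymmetry of the score means A_uni[i, j] > 0 forces A_uni[j, i] == 0.
assert np.all(A_uni * A_uni.T == 0.0)

# A bi-directional variant could instead use a symmetric score.
A_bi = np.maximum(np.tanh(E1 @ E1.T), 0.0)
```

In a trainable model the embedding tables would be learned end to end with the forecasting loss, so the graph adapts to the data rather than being fixed a priori.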
Pages: 31467-31478
Page count: 12
Related Papers
50 records total
  • [22] A decoupled network with variable graph convolution and temporal external attention for long-term multivariate time series forecasting
    Liu, Yepeng
    Huang, Zhigen
    Zhang, Fan
    Zhang, Xiaofeng
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 271
  • [23] CNformer: a convolutional transformer with decomposition for long-term multivariate time series forecasting
    Wang, Xingyu
    Liu, Hui
    Yang, Zhihan
    Du, Junzhao
    Dong, Xiyao
    APPLIED INTELLIGENCE, 2023, 53 (17): 20191-20205
  • [24] Multi-resolution Time-Series Transformer for Long-term Forecasting
    Zhang, Yitian
    Ma, Liheng
    Pal, Soumyasundar
    Zhang, Yingxue
    Coates, Mark
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [25] SageFormer: Series-Aware Framework for Long-Term Multivariate Time-Series Forecasting
    Zhang, Zhenwei
    Meng, Linghang
    Gu, Yuantao
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (10): 18435-18448
  • [26] STPSformer: Spatial-Temporal ProbSparse Transformer for Long-Term Traffic Flow Forecasting
    Wang, Zhanquan
    Institute of Electrical and Electronics Engineers Inc.
  • [27] MSDformer: an autocorrelation transformer with multiscale decomposition for long-term multivariate time series forecasting
    Su, Guangyao
    Guan, Yepeng
    APPLIED INTELLIGENCE, 2025, 55 (02)
  • [28] Dynamic Long-Term Time-Series Forecasting via Meta Transformer Networks
    Ma'sum, M. A.
    Sarkar, R.
    Pratama, M.
    Ramasamy, S.
    Anavatti, S.
    Liu, L.
    Habibullah
    Kowalczyk, R.
    IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (08): 1-11
  • [29] PETformer: Long-Term Time Series Forecasting via Placeholder-Enhanced Transformer
    Lin, Shengsheng
    Lin, Weiwei
    Wu, Wentai
    Wang, Songbo
    Wang, Yongxiang
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024
  • [30] DBAFormer: A Double-Branch Attention Transformer for Long-Term Time Series Forecasting
    Huang, Ji
    Ma, Minbo
    Dai, Yongsheng
    Hu, Jie
    Du, Shengdong
    HUMAN-CENTRIC INTELLIGENT SYSTEMS, 2023, 3 (3): 263-274