GTformer: Graph-Based Temporal-Order-Aware Transformer for Long-Term Series Forecasting

Cited by: 2
Authors
Liang, Aobo [1]
Chai, Xiaolin [1]
Sun, Yan [1]
Guizani, Mohsen [2]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Natl Pilot Software Engn Sch, Beijing 100876, Peoples R China
[2] Mohamed Bin Zayed Univ Artificial Intelligence, Machine Learning Dept, Abu Dhabi, U Arab Emirates
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, Issue 19
Funding
National Natural Science Foundation of China;
Keywords
Time series analysis; Transformers; Predictive models; Forecasting; Data models; Task analysis; Internet of Things; Interseries dependencies; long-term time series forecasting; multivariate time series (MTS); strict temporal order; transformer; INTERNET; NETWORK;
DOI
10.1109/JIOT.2024.3419768
CLC Classification Number
TP [Automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
In Internet of Things (IoT) production environments, sensors of varying quality generate large amounts of multivariate time series (MTS) data. Long-term prediction of the time series produced by IoT devices provides longer foresight and enables necessary resource scheduling or fault alarms to be carried out in advance, improving operational efficiency and ensuring system security. In recent years, deep learning models such as Transformers have achieved advanced performance on multivariate long-term time series forecasting (MLTSF) tasks. However, much previous research has either overlooked interseries dependencies or ignored the need to model the strict temporal order of MTS data. In this article, we introduce GTformer, a graph-based temporal-order-aware transformer model. We propose an adaptive graph learning method designed specifically for MTS data to capture both uni-directional and bi-directional relations. In addition, we generate positional encodings sequentially to emphasize the strict temporal order of time series. With these two components, our model captures the interseries and intraseries dependencies of MTS data more effectively. We conducted extensive experiments on eight real-world data sets, and the results show that our model achieves more accurate predictions than state-of-the-art methods.
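The abstract names GTformer's two components only at a high level, and the paper's exact formulation is not reproduced in this record. As a rough illustration only, the sketch below shows one plausible way such components could look in PyTorch: an adaptive graph learner that derives a directed (uni-directional) and a symmetrized (bi-directional) adjacency matrix from trainable node embeddings, and a recurrent positional encoding generated step by step so that strict temporal order is respected. All names (AdaptiveGraphLearner, SequentialPositionalEncoding, emb_dim, etc.) and design choices are hypothetical assumptions, not the authors' implementation.

```python
# Hypothetical sketch only: module names, shapes, and design choices are
# illustrative assumptions and are NOT taken from the GTformer paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraphLearner(nn.Module):
    """Learn inter-series relations among N series from trainable node embeddings.

    Two separate embedding tables keep the raw affinity asymmetric, yielding a
    uni-directional (directed) adjacency; symmetrizing it yields a bi-directional one.
    """

    def __init__(self, num_series: int, emb_dim: int = 16):
        super().__init__()
        self.src_emb = nn.Parameter(torch.randn(num_series, emb_dim))
        self.dst_emb = nn.Parameter(torch.randn(num_series, emb_dim))

    def forward(self):
        logits = self.src_emb @ self.dst_emb.T            # (N, N), asymmetric affinity
        directed = F.softmax(F.relu(logits), dim=-1)      # uni-directional relations
        sym_logits = 0.5 * (logits + logits.T)            # symmetrized affinity
        bidirectional = F.softmax(F.relu(sym_logits), dim=-1)
        return directed, bidirectional


class SequentialPositionalEncoding(nn.Module):
    """Generate positional information recurrently so that the encoding of step t
    depends on earlier steps, emphasizing strict temporal order (in contrast to a
    fixed, precomputed positional table)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        pe, _ = self.rnn(x)
        return x + pe


if __name__ == "__main__":
    graph = AdaptiveGraphLearner(num_series=7)            # e.g., 7 variables, ETT-style data
    a_dir, a_bi = graph()
    enc = SequentialPositionalEncoding(d_model=64)
    out = enc(torch.randn(2, 96, 64))                     # batch of 2, lookback of 96 steps
    print(a_dir.shape, a_bi.shape, out.shape)
```

In this sketch the two adjacency matrices would feed a graph-convolution or attention stage to mix information across series, while the order-aware encoding replaces a standard sinusoidal table; how GTformer actually combines them is described in the paper itself.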
Pages: 31467-31478
Number of pages: 12
Related Papers
50 records in total
  • [31] Temporal patterns decomposition and Legendre projection for long-term time series forecasting
    Liu, Jianxin
    Ma, Tinghuai
    Su, Yuming
    Rong, Huan
    Khalil, Alaa Abd El-Raouf Mohamed
    Wahab, Mohamed Magdy Abdel
    Osibo, Benjamin Kwapong
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (16): 23407-23441
  • [32] Temporal Chain Network With Intuitive Attention Mechanism for Long-Term Series Forecasting
    Zhang, Zhen
    Han, Yongming
    Ma, Bo
    Liu, Min
    Geng, Zhiqiang
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [33] Data Segmentation based Long-term Time Series Forecasting
    Bao, Yizhen
    Lu, Shiyu
    2024 6TH INTERNATIONAL CONFERENCE ON DATA-DRIVEN OPTIMIZATION OF COMPLEX SYSTEMS, DOCS 2024, 2024: 51-58
  • [34] TVC Former: A transformer-based long-term multivariate time series forecasting method using time-variable coupling correlation graph
    Liu, Zhenyu
    Feng, Yuan
    Liu, Hui
    Tang, Ruining
    Yang, Bo
    Zhang, Donghao
    Jia, Weiqiang
    Tan, Jianrong
    KNOWLEDGE-BASED SYSTEMS, 2025, 314
  • [35] CALTM: A Context-Aware Long-Term Time-Series Forecasting Model
    Jin, Canghong
    Chen, Jiapeng
    Wu, Shuyu
    Wu, Hao
    Wang, Shuoping
    Ying, Jing
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2024, 139 (01): 873-891
  • [36] Long-term Occupancy Analysis using Graph-Based Optimisation in Thermal Imagery
    Gade, Rikke
    Jorgensen, Anders
    Moeslund, Thomas B.
    2013 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2013: 3698-3705
  • [37] GLAD: Graph-Based Long-Term Attentive Dynamic Memory for Sequential Recommendation
    Pandey, Deepanshu
    Sarkar, Arindam
    Comar, Prakash Mandayam
    ADVANCES IN INFORMATION RETRIEVAL, ECIR 2024, PT III, 2024, 14610: 72-88
  • [38] Resformer: Combine quadratic linear transformation with efficient sparse Transformer for long-term series forecasting
    Chen, Gongguan
    Wang, Hua
    Liu, Yepeng
    Zhang, Mingli
    Zhang, Fan
    INTELLIGENT DATA ANALYSIS, 2023, 27 (06): 1557-1572
  • [39] SDformer: Transformer with Spectral Filter and Dynamic Attention for Multivariate Time Series Long-term Forecasting
    Zhou, Ziyu
    Lyu, Gengyu
    Huang, Yiming
    Wang, Zihao
    Jia, Ziyu
    Yang, Zhen
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024: 5689-5697
  • [40] Multi-scale convolution enhanced transformer for multivariate long-term time series forecasting
    Li, Ao
    Li, Ying
    Xu, Yunyang
    Li, Xuemei
    Zhang, Caiming
    NEURAL NETWORKS, 2024, 180