Dynamic Long-Term Time-Series Forecasting via Meta Transformer Networks

Cited by: 0
Authors
Ma'sum M.A. [1]; Sarkar R. [2]; Pratama M. [1]; Ramasamy S. [3]; Anavatti S. [2]; Liu L. [1]; Habibullah [1]; Kowalczyk R. [1]
Affiliations
[1] STEM, University of South Australia, Adelaide, SA, Australia
[2] SEIT, University of New South Wales, Canberra, Australia
[3] I2R, A*STAR, Singapore
Source
IEEE Transactions on Artificial Intelligence | 2024, Vol. 5, No. 8
Keywords
Artificial intelligence; Australia; Concept drifts; Deep learning; Forecasting; Self-supervised learning; Task analysis; Time-series forecasting; Transformers
DOI
10.1109/TAI.2024.3365775
Abstract
Reliable long-term time-series forecasters are in high demand in practice but face many challenges, such as the need for low computational and memory footprints as well as robustness to dynamic learning environments. This paper proposes Meta-Transformer Networks (MANTRA) to handle dynamic long-term time-series forecasting tasks. MANTRA relies on the concept of fast and slow learners, where a collection of fast learners learns different aspects of the data distribution while adapting quickly to changes, and a slow learner tailors suitable representations for the fast learners. Fast adaptation to dynamic environments is achieved using universal representation transformer layers, which produce task-adapted representations with a small number of parameters. Experiments on four datasets with different prediction lengths demonstrate the advantage of our approach, with at least 3% improvement over the baseline algorithms in both multivariate and univariate settings. The source code of MANTRA is publicly available at https://github.com/anwarmaxsum/MANTRA.
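To make the fast-and-slow-learner idea in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' implementation (see the linked MANTRA repository for that). All module names, layer sizes, and the simple averaging rule used to combine the fast learners' forecasts are illustrative assumptions.

```python
# Illustrative sketch of a fast/slow learner ensemble for forecasting.
# Assumptions: module names, sizes, and mean-combination are NOT from the paper.
import torch
import torch.nn as nn


class SlowLearner(nn.Module):
    """Shared encoder that tailors representations for the fast learners."""

    def __init__(self, n_features: int, d_model: int):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) -> (batch, seq_len, d_model)
        return self.encoder(self.proj(x))


class FastLearner(nn.Module):
    """Small head with few parameters, so it can adapt quickly to drift."""

    def __init__(self, d_model: int, horizon: int, n_features: int):
        super().__init__()
        self.head = nn.Linear(d_model, horizon * n_features)
        self.horizon, self.n_features = horizon, n_features

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # Pool the shared representation over time, then project to the horizon.
        out = self.head(z.mean(dim=1))
        return out.view(-1, self.horizon, self.n_features)


class FastSlowForecaster(nn.Module):
    """One slow learner feeds an ensemble of fast learners."""

    def __init__(self, n_features=7, d_model=64, horizon=96, n_fast=4):
        super().__init__()
        self.slow = SlowLearner(n_features, d_model)
        self.fast = nn.ModuleList(
            FastLearner(d_model, horizon, n_features) for _ in range(n_fast)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.slow(x)
        # Average the fast learners' forecasts; the paper's actual
        # aggregation rule may differ.
        return torch.stack([f(z) for f in self.fast]).mean(dim=0)


if __name__ == "__main__":
    model = FastSlowForecaster()
    x = torch.randn(8, 336, 7)   # (batch, look-back window, variables)
    print(model(x).shape)        # torch.Size([8, 96, 7])
```

In this sketch, capacity lives in the shared slow encoder while each fast learner is a single linear head, which mirrors the abstract's claim that task-adapted predictions can be produced with a small number of additional parameters.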
Pages: 1-11
Related Papers (50 in total)
  • [1] Multi-resolution Time-Series Transformer for Long-term Forecasting
    Zhang, Yitian; Ma, Liheng; Pal, Soumyasundar; Zhang, Yingxue; Coates, Mark
    International Conference on Artificial Intelligence and Statistics (AISTATS), 2024, Vol. 238
  • [2] Gaussian Process for Long-Term Time-Series Forecasting
    Yan, Weizhong; Qiu, Hai; Xue, Ya
    2009 International Joint Conference on Neural Networks (IJCNN), Vols. 1-6, 2009, pp. 1031-1038
  • [3] PETformer: Long-Term Time Series Forecasting via Placeholder-Enhanced Transformer
    Lin, Shengsheng; Lin, Weiwei; Wu, Wentai; Wang, Songbo; Wang, Yongxiang
    IEEE Transactions on Emerging Topics in Computational Intelligence, 2024
  • [4] Representing Multiview Time-Series Graph Structures for Multivariate Long-Term Time-Series Forecasting
    Wang, Z.; Fan, J.; Wu, H.; Sun, D.; Wu, J.
    IEEE Transactions on Artificial Intelligence, 2024, 5(6): 2651-2662
  • [5] SDformer: Transformer with Spectral Filter and Dynamic Attention for Multivariate Time Series Long-term Forecasting
    Zhou, Ziyu; Lyu, Gengyu; Huang, Yiming; Wang, Zihao; Jia, Ziyu; Yang, Zhen
    Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence (IJCAI 2024), 2024, pp. 5689-5697
  • [6] DTAformer: Directional Time Attention Transformer for Long-Term Series Forecasting
    Chang, Jiang; Yue, Luhui; Liu, Qingshan
    Pattern Recognition and Computer Vision (PRCV 2024), Part IV, 2025, Vol. 15034, pp. 162-180
  • [7] Long-term forecasting using transformer based on multiple time series
    Lee, Jaeyong; Kim, Hyun Jun; Lim, Changwon
    Korean Journal of Applied Statistics, 2024, 37(5): 583-598
  • [8] Robformer: A robust decomposition transformer for long-term time series forecasting
    Yu, Yang; Ma, Ruizhe; Ma, Zongmin
    Pattern Recognition, 2024, Vol. 153
  • [9] SageFormer: Series-Aware Framework for Long-Term Multivariate Time-Series Forecasting
    Zhang, Zhenwei; Meng, Linghang; Gu, Yuantao
    IEEE Internet of Things Journal, 2024, 11(10): 18435-18448
  • [10] PWDformer: Deformable transformer for long-term series forecasting
    Wang, Zheng; Ran, Haowei; Ren, Jinchang; Sun, Meijun
    Pattern Recognition, 2024, Vol. 147