Dynamic Long-Term Time-Series Forecasting via Meta Transformer Networks

Cited: 0
Authors
Ma'sum M.A. [1 ]
Sarkar R. [2 ]
Pratama M. [1 ]
Ramasamy S. [3 ]
Anavatti S. [2 ]
Liu L. [1 ]
Habibullah [1 ]
Kowalczyk R. [1 ]
Affiliations
[1] STEM, University of South Australia, Adelaide, SA
[2] SEIT, University of New South Wales, Canberra
[3] I2R, A*STAR, Singapore
Source
IEEE Transactions on Artificial Intelligence
Keywords
Artificial intelligence; Concept drift; Deep learning; Forecasting; Self-supervised learning; Task analysis; Time-series forecasting; Transformers
DOI: 10.1109/TAI.2024.3365775
Abstract
A reliable long-term time-series forecaster is in high demand in practice but faces many challenges, such as maintaining low computational and memory footprints and remaining robust in dynamic learning environments. This paper proposes Meta-Transformer Networks (MANTRA) to address dynamic long-term time-series forecasting tasks. MANTRA relies on the concept of fast and slow learners: a collection of fast learners captures different aspects of the data distribution while adapting quickly to changes, and a slow learner tailors suitable representations for the fast learners. Fast adaptation to dynamic environments is achieved with universal representation transformer layers, which produce task-adapted representations using a small number of parameters. Experiments on four datasets with different prediction lengths demonstrate the advantage of the approach, with at least 3% improvement over the baseline algorithms in both multivariate and univariate settings. The source code of MANTRA is publicly available at https://github.com/anwarmaxsum/MANTRA.
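To make the fast/slow-learner idea in the abstract concrete, here is a minimal PyTorch-style sketch. Everything in it is an illustrative assumption: the module names (SlowLearner, FastLearner, URTHead, MantraLikeForecaster), layer sizes, and the attention-style gating are hypothetical stand-ins, not the authors' implementation; the actual code lives in the repository linked above.

# A minimal sketch of the fast/slow-learner design described in the abstract.
# All module names and shapes are illustrative assumptions, not the authors'
# implementation; see https://github.com/anwarmaxsum/MANTRA for the real code.
import torch
import torch.nn as nn


class SlowLearner(nn.Module):
    """Shared encoder that tailors representations for the fast learners."""
    def __init__(self, n_features: int, d_model: int = 64):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )

    def forward(self, x):                  # x: (batch, seq_len, n_features)
        return self.encoder(self.proj(x))  # (batch, seq_len, d_model)


class FastLearner(nn.Module):
    """Lightweight head; each one captures a different aspect of the data."""
    def __init__(self, d_model: int, horizon: int, n_features: int):
        super().__init__()
        self.head = nn.Linear(d_model, horizon * n_features)
        self.horizon, self.n_features = horizon, n_features

    def forward(self, z):                  # z: (batch, seq_len, d_model)
        out = self.head(z.mean(dim=1))     # pool over time, then project
        return out.view(-1, self.horizon, self.n_features)


class URTHead(nn.Module):
    """Attention-style gate producing task-adapted weights over the fast
    learners, loosely mirroring the universal representation transformer."""
    def __init__(self, d_model: int, n_learners: int):
        super().__init__()
        self.score = nn.Linear(d_model, n_learners)

    def forward(self, z):                  # z: (batch, seq_len, d_model)
        return torch.softmax(self.score(z.mean(dim=1)), dim=-1)  # (batch, K)


class MantraLikeForecaster(nn.Module):
    def __init__(self, n_features=7, horizon=96, n_learners=3, d_model=64):
        super().__init__()
        self.slow = SlowLearner(n_features, d_model)
        self.fast = nn.ModuleList(
            FastLearner(d_model, horizon, n_features) for _ in range(n_learners)
        )
        self.gate = URTHead(d_model, n_learners)

    def forward(self, x):
        z = self.slow(x)
        preds = torch.stack([f(z) for f in self.fast], dim=1)  # (B, K, H, F)
        w = self.gate(z)                                       # (B, K)
        return (w[:, :, None, None] * preds).sum(dim=1)        # (B, H, F)


if __name__ == "__main__":
    model = MantraLikeForecaster()
    y = model(torch.randn(8, 336, 7))  # 336-step lookback, 7 variables
    print(y.shape)                     # torch.Size([8, 96, 7])

The design point the sketch tries to capture is that only the small fast-learner heads and the gate need to adapt when the distribution drifts, while the shared slow encoder provides stable representations.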
Pages: 1-11
Related Papers
(50 records; first 10 shown)
  • [1] Gaussian Process for Long-Term Time-Series Forecasting
    Yan, Weizhong
    Qiu, Hai
    Xue, Ya
    IJCNN 2009: International Joint Conference on Neural Networks, Vols 1-6, 2009: 1031-1038
  • [2] Representing Multiview Time-Series Graph Structures for Multivariate Long-Term Time-Series Forecasting
    Wang, Z.
    Fan, J.
    Wu, H.
    Sun, D.
    Wu, J.
    IEEE Transactions on Artificial Intelligence, 2024, 5(6): 2651-2662
  • [3] Robformer: A robust decomposition transformer for long-term time series forecasting
    Yu, Yang
    Ma, Ruizhe
    Ma, Zongmin
    Pattern Recognition, 2024, 153
  • [4] SageFormer: Series-Aware Framework for Long-Term Multivariate Time-Series Forecasting
    Zhang, Zhenwei
    Meng, Linghang
    Gu, Yuantao
    IEEE Internet of Things Journal, 2024, 11(10): 18435-18448
  • [5] PWDformer: Deformable transformer for long-term series forecasting
    Wang, Zheng
    Ran, Haowei
    Ren, Jinchang
    Sun, Meijun
    Pattern Recognition, 2024, 147
  • [6] CNformer: a convolutional transformer with decomposition for long-term multivariate time series forecasting
    Wang, Xingyu
    Liu, Hui
    Yang, Zhihan
    Du, Junzhao
    Dong, Xiyao
    Applied Intelligence, 2023, 53(17): 20191-20205
  • [7] Long-term forecasting of multivariate time series in industrial furnaces with dynamic Gaussian Bayesian networks
    Quesada, David
    Valverde, Gabriel
    Larranaga, Pedro
    Bielza, Concha
    Engineering Applications of Artificial Intelligence, 2021, 103
  • [8] CALTM: A Context-Aware Long-Term Time-Series Forecasting Model
    Jin, Canghong
    Chen, Jiapeng
    Wu, Shuyu
    Wu, Hao
    Wang, Shuoping
    Ying, Jing
    CMES-Computer Modeling in Engineering & Sciences, 2024, 139(1): 873-891
  • [9] Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
    Zhou, Haoyi
    Zhang, Shanghang
    Peng, Jieqi
    Zhang, Shuai
    Li, Jianxin
    Xiong, Hui
    Zhang, Wancai
    Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21), 2021, 35: 11106-11115