Dyformer: A dynamic transformer-based architecture for multivariate time series classification

Cited by: 5
Authors
Yang, Chao [1 ]
Wang, Xianzhi [1 ]
Yao, Lina [2 ,3 ]
Long, Guodong [4 ]
Xu, Guandong [1 ]
Affiliations
[1] Univ Technol Sydney, Sch Comp Sci, Sydney, NSW 2007, Australia
[2] CSIRO Data61, Sydney, NSW 2015, Australia
[3] Univ New South Wales, Sch Comp Sci & Engn, Sydney, NSW 2052, Australia
[4] Univ Technol Sydney, Artificial Intelligence Inst, Sydney, NSW 2007, Australia
Keywords
Multivariate time series classification; Data mining; Deep learning;
DOI
10.1016/j.ins.2023.119881
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Multivariate time series classification is a crucial task with applications in broad areas such as finance, medicine, and engineering. Transformers are promising for time series classification, but as generic architectures they have limited capability to capture the distinctive characteristics of time series data and to adapt to diverse architectural requirements. This paper proposes Dyformer, a novel dynamic transformer-based architecture that addresses these limitations for multivariate time series classification. Dyformer applies hierarchical pooling to decompose a time series into subsequences with different frequency components, then employs Dyformer modules that adopt adaptive learning strategies for each frequency component through a dynamic architecture. We further introduce feature-map-wise attention mechanisms to capture multi-scale temporal dependencies and a joint loss function to facilitate model training. Extensive experiments on 30 benchmark datasets show that Dyformer consistently outperforms state-of-the-art methods and baseline approaches, and that it copes well with limited training samples when pre-trained.
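As a rough illustration of the hierarchical-pooling idea described in the abstract, the minimal PyTorch sketch below shows how a multivariate series could be decomposed into progressively coarser, lower-frequency views that separate transformer branches might then process. This is a hypothetical sketch, not the authors' implementation: the function name, the use of average pooling, and the number of levels are assumptions made for illustration.

```python
# Hypothetical sketch of hierarchical pooling for frequency decomposition
# (not the Dyformer authors' code; pooling operator and level count are assumed).
import torch
import torch.nn.functional as F

def hierarchical_pooling(x: torch.Tensor, levels: int = 3, kernel: int = 2):
    """x: (batch, channels, time). Returns `levels` views of the series,
    level 0 being the raw input and each later level a pooled, coarser view."""
    views = [x]
    for _ in range(levels - 1):
        # Average pooling halves the temporal resolution,
        # retaining the lower-frequency content of the series.
        x = F.avg_pool1d(x, kernel_size=kernel, stride=kernel)
        views.append(x)
    return views

# Toy usage: 8 samples, 6 variables, 128 time steps.
series = torch.randn(8, 6, 128)
for i, view in enumerate(hierarchical_pooling(series)):
    print(f"level {i}: {tuple(view.shape)}")  # (8,6,128), (8,6,64), (8,6,32)
```

Each pooled view could then be fed to its own transformer branch, which matches the abstract's description of applying different learning strategies to different frequency components.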
Pages: 18
Related Papers (50 in total)
  • [21] Transformer-based Bug/Feature Classification
    Ozturk, Ceyhun E.
    Yilmaz, Eyup Halit
    Koksal, Omer
    [J]. 2023 31ST SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE, SIU, 2023,
  • [22] EEG Classification with Transformer-Based Models
    Sun, Jiayao
    Xie, Jin
    Zhou, Huihui
    [J]. 2021 IEEE 3RD GLOBAL CONFERENCE ON LIFE SCIENCES AND TECHNOLOGIES (IEEE LIFETECH 2021), 2021, : 92 - 93
  • [23] Classification Based on Compressive Multivariate Time Series
    Utomo, Chandra
    Li, Xue
    Wang, Sen
    [J]. DATABASES THEORY AND APPLICATIONS, (ADC 2016), 2016, 9877 : 204 - 214
  • [24] Rankformer: Leveraging Rank Correlation for Transformer-based Time Series Forecasting
    Ouyang, Zuokun
    Jabloun, Meryem
    Ravier, Philippe
    [J]. 2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 85 - 89
  • [25] A Transformer-Based Model for Time Series Prediction of Remote Sensing Data
    Niu, Xintian
    Liu, Yige
    Ma, Ming
    [J]. ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT II, ICIC 2024, 2024, 14876 : 188 - 200
  • [26] Enhanced Linear and Vision Transformer-Based Architectures for Time Series Forecasting
    Alharthi, Musleh
    Mahmood, Ausif
    [J]. BIG DATA AND COGNITIVE COMPUTING, 2024, 8 (05)
  • [27] Multivariate time series anomaly detection with adversarial transformer architecture in the Internet of Things
    Zeng, Fanyu
    Chen, Mengdong
    Qian, Cheng
    Wang, Yanyang
    Zhou, Yijun
    Tang, Wenzhong
    [J]. FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2023, 144 : 244 - 255
  • [28] Multivariate time series classification with parametric derivative dynamic time warping
    Gorecki, Tomasz
    Luczak, Maciej
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2015, 42 (05) : 2305 - 2312
  • [29] A Transformer-based Neural Architecture Search Method
    Wang, Shang
    Tang, Huanrong
    Ouyang, Jianquan
    [J]. PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2023 COMPANION, 2023, : 691 - 694
  • [30] From anomaly detection to classification with graph attention and transformer for multivariate time series
    Wang, Chaoyang
    Liu, Guangyu
    [J]. ADVANCED ENGINEERING INFORMATICS, 2024, 60