A hybrid lightweight transformer architecture based on fuzzy attention prototypes for multivariate time series classification

Cited by: 0
Authors
Gu, Yan [1 ,2 ]
Jin, Feng [1 ,2 ]
Zhao, Jun [1 ,2 ]
Wang, Wei [1 ,2 ]
Affiliations
[1] Key Laboratory of Intelligent Control and Optimization for Industrial Equipment of Ministry of Education, Dalian University of Technology, Dalian 116024, China
[2] School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China
Funding
National Natural Science Foundation of China;
Keywords
Contrastive Learning;
DOI
10.1016/j.ins.2025.121942
Abstract
Multivariate time series classification has become a research hotspot. Existing methods mainly focus on the feature correlations of time series while ignoring data uncertainty and sample sparsity. To address these challenges, a hybrid lightweight Transformer architecture based on fuzzy attention prototypes, named FapFormer, is proposed. A convolutional spanning Vision Transformer module performs feature extraction and provides inductive bias, incorporating dynamic feature sampling to adaptively select key features and improve training efficiency. A progressive branching convolution (PBC) block and a convolutional self-attention (CSA) block are then introduced to extract local and global features, respectively. Furthermore, a feature complementation strategy enables the CSA block to specialize in global dependencies, overcoming the local receptive field limitation of the PBC block. Finally, a novel fuzzy attention prototype learning method is proposed to represent class prototypes under data uncertainty, using the distances between prototypes and low-dimensional embeddings for classification. Experiments on both the UEA benchmark datasets and a practical industrial dataset demonstrate that FapFormer outperforms several state-of-the-art methods, achieving higher accuracy and lower computational complexity even under data uncertainty and sample sparsity. © 2025 Elsevier Inc.
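The distance-based prototype classification step the abstract describes can be sketched in a generic, prototypical-network-style form. This is an illustrative sketch only, not the paper's actual fuzzy attention formulation; the function and variable names (`prototype_classify`, `embeddings`, `prototypes`) are assumptions for illustration:

```python
import numpy as np

def prototype_classify(embeddings, prototypes):
    """Assign each embedding to the class of its nearest prototype.

    embeddings: (n, d) low-dimensional sample embeddings
    prototypes: (c, d) one prototype vector per class
    Returns predicted class indices and softmax scores over
    negative squared distances (closer prototype -> higher score).
    """
    # Pairwise squared Euclidean distances, shape (n, c)
    d2 = ((embeddings[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    # Smaller distance means larger logit
    logits = -d2
    # Numerically stable softmax over classes
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)
    return probs.argmax(axis=1), probs
```

In FapFormer the prototypes are additionally learned via fuzzy attention to absorb data uncertainty; the sketch above only shows the shared final step of scoring samples by prototype distance.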
Related papers
50 records in total
  • [1] Dyformer: A dynamic transformer-based architecture for multivariate time series classification
    Yang, Chao
    Wang, Xianzhi
    Yao, Lina
    Long, Guodong
    Xu, Guandong
    INFORMATION SCIENCES, 2024, 656
  • [2] From anomaly detection to classification with graph attention and transformer for multivariate time series
    Wang, Chaoyang
    Liu, Guangyu
    ADVANCED ENGINEERING INFORMATICS, 2024, 60
  • [3] Spatial-temporal Attention Model Based on Transformer Architecture for Anomaly Detection in Multivariate Time Series Data
    Zeng, Lai
    Yang, Xiaomei
    Journal of Computers (Taiwan), 2024, 35 (03) : 193 - 207
  • [4] A hierarchical transformer-based network for multivariate time series classification
    Tang, Yingxia
    Wei, Yanxuan
    Li, Teng
    Zheng, Xiangwei
    Ji, Cun
    Information Systems, 2025, 132
  • [5] Multivariate Time Series Classification With An Attention-Based Multivariate Convolutional Neural Network
    Tripathi, Achyut Mani
    Baruah, Rashmi Dutta
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [6] Time Series Classification on Edge with Lightweight Attention Networks
    Mukhopadhyay, Shalini
    Dey, Swarnava
    Mukherjee, Arijit
    Pal, Arpan
    Ashwin, S.
    2024 IEEE INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING AND COMMUNICATIONS WORKSHOPS AND OTHER AFFILIATED EVENTS, PERCOM WORKSHOPS, 2024, : 487 - 492
  • [7] An Aggregated Convolutional Transformer Based on Slices and Channels for Multivariate Time Series Classification
    Wu, Yupeng
    Lian, Cheng
    Zeng, Zhigang
    Xu, Bingrong
    Su, Yixin
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2023, 7 (03): : 768 - 779
  • [8] TransDBC: Transformer for Multivariate Time-Series based Driver Behavior Classification
    Vyas, Jayant
    Bhardwaj, Nishit
    Bhumika
    Das, Debasis
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [9] GRAformer: A gated residual attention transformer for multivariate time series forecasting
    Yang, Chengcao
    Wang, Yutian
    Yang, Bing
    Chen, Jun
    NEUROCOMPUTING, 2024, 581
  • [10] Multivariate time-series classification of sleep patterns using a hybrid deep learning architecture
    Hong, Jeonghan
    Yoon, Junho
    2017 IEEE 19TH INTERNATIONAL CONFERENCE ON E-HEALTH NETWORKING, APPLICATIONS AND SERVICES (HEALTHCOM), 2017,