DLformer: A Dynamic Length Transformer-Based Network for Efficient Feature Representation in Remaining Useful Life Prediction

Cited by: 25
Authors
Ren, Lei [1 ,2 ]
Wang, Haiteng [1 ]
Huang, Gao [3 ]
Affiliations
[1] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100191, Peoples R China
[2] Zhongguancun Lab, Beijing 100094, Peoples R China
[3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Transformers; Feature extraction; Maintenance engineering; Time series analysis; Computational modeling; Adaptation models; Task analysis; Adaptive inference; deep learning; feature representation; interpretability; remaining useful life (RUL) prediction; PROGNOSTICS;
DOI
10.1109/TNNLS.2023.3257038
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Representation learning-based remaining useful life (RUL) prediction plays a crucial role in improving the security and reducing the maintenance cost of complex systems. Despite their superior performance, the high computational cost of deep networks hinders deploying these models on low-compute platforms. A significant source of this cost is the computation needed to represent long sequences. In contrast to most RUL prediction methods, which learn features from sequences of a fixed length, we argue that each time series has its own characteristics and that the sequence length should be adjusted adaptively. Our motivation is that an "easy" sample with representative characteristics can be predicted correctly even from a short feature representation, whereas "hard" samples require the complete feature representation. We therefore focus on sequence length and propose a dynamic length transformer (DLformer) that adaptively learns sequence representations of different lengths. A feature reuse mechanism is then developed to exploit previously learned features and reduce redundant computation. Finally, to achieve dynamic feature representation, a dedicated confidence strategy is designed to estimate the confidence level of the prediction results. Regarding interpretability, the dynamic architecture helps humans understand which parts of the model are activated. Experiments on multiple datasets show that DLformer increases inference speed by up to 90% with less than 5% degradation in model accuracy.
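To make the mechanism described in the abstract concrete, the sketch below shows one way a stage-wise, early-exit transformer of this kind could be organized in PyTorch: each stage encodes a longer prefix of the input window, previously encoded features are cached and reused rather than recomputed, and a per-stage exit head emits an RUL estimate together with a confidence score, so inference halts at the first stage whose confidence clears a threshold. This is a minimal sketch under stated assumptions, not the authors' DLformer implementation; the class name, the prefix schedule, the mean-pooling, and the sigmoid confidence head are all hypothetical choices.

```python
import torch
import torch.nn as nn


class DynamicLengthSketch(nn.Module):
    """Hypothetical early-exit transformer sketch (not the paper's code).

    Stage i extends the encoded prefix of the input window, reusing the
    features cached from earlier stages instead of recomputing them. Each
    stage has an exit head producing (RUL estimate, confidence logit);
    inference halts once the confidence clears a threshold, so "easy"
    samples exit after only a short feature representation.
    """

    def __init__(self, n_features, d_model=64, n_heads=4, n_stages=3):
        super().__init__()
        self.n_stages = n_stages
        self.embed = nn.Linear(n_features, d_model)
        self.stages = nn.ModuleList([
            nn.TransformerEncoderLayer(
                d_model, n_heads, dim_feedforward=2 * d_model,
                batch_first=True)
            for _ in range(n_stages)
        ])
        # One exit head per stage: pooled features -> (RUL, confidence logit).
        self.exit_heads = nn.ModuleList(
            [nn.Linear(d_model, 2) for _ in range(n_stages)])

    def forward(self, x, threshold=0.9):
        # x: (1, seq_len, n_features); batch size 1 for per-sample early exit.
        seq_len = x.size(1)
        h = self.embed(x)
        cached = h[:, :0]  # empty cache of reusable stage features
        for i, (stage, head) in enumerate(zip(self.stages, self.exit_heads)):
            end = seq_len * (i + 1) // self.n_stages  # prefix length, stage i
            new = stage(h[:, cached.size(1):end])  # encode only new tokens
            cached = torch.cat([cached, new], dim=1)  # feature reuse
            rul, conf_logit = head(cached.mean(dim=1)).unbind(dim=-1)
            confidence = torch.sigmoid(conf_logit)
            if confidence.item() > threshold or i == self.n_stages - 1:
                return rul, confidence, i  # i = last activated stage


if __name__ == "__main__":
    model = DynamicLengthSketch(n_features=14)  # e.g., 14 sensor channels
    with torch.no_grad():
        rul, conf, stage = model(torch.randn(1, 30, 14))
    print(f"RUL={rul.item():.2f}  confidence={conf.item():.2f}  "
          f"exited at stage {stage}")
```

The returned stage index is what supplies the interpretability claimed in the abstract: it records which part of the model was actually activated for a given sample.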
Pages: 5942 - 5952
Number of pages: 11
Related Papers
50 records in total
  • [1] A new dual-channel transformer-based network for remaining useful life prediction
    Yang, Kai
    Wei, Yuxuan
    Ma, Yubao
    Huang, Lehong
    Tang, Qiang
    Li, Zhiguo
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2025, 36 (02)
  • [2] Transformer-based hierarchical latent space VAE for interpretable remaining useful life prediction
    Jing, Tao
    Zheng, Pai
    Xia, Liqiao
    Liu, Tianyuan
    ADVANCED ENGINEERING INFORMATICS, 2022, 54
  • [3] Remaining Useful Life Prediction of Lithium-Ion Batteries by Using a Denoising Transformer-Based Neural Network
    Han, Yunlong
    Li, Conghui
    Zheng, Linfeng
    Lei, Gang
    Li, Li
    ENERGIES, 2023, 16 (17)
  • [4] Transformer-based novel framework for remaining useful life prediction of lubricant in operational rolling bearings
    Kim, Sunghyun
    Seo, Yun-Ho
    Park, Junhong
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2024, 251
  • [5] TransRUL: A Transformer-Based Multihead Attention Model for Enhanced Prediction of Battery Remaining Useful Life
    Saleem, Umar
    Liu, Wenjie
    Riaz, Saleem
    Li, Weilin
    Hussain, Ghulam Amjad
    Rashid, Zeeshan
    Arfeen, Zeeshan Ahmad
    ENERGIES, 2024, 17 (16)
  • [6] A Dual-Scale Transformer-Based Remaining Useful Life Prediction Model in Industrial Internet of Things
    Li, Junhuai
    Wang, Kan
    Hou, Xiangwang
    Lan, Dapeng
    Wu, Yunwen
    Wang, Huaijun
    Liu, Lei
    Mumtaz, Shahid
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (16): 26656 - 26667
  • [7] Remaining Useful Life Prediction Based on Improved Transformer Network for Pneumatic Control Valve
    Cheng, Yu
    Chen, JianLiang
    2024 3RD CONFERENCE ON FULLY ACTUATED SYSTEM THEORY AND APPLICATIONS, FASTA 2024, 2024: 638 - 643
  • [8] Dual Siamese transformer-encoder-based network for remaining useful life prediction
    Lin, Ching-Sheng
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (17): 25424 - 25449
  • [9] Remaining useful life prediction of rolling bearings based on time convolutional network and transformer in parallel
    Tang, Youfu
    Liu, Ruifeng
    Li, Chunhui
    Lei, Na
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (12)
  • [10] A novel vision transformer network for rolling bearing remaining useful life prediction
    Hu, Aijun
    Zhu, Yancheng
    Liu, Suixian
    Xing, Lei
    Xiang, Ling
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (02)