DLformer: A Dynamic Length Transformer-Based Network for Efficient Feature Representation in Remaining Useful Life Prediction

Cited by: 25
Authors
Ren, Lei [1 ,2 ]
Wang, Haiteng [1 ]
Huang, Gao [3 ]
Affiliations
[1] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100191, Peoples R China
[2] Zhongguancun Lab, Beijing 100094, Peoples R China
[3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Funding
National Science Foundation (US);
Keywords
Transformers; Feature extraction; Maintenance engineering; Time series analysis; Computational modeling; Adaptation models; Task analysis; Adaptive inference; deep learning; feature representation; interpretability; remaining useful life (RUL) prediction; PROGNOSTICS;
DOI
10.1109/TNNLS.2023.3257038
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Representation learning-based remaining useful life (RUL) prediction plays a crucial role in improving the security and reducing the maintenance cost of complex systems. Despite their superior performance, the high computational cost of deep networks hinders deploying the models on low-compute platforms. A significant source of this cost is the computation required to represent long sequences. In contrast to most RUL prediction methods, which learn features from sequences of a fixed length, we consider that each time series has its own characteristics and that the sequence length should be adjusted adaptively. Our motivation is that an "easy" sample with representative characteristics can be correctly predicted even when only a short feature representation is provided, while "hard" samples need the complete feature representation. Therefore, we focus on sequence length and propose a dynamic length transformer (DLformer) that can adaptively learn sequence representations of different lengths. Then, a feature reuse mechanism is developed to utilize previously learned features and reduce redundant computation. Finally, to achieve dynamic feature representation, a particular confidence strategy is designed to calculate the confidence level of the prediction results. Regarding interpretability, the dynamic architecture can help humans understand which part of the model is activated. Experiments on multiple datasets show that DLformer can increase inference speed by up to 90%, with less than 5% degradation in model accuracy.
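The adaptive-inference idea in the abstract (short representations for "easy" samples, full-length ones for "hard" samples, gated by a confidence level) can be illustrated with a minimal sketch. This is not the authors' implementation: the sub-model cascade, the confidence measure, and the threshold below are placeholder assumptions standing in for DLformer's transformer sub-networks and its particular confidence strategy.

```python
def dynamic_length_predict(series, predictors, lengths, threshold=0.9):
    """Confidence-gated cascade over increasing sequence lengths.

    Illustrative sketch only: `predictors` is a list of callables
    (one per candidate length), each returning (rul_estimate,
    confidence). We try progressively longer suffixes of the time
    series and exit as soon as a predictor is confident enough,
    so "easy" samples avoid the cost of full-length representation.
    """
    rul = None
    for length, predict in zip(lengths, predictors):
        window = series[-length:]          # short suffix of the series
        rul, confidence = predict(window)  # sub-model on this window
        if confidence >= threshold:
            return rul, length             # early exit: "easy" sample
    return rul, lengths[-1]                # "hard" sample: full length


# Toy usage with stub predictors (hypothetical stand-ins for sub-models).
series = list(range(32))
confident_short = lambda w: (100.0, 0.95)  # confident on 8 steps
full_model = lambda w: (90.0, 0.99)        # fallback on all 32 steps
rul, used_len = dynamic_length_predict(
    series, [confident_short, full_model], [8, 32])
```

In this toy run the first sub-model is already above the threshold, so inference stops after the 8-step window; replacing it with a low-confidence stub would fall through to the full-length model, mirroring the paper's easy/hard distinction.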
Pages: 5942-5952
Page count: 11
Related Papers
50 records in total
  • [31] Compact Convolutional Transformer for Bearing Remaining Useful Life Prediction
    Jin, Zhongtian
    Chen, Chong
    Liu, Qingtao
    Syntetos, Aris
    Liu, Ying
    ADVANCES IN REMANUFACTURING, IWAR 2023, 2024, : 227 - 238
  • [32] Conditional variational transformer for bearing remaining useful life prediction
    Wei, Yupeng
    Wu, Dazhong
    ADVANCED ENGINEERING INFORMATICS, 2024, 59
  • [33] A hybrid network capturing multisource feature correlations for tool remaining useful life prediction
    Wu, Shihao
    Li, Yang
    Li, Weiguang
    Zhao, Xuezhi
    Luo, Chunlin
    Yu, Qiulin
    Lin, Shoujin
    INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2023, 125 (5-6): : 2815 - 2831
  • [34] Remaining useful life prediction of lithium-ion batteries based on wavelet denoising and transformer neural network
    Hu, Wangyang
    Zhao, Shaishai
    FRONTIERS IN ENERGY RESEARCH, 2022, 10
  • [35] Remaining Useful Life Prediction Method Based on Dual-Path Interaction Network with Multiscale Feature Fusion and Dynamic Weight Adaptation
    Lu, Zhe
    Li, Bing
    Fu, Changyu
    Wu, Junbao
    Xu, Liang
    Jia, Siye
    Zhang, Hao
    ACTUATORS, 2024, 13 (10)
  • [37] Learning continuous dynamic network representation with transformer-based temporal graph neural network
    Li, Yingji
    Wu, Yue
    Sun, Mingchen
    Yang, Bo
    Wang, Ying
    INFORMATION SCIENCES, 2023, 649
  • [38] Remaining Useful Life Prediction Based on Multi-Representation Domain Adaptation
    Lyu, Yi
    Zhang, Qichen
    Wen, Zhenfei
    Chen, Aiguo
    MATHEMATICS, 2022, 10 (24)
  • [39] Remaining useful life prediction based on intentional noise injection and feature reconstruction
    Xiao, Lei
    Tang, Junxuan
    Zhang, Xinghui
    Bechhoefer, Eric
    Ding, Siyi
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2021, 215
  • [40] Remaining useful life prediction of rolling bearings based on parallel feature extraction
    Li, Chao
    Zhai, Weimin
    Fu, Weiming
    Qin, Jiahu
    Kang, Yu
    ROBOTIC INTELLIGENCE AND AUTOMATION, 2025, 45 (01): : 90 - 105