A Two-Stage Attention-Based Hierarchical Transformer for Turbofan Engine Remaining Useful Life Prediction

Cited by: 1
Authors
Fan, Zhengyang [1 ]
Li, Wanru [1 ]
Chang, Kuo-Chu [1 ]
Affiliation
[1] George Mason Univ, Dept Syst Engn & Operat Res, Fairfax, VA 22030 USA
Keywords
two-stage attention; multiscale transformer; remaining useful life prediction; turbofan aircraft engine; prognostics
DOI
10.3390/s24030824
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704
Abstract
The accurate estimation of the remaining useful life (RUL) of aircraft engines is essential for ensuring safety and uninterrupted operations in the aviation industry. Numerous studies have leveraged the success of the attention-based Transformer architecture in sequence modeling tasks and applied it to RUL prediction, primarily using onboard sensor readings as input predictors. While various Transformer-based approaches have demonstrated improvements in RUL prediction, their exclusive focus on temporal attention over multivariate time-series sensor readings, without considering sensor-wise attention, raises concerns about potential inaccuracies in the predicted RUL. To address this concern, our paper proposes a novel solution in the form of a two-stage attention-based hierarchical Transformer (STAR) framework. This approach incorporates a two-stage attention mechanism that systematically addresses both temporal and sensor-wise attention. Furthermore, we enhance the STAR RUL prediction framework by integrating hierarchical encoder-decoder structures to capture valuable information across different time scales. Through extensive numerical experiments on the CMAPSS datasets, we demonstrate that our proposed STAR framework significantly outperforms the current state-of-the-art models for RUL prediction.
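The abstract describes a two-stage attention mechanism that attends first over time steps and then over sensor channels before feeding an RUL regression head. The PyTorch sketch below is a minimal illustration of that general idea only; the class name, layer choices, mean-pooling, concatenation-based fusion, and all dimensions are assumptions made here for clarity, not the authors' STAR implementation (the hierarchical multiscale encoder-decoder is not shown).

```python
# Minimal sketch of a two-stage (temporal + sensor-wise) attention block.
# All module names, dimensions, and the fusion step are illustrative assumptions.
import torch
import torch.nn as nn


class TwoStageAttention(nn.Module):
    def __init__(self, num_sensors: int, window_len: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Stage 1: attention across time steps (each token = one time step's sensor vector).
        self.temporal_proj = nn.Linear(num_sensors, d_model)
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Stage 2: attention across sensors (each token = one sensor's full time series).
        self.sensor_proj = nn.Linear(window_len, d_model)
        self.sensor_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Fuse the pooled temporal and sensor-wise representations into one feature vector.
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window_len, num_sensors) multivariate sensor window.
        t_tokens = self.temporal_proj(x)                    # (B, T, d_model)
        t_out, _ = self.temporal_attn(t_tokens, t_tokens, t_tokens)
        s_tokens = self.sensor_proj(x.transpose(1, 2))      # (B, S, d_model)
        s_out, _ = self.sensor_attn(s_tokens, s_tokens, s_tokens)
        pooled = torch.cat([t_out.mean(dim=1), s_out.mean(dim=1)], dim=-1)
        return self.fuse(pooled)                            # (B, d_model)


if __name__ == "__main__":
    # Example: a window of 30 cycles from 14 CMAPSS sensor channels (illustrative sizes).
    block = TwoStageAttention(num_sensors=14, window_len=30)
    features = block(torch.randn(8, 30, 14))
    rul = nn.Linear(64, 1)(features)  # downstream RUL regression head
    print(rul.shape)  # torch.Size([8, 1])
```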
Pages: 19
Related Papers
50 records in total
  • [1] Spatial correlation and temporal attention-based LSTM for remaining useful life prediction of turbofan engine
    Tian, Huixin
    Yang, Linzheng
    Ju, Bingtian
    [J]. MEASUREMENT, 2023, 214
  • [2] Spatial attention-based convolutional transformer for bearing remaining useful life prediction
    Chen, Chong
    Wang, Tao
    Liu, Ying
    Cheng, Lianglun
    Qin, Jian
    [J]. MEASUREMENT SCIENCE AND TECHNOLOGY, 2022, 33 (11)
  • [3] Prediction of remaining useful life of turbofan engine based on optimized model
    Liu, Yuefeng
    Zhang, Xiaoyan
    Guo, Wei
    Bian, Haodong
    He, Yingjie
    Liu, Zhen
    [J]. 2021 IEEE 20TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2021), 2021: 1473-1477
  • [4] A Spatio-Temporal Attention Mechanism Based Approach for Remaining Useful Life Prediction of Turbofan Engine
    Peng, Cheng
    Wu, Jiaqi
    Tang, Zhaohui
    Yuan, Xinpan
    Li, Changyun
    [J]. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [5] Remaining useful life prediction of turbofan engine based on Autoencoder-BLSTM
    Song, Ya
    Xia, Tangbin
    Zheng, Yu
    Zhuo, Pengcheng
    Pan, Ershun
    [J]. Jisuanji Jicheng Zhizao Xitong/Computer Integrated Manufacturing Systems, CIMS, 2019, 25 (07): 1611-1619
  • [6] An Attention-Based Method for Remaining Useful Life Prediction of Rotating Machinery
    Deng, Yaohua
    Guo, Chengwang
    Zhang, Zilin
    Zou, Linfeng
    Liu, Xiali
    Lin, Shengyu
    [J]. APPLIED SCIENCES-BASEL, 2023, 13 (04)
  • [7] Remaining useful life prediction of turbofan engine based on similarity in multiple time scales
    Xu, Yu-Hui
    Shu, Jun-Qing
    Song, Ya
    Zheng, Yu
    Xia, Tang-Bin
    [J]. Zhejiang Daxue Xuebao (Gongxue Ban)/Journal of Zhejiang University (Engineering Science), 2021, 55 (10): 1937-1947
  • [8] Distributed Attention-Based Temporal Convolutional Network for Remaining Useful Life Prediction
    Song, Yan
    Gao, Shengyao
    Li, Yibin
    Jia, Lei
    Li, Qiqiang
    Pang, Fuzhen
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2021, 8 (12): 9594-9602
  • [9] Attention-based Gate Recurrent Unit for remaining useful life prediction in prognostics
    Lin, Ruiguan
    Wang, Huawei
    Xiong, Minglan
    Hou, Zhaoguo
    Che, Changchang
    [J]. APPLIED SOFT COMPUTING, 2023, 143
  • [10] Attention-based sequence to sequence model for machine remaining useful life prediction
    Ragab, Mohamed
    Chen, Zhenghua
    Wu, Min
    Kwoh, Chee-Keong
    Yan, Ruqiang
    Li, Xiaoli
    [J]. NEUROCOMPUTING, 2021, 466: 58-68