Remaining useful life prediction for multi-sensor mechanical equipment based on self-attention mechanism network incorporating spatio-temporal convolution

Cited by: 0
Authors
Yang, Xu [1 ,2 ]
Tang, Lin [1 ]
Huang, Jian [1 ,2 ]
Affiliations
[1] Univ Sci & Technol Beijing, Sch Automat & Elect Engn, Key Lab Knowledge Automat Ind Proc, Minist Educ, 30 Xueyuan Rd, Beijing 100083, Peoples R China
[2] Univ Sci & Technol Beijing, Shunde Innovat Sch, Beijing, Peoples R China
Keywords
Multi-sensor mechanical equipment; self-attention mechanism network; remaining useful life; graph convolutional network; dilated convolutional network; NEURAL-NETWORKS;
DOI
10.1177/09596518241269642
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
Motivated by the limitations of spatial feature extraction in graph learning methods for multi-sensor mechanical equipment, this paper proposes a spatio-temporal self-attention mechanism network (STCAN) that integrates spatial relationships and time-series information to predict the remaining useful life (RUL). Firstly, a graph convolutional network (GCN) is applied to extract spatial correlation features and is fused with the self-attention mechanism network to obtain both global and local spatial features. Subsequently, a dilated convolutional network (DCN) is integrated into the self-attention mechanism network to extract global and multi-step temporal features and to mitigate long-term dependency issues. Finally, the extracted spatio-temporal features are passed through fully connected layers to predict the equipment's RUL. Experimental results demonstrate that STCAN outperforms several existing methods in RUL prediction.
Pages: 15
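As a rough illustration of the pipeline described in the abstract (a GCN plus self-attention spatial branch, a dilated-convolution plus self-attention temporal branch, and a fully connected regression head), the following is a minimal PyTorch sketch. All class and parameter names (GraphConvLayer, STCANSketch, hidden, heads), the layer sizes, the mean-pooling fusion, and the identity adjacency are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of an STCAN-style model; sizes and fusion are assumptions.
import torch
import torch.nn as nn


class GraphConvLayer(nn.Module):
    """Simple GCN layer: H' = ReLU(A_hat @ H @ W), A_hat a normalized adjacency."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (batch, num_sensors, in_dim), adj: (num_sensors, num_sensors)
        return torch.relu(self.linear(adj @ x))


class STCANSketch(nn.Module):
    def __init__(self, num_sensors, window, hidden=64, heads=4):
        super().__init__()
        # Spatial branch: GCN (local) fused with self-attention (global) over sensors
        self.gcn = GraphConvLayer(window, hidden)
        self.spatial_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Temporal branch: dilated 1-D convolutions (multi-step) + self-attention (global)
        self.dcn = nn.Sequential(
            nn.Conv1d(num_sensors, hidden, kernel_size=3, dilation=1, padding=1),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, dilation=2, padding=2),
            nn.ReLU(),
        )
        self.temporal_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Fully connected regression head for RUL
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x, adj):
        # x: (batch, num_sensors, window) multi-sensor time windows
        spatial = self.gcn(x, adj)                                  # local spatial features
        spatial, _ = self.spatial_attn(spatial, spatial, spatial)   # global spatial features
        temporal = self.dcn(x).transpose(1, 2)                      # multi-step temporal features
        temporal, _ = self.temporal_attn(temporal, temporal, temporal)
        fused = torch.cat([spatial.mean(dim=1), temporal.mean(dim=1)], dim=-1)
        return self.head(fused).squeeze(-1)                         # predicted RUL


# Usage example with random data (e.g. 14 sensors, 30-step window)
model = STCANSketch(num_sensors=14, window=30)
x = torch.randn(8, 14, 30)
adj = torch.eye(14)  # placeholder adjacency; a real sensor graph would be used in practice
print(model(x, adj).shape)  # torch.Size([8])
```

In practice the normalized adjacency would be derived from the sensor graph (for example, from inter-sensor correlations), and the network would be trained with a regression loss such as MSE against ground-truth RUL labels.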