Relation between fault characteristic frequencies and local interpretability Shapley additive explanations for continuous machine health monitoring

Times Cited: 0
|
Authors
Yan, Tongtong [1 ]
Xing, Xueqi [1 ]
Xia, Tangbin [2 ,3 ]
Wang, Dong [2 ,3 ]
Affiliations
[1] Univ Western Ontario, Dept Mech & Mat Engn, London, ON, Canada
[2] Shanghai Jiao Tong Univ, State Key Lab Mech Syst & Vibrat, Shanghai, Peoples R China
[3] Shanghai Jiao Tong Univ, Sch Mech Engn, Dept Ind Engn & Management, Shanghai 200240, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Shapley additive explanations; Fault characteristic frequencies; Machine health monitoring; Health indicator; Signal filtering method; Convolutional neural network;
DOI
10.1016/j.engappai.2024.109046
Chinese Library Classification (CLC) number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Recently, Shapley additive explanations models have been extensively studied to enhance the explainability of artificial intelligence algorithms, but most studies simply use Shapley additive explanations to rank or measure the importance of different features. In this study, a novel methodology is proposed that studies the relation between fault characteristic frequencies and the Shapley values generated by local interpretability Shapley additive explanations for machine health monitoring. Firstly, a simulation model is introduced to generate vibration signals at different health conditions, and their spectral amplitudes obtained from the Fourier transform are used to investigate the relationship between fault characteristic frequencies and local interpretability Shapley values. Interestingly, it is found that Shapley values can be used to locate fault characteristic frequencies; moreover, most of them are negative in the normal stage and positive in the abnormal stage. Based on this finding and Shapley additive explanations, a health indicator construction methodology is proposed to continuously monitor incipient machine faults. Subsequently, an automatic signal filtering method is proposed to remove burrs and noise in Shapley values so that fault characteristic frequencies can be clearly revealed by Shapley values for physical fault diagnosis. Two run-to-failure cases are conducted to demonstrate the effectiveness of the proposed methodology, and its superiority is demonstrated by comparison with existing methods for health indicator construction and fault diagnosis, including sparsity parameters, Hjorth parameters, and the fast Kurtogram. Comparison results show that the proposed health indicator is more sensitive to the time of incipient fault initiation and that interpretable fault diagnosis based on Shapley values achieves robust performance.
This study first sheds light on the relationship between fault characteristic frequencies and Shapley values in the scenario of continuous machine health monitoring and guides practitioners toward Shapley additive explanations based incipient fault detection and diagnosis.
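The pipeline the abstract describes (FFT spectral amplitudes as features, per-frequency Shapley values that turn positive at fault frequencies in the abnormal stage, and a health indicator built from them) can be illustrated with a minimal, self-contained sketch. Everything here is an assumption for illustration, not the paper's implementation: a toy signal with a hypothetical 120 Hz fault tone, a linear scorer fitted by least squares in place of the paper's model, and the exact closed-form Shapley values for a linear model with independent features, phi_j = w_j (x_j - E[x_j]).

```python
import numpy as np

rng = np.random.default_rng(0)
fs = n = 2048                        # sampling rate (Hz) and window length
t = np.arange(n) / fs
freqs = np.fft.rfftfreq(n, d=1 / fs)  # 1 Hz per frequency bin here

def vibration(fault_amp):
    """Toy vibration window: Gaussian noise plus a tone at a hypothetical 120 Hz fault frequency."""
    return rng.normal(0.0, 0.5, n) + fault_amp * np.sin(2 * np.pi * 120.0 * t)

# Spectral amplitudes from the Fourier transform, one row per monitoring window:
# 20 normal windows (no fault tone) followed by 20 abnormal windows.
X = np.array([np.abs(np.fft.rfft(vibration(a))) for a in [0.0] * 20 + [2.0] * 20])
y = np.array([0.0] * 20 + [1.0] * 20)  # 0 = normal stage, 1 = abnormal stage

# Linear health scorer fitted by least squares (a stand-in for the paper's model).
mu = X.mean(axis=0)
w, *_ = np.linalg.lstsq(X - mu, y - y.mean(), rcond=None)

def shapley_linear(x):
    # Exact Shapley values for a linear model with independent features:
    # phi_j = w_j * (x_j - E[x_j]); they sum to f(x) minus the mean prediction.
    return w * (x - mu)

phi_abnormal = shapley_linear(X[-1])       # a faulty window
phi_normal = shapley_linear(X[0])          # a healthy window
fault_bin = int(np.argmax(phi_abnormal))   # frequency bin singled out by SHAP
health_indicator = phi_abnormal[phi_abnormal > 0].sum()  # sum of positive Shapley values
print(f"SHAP locates {freqs[fault_bin]:.0f} Hz; health indicator = {health_indicator:.3f}")
```

Under these assumptions the largest Shapley value lands on the fault-frequency bin, and that bin's Shapley value flips sign between the healthy and faulty windows, mirroring the negative-in-normal / positive-in-abnormal behavior the abstract reports; the paper's automatic filtering step would additionally smooth the burrs in the raw Shapley spectrum before diagnosis.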
Pages: 20