Kullback-Leibler Divergence for fault estimation and isolation: Application to Gamma distributed data

Cited: 32
Authors
Delpha, Claude [1 ]
Diallo, Demba [2 ]
Youssef, Abdulrahman
Affiliations
[1] Univ Paris Sud, CNRS, Cent Supelec, L2S, F-91192 Gif Sur Yvette, France
[2] Univ Paris 06, Univ Paris Sud, CNRS, Cent Supelec, Lab Genie Elect & Elect Paris GeePs, F-91192 Gif Sur Yvette, France
Keywords
Fault detection and diagnosis; Incipient fault estimation; Gamma distributed data; Kullback-Leibler Divergence; Principal Component Analysis; Rolling element bearings; Data reconciliation; Quantitative model; Data-driven; Diagnosis; Validation
DOI
10.1016/j.ymssp.2017.01.045
Chinese Library Classification (CLC)
TH [Machinery and Instrument Industry]
Discipline Code
0802
Abstract
In this paper, we develop a fault detection, isolation and estimation method based on a data-driven approach. Data-driven methods are effective for feature extraction and feature analysis using statistical techniques. In the proposed approach, Principal Component Analysis (PCA) is used to extract the features and to reduce the data dimension. The Kullback-Leibler Divergence (KLD) is then used to detect the fault occurrence by comparing the Probability Density Functions of the latent scores. To estimate the fault amplitude in the case of Gamma distributed data, we have developed an analytical model that links the KLD to the fault severity, including the environmental noise conditions. Within the PCA framework, the proposed model of the KLD has been analysed and compared to a value of the KLD estimated with the Monte-Carlo estimator. The results show that for incipient faults (<10%) under usual noise conditions (SNR > 40 dB), the fault amplitude estimate is accurate, with a relative error below 1%. The proposed approach is experimentally verified with vibration signals used for monitoring bearings in electrical machines. (C) 2017 Elsevier Ltd. All rights reserved.
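As a minimal illustration of the divergence computation the abstract describes (not the authors' exact fault model), the KLD between two Gamma distributions has a known closed form, which can be cross-checked against a Monte-Carlo estimate of E_p[log p(X) - log q(X)]. The Python sketch below assumes a shape-rate parameterization; the function names and the small shape-parameter shift standing in for an incipient fault are hypothetical choices for this demo.

    # Sketch only: closed-form KLD between two Gamma distributions
    # (shape a, rate b), checked against a Monte-Carlo estimate.
    # Parameter values are illustrative, not taken from the paper.
    import numpy as np
    from scipy.special import gammaln, digamma
    from scipy.stats import gamma

    def kld_gamma(a_p, b_p, a_q, b_q):
        """Closed-form KLD( Gamma(a_p, rate=b_p) || Gamma(a_q, rate=b_q) )."""
        return ((a_p - a_q) * digamma(a_p)
                - gammaln(a_p) + gammaln(a_q)
                + a_q * (np.log(b_p) - np.log(b_q))
                + a_p * (b_q - b_p) / b_p)

    def kld_monte_carlo(a_p, b_p, a_q, b_q, n=1_000_000, seed=0):
        """Monte-Carlo estimator: mean of log p(X) - log q(X), X ~ p."""
        rng = np.random.default_rng(seed)
        x = rng.gamma(shape=a_p, scale=1.0 / b_p, size=n)
        log_p = gamma.logpdf(x, a=a_p, scale=1.0 / b_p)
        log_q = gamma.logpdf(x, a=a_q, scale=1.0 / b_q)
        return np.mean(log_p - log_q)

    # Small shift in the shape parameter, mimicking an incipient fault:
    print(kld_gamma(2.0, 1.0, 2.1, 1.0))        # analytical value
    print(kld_monte_carlo(2.0, 1.0, 2.1, 1.0))  # should agree closely

The two printed values agreeing to several decimal places mirrors the abstract's comparison between the analytical KLD model and its Monte-Carlo estimate.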
Pages: 118-135 (18 pages)
Related Papers
(50 in total)
  • [21] Fault diagnosis of rotating machinery based on kernel density estimation and Kullback-Leibler divergence
    Zhang, Fan
    Liu, Yu
    Chen, Chujie
    Li, Yan-Feng
    Huang, Hong-Zhong
    Journal of Mechanical Science and Technology, 2014, 28(11): 4441-4454
  • [23] Kullback-Leibler Divergence Metric Learning
    Ji, Shuyi
    Zhang, Zizhao
    Ying, Shihui
    Wang, Liejun
    Zhao, Xibin
    Gao, Yue
    IEEE Transactions on Cybernetics, 2022, 52(4): 2047-2058
  • [24] Use of Kullback-Leibler divergence for forgetting
    Karny, Miroslav
    Andrysek, Josef
    International Journal of Adaptive Control and Signal Processing, 2009, 23(10): 961-975
  • [25] Kullback-Leibler divergence for evaluating bioequivalence
    Dragalin, V
    Fedorov, V
    Patterson, S
    Jones, B
    Statistics in Medicine, 2003, 22(6): 913-930
  • [26] Source Resolvability with Kullback-Leibler Divergence
    Nomura, Ryo
    2018 IEEE International Symposium on Information Theory (ISIT), 2018: 2042-2046
  • [27] Kullback-Leibler divergence: A quantile approach
    Sankaran, P. G.
    Sunoj, S. M.
    Nair, N. Unnikrishnan
    Statistics & Probability Letters, 2016, 111: 72-79
  • [28] Estimation of discrepancy of color qualia using Kullback-Leibler divergence
    Yamada, Miku
    Matsumoto, Miu
    Arakaki, Mina
    Hebishima, Hana
    Inage, Shinichi
    Biosystems, 2023, 232
  • [29] A decision cognizant Kullback-Leibler divergence
    Ponti, Moacir
    Kittler, Josef
    Riva, Mateus
    de Campos, Teofilo
    Zor, Cemre
    Pattern Recognition, 2017, 61: 470-478
  • [30] The Kullback-Leibler divergence and nonnegative matrices
    Boche, Holger
    Stanczak, Slawomir
    IEEE Transactions on Information Theory, 2006, 52(12): 5539-5545