Kullback-Leibler Divergence for fault estimation and isolation: Application to Gamma distributed data

Cited: 32
Authors
Delpha, Claude [1 ]
Diallo, Demba [2 ]
Youssef, Abdulrahman
Affiliations
[1] Univ Paris Sud, CNRS, Cent Supelec, L2S, F-91192 Gif Sur Yvette, France
[2] Univ Paris 06, Univ Paris Sud, CNRS, Cent Supelec, Lab Genie Elect & Elect Paris GeePs, F-91192 Gif Sur Yvette, France
Keywords
Fault detection and diagnosis; Incipient fault estimation; Gamma distributed data; Kullback-Leibler Divergence; Principal Component Analysis; PRINCIPAL COMPONENT ANALYSIS; ROLLING ELEMENT BEARINGS; DATA RECONCILIATION; QUANTITATIVE MODEL; DATA-DRIVEN; PART I; DIAGNOSIS; VALIDATION;
DOI
10.1016/j.ymssp.2017.01.045
CLC Classification Number
TH [Machinery and Instrument Industry];
Subject Classification Code
0802;
Abstract
In this paper, we develop a fault detection, isolation and estimation method based on a data-driven approach. Data-driven methods are effective for feature extraction and analysis using statistical techniques. In the proposed method, Principal Component Analysis (PCA) is used to extract the features and to reduce the data dimension. The Kullback-Leibler Divergence (KLD) is then used to detect a fault occurrence by comparing the Probability Density Functions of the latent scores. To estimate the fault amplitude in the case of Gamma distributed data, we have developed an analytical model that links the KLD to the fault severity, including the environmental noise conditions. Within the Principal Component Analysis framework, the proposed model of the KLD has been analysed and compared to an estimate of the KLD obtained with a Monte-Carlo estimator. The results show that for incipient faults (<10%) in usual noise conditions (SNR > 40 dB), the fault amplitude estimation is accurate, with a relative error of less than 1%. The proposed approach is experimentally validated with vibration signals used for monitoring bearings in electrical machines. (C) 2017 Elsevier Ltd. All rights reserved.
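For context, the divergence used in the abstract admits a standard closed form for Gamma distributed data (a textbook result; the paper's own analytical fault-severity model is not reproduced in this record). With P = Gamma(k_1, \theta_1) and Q = Gamma(k_2, \theta_2), where k is the shape, \theta the scale, and \psi the digamma function:

D_{KL}(P \,\|\, Q) = (k_1 - k_2)\,\psi(k_1) - \ln\Gamma(k_1) + \ln\Gamma(k_2) + k_2 \ln\frac{\theta_2}{\theta_1} + k_1\,\frac{\theta_1 - \theta_2}{\theta_2}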
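Below is a minimal sketch of the comparison the abstract describes, namely the closed-form KLD against a Monte-Carlo estimate for Gamma distributed latent scores. All parameter values and function names are illustrative assumptions, not the authors' code.

import numpy as np
from scipy.special import gammaln, digamma
from scipy.stats import gamma

def kld_gamma_analytical(k1, t1, k2, t2):
    # Closed-form KLD between Gamma(shape=k1, scale=t1) and Gamma(shape=k2, scale=t2).
    return ((k1 - k2) * digamma(k1)
            - gammaln(k1) + gammaln(k2)
            + k2 * np.log(t2 / t1)
            + k1 * (t1 - t2) / t2)

def kld_gamma_monte_carlo(k1, t1, k2, t2, n=200_000, seed=0):
    # Monte-Carlo estimator: mean of log(p/q) over samples drawn from p.
    rng = np.random.default_rng(seed)
    x = rng.gamma(shape=k1, scale=t1, size=n)
    return np.mean(gamma.logpdf(x, a=k1, scale=t1)
                   - gamma.logpdf(x, a=k2, scale=t2))

# Healthy vs. faulty score distributions (illustrative: a small scale shift
# mimicking an incipient fault on one latent score).
print(kld_gamma_analytical(2.0, 1.0, 2.0, 1.05))   # exact value
print(kld_gamma_monte_carlo(2.0, 1.0, 2.0, 1.05))  # agrees to within sampling noise

For such an incipient-fault setting the two values agree to within Monte-Carlo noise, which is the kind of consistency check the abstract reports before the analytical model is inverted to recover a fault amplitude from a measured KLD.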
Pages: 118-135
Number of pages: 18