Feature Selection based on the Kullback-Leibler Distance and its application on fault diagnosis

Cited by: 5
Authors
Xue, Yangtao
Zhang, Li [1 ]
Wang, Bangjun
Li, Fanzhang
Affiliations
[1] Soochow Univ, Dept Comp Sci & Technol, Suzhou, Peoples R China
Keywords
feature selection; fault diagnosis; Kullback-Leibler distance; generalized Gaussian density; divergence; models
DOI
10.1109/CBD.2019.00052
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
The core task of pattern recognition is to uncover the underlying patterns shared by data within the same class. Within-class data follow a similar distribution, while between-class data differ in various ways. Feature selection exploits these differences between classes to reduce the number of features used to train models, and many feature selection methods have been applied across a wide range of fields. This paper proposes a novel feature selection method based on the Kullback-Leibler distance, which measures the distance between the distributions of a feature under the two classes. For fault diagnosis, the proposed feature selection method is combined with a support vector machine to improve its performance. Experimental results validate the effectiveness and superiority of the proposed feature selection method, and the proposed diagnosis model increases the detection rate on a chemical process benchmark.
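The abstract describes ranking features by the KL distance between their two class-conditional distributions and feeding the selected features to an SVM. As a minimal sketch of that idea (not the paper's exact formulation, which involves generalized Gaussian density models), the snippet below fits a simple univariate Gaussian to each feature per class and scores each feature by the symmetrized Gaussian KL divergence; the function names are illustrative, not from the paper:

```python
import numpy as np

def gaussian_kl(mu_p, var_p, mu_q, var_q):
    # Closed-form KL divergence KL(N(mu_p, var_p) || N(mu_q, var_q))
    # for univariate Gaussians, applied element-wise over features.
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def kl_feature_scores(X, y):
    # Symmetrized KL distance between the two class-conditional
    # distributions of each feature, assuming a Gaussian fit per class.
    X0, X1 = X[y == 0], X[y == 1]
    mu0, var0 = X0.mean(axis=0), X0.var(axis=0) + 1e-12
    mu1, var1 = X1.mean(axis=0), X1.var(axis=0) + 1e-12
    return gaussian_kl(mu0, var0, mu1, var1) + gaussian_kl(mu1, var1, mu0, var0)

# Toy example: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.0, 0.0], 1.0, (100, 2)),
               rng.normal([5.0, 0.0], 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

scores = kl_feature_scores(X, y)
ranking = np.argsort(scores)[::-1]  # features ranked by KL distance, best first
```

The top-ranked features (e.g. `X[:, ranking[:k]]`) would then be passed to an SVM classifier for fault diagnosis, as the abstract outlines.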
Pages: 246-251 (6 pages)