Feature Selection based on the Kullback-Leibler Distance and its application on fault diagnosis

Cited by: 5
Authors
Xue, Yangtao
Zhang, Li [1]
Wang, Bangjun
Li, Fanzhang
Affiliations
[1] Soochow Univ, Dept Comp Sci & Technol, Suzhou, Peoples R China
Keywords
feature selection; fault diagnosis; Kullback-Leibler distance; generalized Gaussian density; divergence; models
DOI
10.1109/CBD.2019.00052
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
The core task of pattern recognition is to uncover the inherent structure shared by data within the same class. Within-class data follow a similar distribution, whereas between-class data differ in various forms. Feature selection exploits these differences between classes to reduce the number of features used in training models, and a large number of feature selection methods have been widely applied in different fields. This paper proposes a novel feature selection method based on the Kullback-Leibler distance, which measures the distance between the distributions of two features. For fault diagnosis, the proposed feature selection method is combined with a support vector machine to improve its performance. Experimental results validate the effectiveness and superiority of the proposed feature selection method, and the proposed diagnosis model can increase the detection rate in chemical processes.
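As an illustration only, the following is a minimal sketch (Python with scikit-learn) of one way a KL-distance criterion can be paired with an SVM for fault diagnosis. The abstract does not spell out the exact formulation, so this sketch assumes a univariate Gaussian model per class for each feature and ranks features by the symmetric KL distance between the two class-conditional (normal vs. faulty) densities; the function names (gaussian_kl, kl_feature_scores, select_top_features) and the synthetic data are hypothetical, not the paper's method.

```python
# Hedged sketch: KL-distance-based feature ranking followed by an SVM.
# Assumption: each feature is modeled as a univariate Gaussian within each
# of two classes; features are scored by the symmetric KL distance between
# the two class-conditional densities. This is an illustrative choice, not
# necessarily the density model used in the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

def gaussian_kl(mu1, var1, mu2, var2):
    """KL(N(mu1, var1) || N(mu2, var2)) for univariate Gaussians."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def kl_feature_scores(X, y):
    """Symmetric KL distance between the two class-conditional
    distributions of every feature (binary labels assumed)."""
    c0, c1 = np.unique(y)
    X0, X1 = X[y == c0], X[y == c1]
    mu0, var0 = X0.mean(axis=0), X0.var(axis=0) + 1e-12
    mu1, var1 = X1.mean(axis=0), X1.var(axis=0) + 1e-12
    return gaussian_kl(mu0, var0, mu1, var1) + gaussian_kl(mu1, var1, mu0, var0)

def select_top_features(X, y, k):
    """Indices of the k features with the largest KL-distance score."""
    scores = kl_feature_scores(X, y)
    return np.argsort(scores)[::-1][:k]

# Usage: rank features on training data, then fit an SVM on the selected subset.
# The data below is synthetic placeholder data for demonstration only.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 52))      # e.g. 52 process variables
y_train = rng.integers(0, 2, size=200)    # 0 = normal, 1 = faulty
idx = select_top_features(X_train, y_train, k=20)
clf = Pipeline([("scale", StandardScaler()), ("svm", SVC(kernel="rbf"))])
clf.fit(X_train[:, idx], y_train)
```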
Pages: 246-251
Number of pages: 6