Cascade Feature Selection and ELM for automatic fault diagnosis of the Tennessee Eastman process

Cited by: 19
Authors
Boldt, Francisco de Assis [1 ]
Rauber, Thomas W. [1 ]
Varejao, Flavio M. [1 ]
Affiliations
[1] Univ Fed Espirito Santo, Dept Informat, Vitoria, Brazil
Keywords
Fault diagnosis; Feature selection; Extreme Learning Machine; Tennessee Eastman process; Algorithms
DOI
10.1016/j.neucom.2017.02.025
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This work presents the concept of Cascade Feature Selection for combining feature selection methods. Fast but weak methods, such as ranking, are placed at the top of the cascade to reduce the dimensionality of the initial feature set, so that strong but computationally demanding methods, placed at the bottom of the cascade, have to deal with fewer features. Three cascade combinations are tested with the Extreme Learning Machine as the underlying classification architecture. The Tennessee Eastman chemical process simulation software and one high-dimensional data set are used as sources of benchmark data. Experimental results suggest that the cascade arrangement can produce smaller final feature subsets, in less time, with higher classification performance than feature selection based on a Genetic Algorithm. Many works in the literature have proposed mixed methods with specific combination strategies; the main contribution of this work is a concept able to combine any existing method under a single strategy. Provided that the Cascade Feature Selection requirements are fulfilled, the combinations may reduce the time needed to select features or increase the classification performance of the classifiers trained with the selected features. (C) 2017 Elsevier B.V. All rights reserved.
Pages: 238-248
Number of pages: 11
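
As a rough illustration of the cascade idea described in the abstract (not the authors' implementation), the sketch below assumes scikit-learn-style components: a fast univariate ranking filter (SelectKBest) first prunes the feature set, a slower wrapper-style search (SequentialFeatureSelector) then runs only on the survivors, and a minimal single-hidden-layer Extreme Learning Machine serves as the underlying classifier. The class name SimpleELM, the hidden-layer size, and the k / n_features_to_select values are arbitrary illustrative choices.

```python
# A minimal sketch (not the paper's code): a cheap ranking filter prunes the
# feature set, then a costlier wrapper-style selector searches only the
# surviving features, with a small Extreme Learning Machine as the classifier.
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.feature_selection import SelectKBest, SequentialFeatureSelector, f_classif
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler


class SimpleELM(BaseEstimator, ClassifierMixin):
    """Single-hidden-layer ELM: random hidden weights, least-squares output layer."""

    def __init__(self, n_hidden=100, random_state=0):
        self.n_hidden = n_hidden
        self.random_state = random_state

    def _hidden(self, X):
        # Sigmoid activations of the random hidden layer.
        return 1.0 / (1.0 + np.exp(-(X @ self.W_ + self.b_)))

    def fit(self, X, y):
        rng = np.random.default_rng(self.random_state)
        X = np.asarray(X, dtype=float)
        self.classes_, y_idx = np.unique(y, return_inverse=True)
        self.W_ = rng.normal(size=(X.shape[1], self.n_hidden))  # random input weights
        self.b_ = rng.normal(size=self.n_hidden)                # random biases
        T = np.eye(len(self.classes_))[y_idx]                   # one-hot targets
        self.beta_ = np.linalg.pinv(self._hidden(X)) @ T        # closed-form output weights
        return self

    def predict(self, X):
        scores = self._hidden(np.asarray(X, dtype=float)) @ self.beta_
        return self.classes_[np.argmax(scores, axis=1)]


# Cascade: a fast ANOVA ranking keeps, say, 20 features (weak but cheap stage);
# a greedy forward search with the ELM then picks a small subset from those
# survivors (strong but expensive stage); the ELM is finally trained on them.
cascade = Pipeline([
    ("scale", StandardScaler()),
    ("rank", SelectKBest(f_classif, k=20)),
    ("wrap", SequentialFeatureSelector(SimpleELM(n_hidden=200),
                                       n_features_to_select=8, cv=3)),
    ("clf", SimpleELM(n_hidden=200)),
])
# Usage (hypothetical data): cascade.fit(X_train, y_train); cascade.score(X_test, y_test)
```

The design choice the sketch tries to convey is the one the abstract argues for: the cheap ranking stage shrinks the search space, so the expensive wrapper stage evaluates far fewer candidate subsets, which is where the reduction in selection time comes from.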