Data-Dimensionality Reduction Using Information-Theoretic Stepwise Feature Selector

Cited: 0
|
Authors
Joshi, Alok A. [1 ]
Meckl, Peter [2 ]
King, Galen [2 ]
Jennings, Kristofer [3 ]
Affiliations
[1] Cummins Inc, Columbus, IN 47201 USA
[2] Purdue Univ, Sch Mech Engn, Ray W Herrick Labs, W Lafayette, IN 47907 USA
[3] Purdue Univ, Coll Sci, Dept Stat, W Lafayette, IN 47907 USA
Keywords
MUTUAL INFORMATION; MODEL;
DOI
10.1115/1.3023112
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
A novel information-theoretic stepwise feature selector (ITSFS) is designed to reduce the dimension of diesel engine data. These data consist of 43 sensor measurements acquired from diesel engines that are either in a healthy state or in one of seven different fault states. Using ITSFS, the minimum number of sensors from a pool of 43 sensors is selected so that the eight states of the engine can be classified with reasonable accuracy. Various classifiers are trained and tested for fault classification accuracy using the field data before and after dimension reduction by ITSFS. The process of dimension reduction and classification is repeated using other existing dimension reduction techniques, such as simulated annealing and regression subset selection. The classification accuracies from these techniques are compared with those obtained on data reduced by the proposed feature selector.
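The record does not give the ITSFS algorithm itself, but the abstract and keywords indicate a stepwise (greedy forward) selector driven by mutual information between sensor features and the engine-state label. As a rough illustration only, the sketch below implements a generic mutual-information-based forward selector on discrete data; the function names (`mutual_information`, `stepwise_select`) and the simple marginal-MI selection criterion are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete arrays."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))  # joint probability estimate
            if pxy > 0:
                px = np.mean(x == xv)             # marginal of x
                py = np.mean(y == yv)             # marginal of y
                mi += pxy * np.log(pxy / (px * py))
    return mi

def stepwise_select(X, y, k):
    """Greedy forward selection: at each step, add the remaining feature
    whose mutual information with the class label y is largest."""
    remaining = list(range(X.shape[1]))
    selected = []
    for _ in range(k):
        best = max(remaining, key=lambda j: mutual_information(X[:, j], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

In the paper's setting, `X` would hold the 43 discretized sensor measurements and `y` the eight engine states (healthy plus seven faults); `stepwise_select(X, y, k)` would then return the indices of the `k` most informative sensors. A full ITSFS would likely also use a stopping rule and account for redundancy among already-selected features, which this marginal-MI sketch omits.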
Pages: 1 - 5
Page count: 5
Related Papers
50 records total
  • [1] An Information-theoretic approach to dimensionality reduction in data science
    Mainali, Sambriddhi
    Garzon, Max
    Venugopal, Deepak
    Jana, Kalidas
    Yang, Ching-Chi
    Kumar, Nirman
    Bowman, Dale
    Deng, Lih-Yuan
    [J]. INTERNATIONAL JOURNAL OF DATA SCIENCE AND ANALYTICS, 2021, 12 (03) : 185 - 203
  • [3] Feature extraction using information-theoretic learning
    Hild, Kenneth E., II
    Erdogmus, Deniz
    Torkkola, Kari
    Principe, Jose C.
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2006, 28 (09) : 1385 - 1392
  • [4] Information-Theoretic Feature Selection in Microarray Data Using Variable Complementarity
    Meyer, Patrick Emmanuel
    Schretter, Colas
    Bontempi, Gianluca
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2008, 2 (03) : 261 - 274
  • [5] Stepwise optimal feature selection for data dimensionality reduction
    Qin, Lifeng
    He, Dongjian
    Long, Yan
    [J]. JOURNAL OF COMPUTATIONAL INFORMATION SYSTEMS, 2015, 11 (05) : 1647 - 1656
  • [6] Dimensionality reduction and information-theoretic divergence between sets of LADAR images
    Gray, David M.
    Principe, Jose C.
    [J]. AUTOMATIC TARGET RECOGNITION XVIII, 2008, 6967
  • [7] Information-theoretic feature selection for functional data classification
    Gomez-Verdejo, Vanessa
    Verleysen, Michel
    Fleury, Jerome
    [J]. NEUROCOMPUTING, 2009, 72 (16-18) : 3580 - 3589
  • [8] Data poisoning against information-theoretic feature selection
    Liu, Heng
    Ditzler, Gregory
    [J]. INFORMATION SCIENCES, 2021, 573 : 396 - 411
  • [9] Distributed Information Gain Theoretic Feature Selector using Spark
    Prasad, Bakshi Rohit
    Bendale, Unmesh Kishor
    Agarwal, Sonali
    [J]. 2016 11TH INTERNATIONAL CONFERENCE ON INDUSTRIAL AND INFORMATION SYSTEMS (ICIIS), 2016, : 804 - 809
  • [10] The Equivalence of Information-Theoretic and Likelihood-Based Methods for Neural Dimensionality Reduction
    Williamson, Ross S.
    Sahani, Maneesh
    Pillow, Jonathan W.
    [J]. PLOS COMPUTATIONAL BIOLOGY, 2015, 11 (04)