Nonlinear feature transforms using maximum mutual information

Cited by: 23
Authors
Torkkola, K [1 ]
Affiliation
[1] Motorola Labs, Tempe, AZ 85284 USA
Keywords
DOI
10.1109/IJCNN.2001.938809
CLC (Chinese Library Classification) number
TP18 [artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Finding the right features is an essential part of a pattern recognition system. This can be accomplished either by selection or by a transform from a larger number of "raw" features. In this work we learn nonlinear, dimension-reducing discriminative transforms that are implemented as neural networks, either as radial basis function networks or as multilayer perceptrons. As the criterion, we use the joint mutual information (MI) between the class labels of the training data and the transformed features. Our measure of MI makes use of Rényi entropy as formulated by Principe et al. The resulting low-dimensional features enable a classifier to operate with less computation and memory without compromising accuracy.
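For orientation, the optimization described in the abstract can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: it uses the Euclidean-distance form of quadratic mutual information from Principe's information-theoretic learning framework, estimated with Parzen windows (Gaussian kernels) over the transformed samples and maximized by gradient ascent through a small multilayer perceptron. All names (gaussian_gram, qmi_ed, mlp), layer sizes, the synthetic data, and the kernel width sigma are placeholder assumptions.

```python
# Hedged sketch, not the paper's code: names, layer sizes, data, and
# hyperparameters below are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F

def gaussian_gram(y, sigma):
    """Pairwise Gaussian kernel values G(y_i - y_j; 2*sigma^2 I).
    The doubled variance comes from convolving two Parzen kernels of width sigma."""
    d2 = (y.unsqueeze(1) - y.unsqueeze(0)).pow(2).sum(dim=-1)  # squared Euclidean distances
    return torch.exp(-d2 / (4.0 * sigma ** 2))

def qmi_ed(y, labels, sigma=1.0):
    """Euclidean-distance quadratic MI between transformed features y and class labels:
    I = V_in + V_all - 2 * V_btw, each term a sum of pairwise 'information potentials'."""
    n = y.shape[0]
    G = gaussian_gram(y, sigma)
    onehot = F.one_hot(labels).float()            # (n, n_classes)
    same_class = onehot @ onehot.t()              # 1 where two samples share a class
    priors = onehot.mean(dim=0)                   # class priors N_c / N
    v_in = (G * same_class).sum() / n**2                       # within-class pairs
    v_all = priors.pow(2).sum() * G.sum() / n**2                # all pairs, weighted by squared priors
    v_btw = ((onehot @ priors) * G.sum(dim=1)).sum() / n**2     # row sums weighted by each sample's prior
    return v_in + v_all - 2.0 * v_btw

# Nonlinear dimension-reducing transform: 20 "raw" features -> 2 discriminative features.
mlp = nn.Sequential(nn.Linear(20, 32), nn.Tanh(), nn.Linear(32, 2))
opt = torch.optim.Adam(mlp.parameters(), lr=1e-2)

x = torch.randn(200, 20)            # placeholder raw features
c = torch.randint(0, 3, (200,))     # placeholder labels for 3 classes

for step in range(200):
    opt.zero_grad()
    loss = -qmi_ed(mlp(x), c, sigma=1.0)   # ascend the MI estimate
    loss.backward()
    opt.step()
```

A radial basis function network could be dropped in for the perceptron without changing the objective; the kernel width sigma strongly affects the estimate and would need tuning on real data.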
Pages: 2756-2761
Number of pages: 6