Stochastic mutual information gradient estimation for dimensionality reduction networks

Cited by: 0
Authors
Oezdenizci, Ozan [1 ,2 ,3 ]
Erdogmus, Deniz [1 ]
Affiliations
[1] Northeastern Univ, Dept Elect & Comp Engn, Boston, MA 02115 USA
[2] Graz Univ Technol, Inst Theoret Comp Sci, Graz, Austria
[3] Graz Univ Technol, SAL Dependable Embedded Syst Lab, Silicon Austria Labs, Graz, Austria
Keywords
Feature projection; Dimensionality reduction; Neural networks; Information theoretic learning; Mutual information; Stochastic gradient estimation; MMINet; FEATURE-SELECTION; FEATURE-EXTRACTION; CLASSIFICATION; PROBABILITY; EEG;
DOI
10.1016/j.ins.2021.04.066
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Feature ranking and selection is a widely used approach to supervised dimensionality reduction in discriminative machine learning. Nevertheless, there is significant evidence that feature ranking and selection algorithms, whatever their criterion, can lead to sub-optimal solutions for class separability. In that regard, we introduce emerging information-theoretic feature transformation protocols as an end-to-end neural network training approach. We present a dimensionality reduction network (MMINet) training procedure based on a stochastic estimate of the mutual information gradient. The network projects high-dimensional features onto an output feature space where the lower-dimensional representations carry maximum mutual information with their associated class labels. Furthermore, we formulate the training objective so that it is estimated non-parametrically, with no distributional assumptions. We experimentally evaluate our method on high-dimensional biological data sets, and relate it to conventional feature selection algorithms, which form a special case of our approach. (c) 2021 Elsevier Inc. All rights reserved.
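The abstract describes learning a projection that maximizes a non-parametric mutual information estimate between low-dimensional features and class labels. The following is a minimal numpy sketch of that general idea, using a Parzen-window quadratic mutual information (QMI) estimate with Gaussian kernels and a finite-difference gradient; the function names `qmi` and `num_grad`, the linear projection `W`, and the kernel width `sigma` are illustrative assumptions here, not the paper's actual MMINet architecture or its stochastic MI gradient estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data in 5 dimensions; only the first dimension is discriminative.
N = 60
X = rng.normal(size=(N, 5))
y = np.array([0] * (N // 2) + [1] * (N // 2))
X[y == 1, 0] += 3.0

def gauss(D2, sigma):
    # Gaussian kernel on squared pairwise distances; the shared normalization
    # constant cancels across the three QMI terms, so it is dropped.
    return np.exp(-D2 / (4.0 * sigma ** 2))

def qmi(W, X, y, sigma=1.0):
    # Parzen-window quadratic mutual information between the projection Z = XW
    # and the discrete labels (closed form for Gaussian kernels).
    Z = X @ W
    D2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    K = gauss(D2, sigma)
    n = len(y)
    classes = np.unique(y)
    p = np.array([(y == c).mean() for c in classes])
    v_in = sum(K[np.ix_(y == c, y == c)].sum() for c in classes) / n ** 2
    v_all = (p ** 2).sum() * K.sum() / n ** 2
    v_btw = sum(pc * K[y == c].sum() for pc, c in zip(p, classes)) / n ** 2
    # Equals sum_c || p(z, c) - p(z)p(c) ||^2 under the Parzen estimates, so >= 0.
    return v_in + v_all - 2.0 * v_btw

def num_grad(f, W, eps=1e-4):
    # Full-batch finite-difference gradient, standing in for a stochastic
    # estimate of the analytic MI gradient.
    G = np.zeros_like(W)
    for idx in np.ndindex(W.shape):
        Wp, Wm = W.copy(), W.copy()
        Wp[idx] += eps
        Wm[idx] -= eps
        G[idx] = (f(Wp) - f(Wm)) / (2 * eps)
    return G

W = rng.normal(scale=0.1, size=(5, 2))   # linear stand-in for the projection network
f = lambda W: qmi(W, X, y)
before = f(W)
for _ in range(100):                     # gradient ascent on the QMI objective
    W += 5.0 * num_grad(f, W)
after = f(W)
print(before, after)
```

On this toy problem, gradient ascent rotates the projection toward the discriminative dimension and the QMI estimate grows; the paper instead trains a neural network end-to-end using a stochastic estimate of the mutual information gradient.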
Pages: 298-305
Number of pages: 8
Related Papers
50 records
  • [1] Mutual Information Based Output Dimensionality Reduction
    Pandey, Shishir
    Vaze, Rahul
    [J]. 2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2014, : 935 - 940
  • [2] Quadratic mutual information for dimensionality reduction and classification
    Gray, David M.
    Principe, Jose C.
    [J]. AUTOMATIC TARGET RECOGNITION XX; ACQUISITION, TRACKING, POINTING, AND LASER SYSTEMS TECHNOLOGIES XXIV; AND OPTICAL PATTERN RECOGNITION XXI, 2010, 7696
  • [3] Dimensionality Reduction by Mutual Information for Text Classification
    Liu Lizhen
    Song Hantao
    Lu Yuchang
    [J]. Journal of Beijing Institute of Technology, 2005, (01) : 32 - 36
  • [4] Multifactor dimensionality reduction using normalized mutual information
    Bush, W. S.
    Edwards, T. L.
    Dudek, S. M.
    Ritchie, M. D.
    [J]. GENETIC EPIDEMIOLOGY, 2007, 31 (05) : 464 - 464
  • [5] Text Dimensionality Reduction with Mutual Information Preserving Mapping
    Yang Zhen
    Yao Fei
    Fan Kefeng
    Huang Jian
    [J]. CHINESE JOURNAL OF ELECTRONICS, 2017, 26 (05) : 919 - 925
  • [7] Dimensionality reduction in stochastic complex dynamical networks
    Tu, Chengyi
    Luo, Jianhong
    Fan, Ying
    Pan, Xuwei
    [J]. CHAOS SOLITONS & FRACTALS, 2023, 175
  • [8] Information Preserving Dimensionality Reduction for Mutual Information Analysis of Deep Learning
    Namekawa, Shizuma
    Tezuka, Taro
    [J]. DCC 2022: 2022 DATA COMPRESSION CONFERENCE (DCC), 2022, : 477 - 477
  • [9] Bias reduction in the estimation of mutual information
    Zhu, Jie
    Bellanger, Jean-Jacques
    Shu, Huazhong
    Yang, Chunfeng
    Jeannes, Regine Le Bouquin
    [J]. PHYSICAL REVIEW E, 2014, 90 (05):
  • [10] DIMENSIONALITY REDUCTION FOR EEG CLASSIFICATION USING MUTUAL INFORMATION AND SVM
    Guerrero-Mosquera, Carlos
    Verleysen, Michel
    Navia Vazquez, Angel
    [J]. 2011 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2011,