Learning from Examples with Information Theoretic Criteria

Cited by: 0
Authors
Jose C. Principe
Dongxin Xu
Qun Zhao
John W. Fisher
Affiliations
[1] University of Florida, Computational NeuroEngineering Laboratory
Keywords
Entropy; Support Vector Machine; Mutual Information; Synthetic Aperture Radar; Independent Component Analysis
DOI: not available
Abstract
This paper discusses a framework for learning based on information theoretic criteria. A novel algorithm based on Renyi's quadratic entropy is used to train, directly from a data set, linear or nonlinear mappers for entropy maximization or minimization. We provide an intriguing analogy between the computation and an information potential measuring the interactions among the data samples. We also propose two approximations to the Kullback-Leibler divergence based on quadratic distances (Cauchy-Schwarz inequality and Euclidean distance). These distances can still be computed using the information potential. We test the newly proposed distances in blind source separation (unsupervised learning) and in feature extraction for classification (supervised learning). In blind source separation our algorithm is capable of separating instantaneously mixed sources, and for classification the performance of our classifier is comparable to that of support vector machines (SVMs).
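The information potential mentioned in the abstract has a standard nonparametric form: with a Parzen-window density estimate using Gaussian kernels, Renyi's quadratic entropy H2 = -log ∫ p(x)² dx reduces to the negative log of a double sum of pairwise Gaussian interactions between samples. The sketch below illustrates that estimator for one-dimensional data; the function names and the kernel width `sigma` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen-window estimate of the information potential
    V = (1/N^2) * sum_i sum_j G(x_i - x_j; 2*sigma^2).
    The variance doubles because convolving two Gaussian kernels
    of variance sigma^2 yields a Gaussian of variance 2*sigma^2."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    diffs = x - x.T                      # N x N pairwise differences
    var = 2.0 * sigma**2
    g = np.exp(-diffs**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return g.mean()                      # average over all N^2 pairs

def renyi_quadratic_entropy(x, sigma=1.0):
    """H2 = -log V: large pairwise interactions (clustered samples)
    give a large potential and hence a small entropy."""
    return -np.log(information_potential(x, sigma))
```

Because the potential is a sum over sample pairs, maximizing or minimizing H2 amounts to letting the samples attract or repel one another, which is the physical analogy the paper develops: concentrated samples yield a large potential (low entropy), spread-out samples a small one (high entropy).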
Pages: 61-77 (16 pages)