Learning from Examples with Information Theoretic Criteria

Cited by: 0
Authors
Jose C. Principe
Dongxin Xu
Qun Zhao
John W. Fisher
Affiliations
[1] University of Florida, Computational NeuroEngineering Laboratory
Keywords
Entropy; Support Vector Machine; Mutual Information; Synthetic Aperture Radar; Independent Component Analysis
DOI
Not available
Abstract
This paper discusses a framework for learning based on information theoretic criteria. A novel algorithm based on Rényi's quadratic entropy is used to train, directly from a data set, linear or nonlinear mappers for entropy maximization or minimization. We provide an intriguing analogy between this computation and an information potential that measures the interactions among the data samples. We also propose two approximations to the Kullback-Leibler divergence based on quadratic distances (Cauchy-Schwarz inequality and Euclidean distance). These distances can still be computed using the information potential. We test the newly proposed distances in blind source separation (unsupervised learning) and in feature extraction for classification (supervised learning). In blind source separation our algorithm is capable of separating instantaneously mixed sources, and for classification the performance of our classifier is comparable to that of support vector machines (SVMs).
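As a rough illustration of the estimators the abstract describes (a sketch, not the authors' code), the following NumPy snippet computes the sample-based information potential under Gaussian Parzen windows, the resulting Rényi quadratic entropy, and the two quadratic approximations to the Kullback-Leibler divergence. The function names and the kernel width sigma are illustrative assumptions, not taken from the paper:

import numpy as np

def pairwise_gaussian_mean(X, Y, sigma):
    # Mean of G(x_i - y_j; 2*sigma^2 I) over all sample pairs, where G is a Gaussian kernel.
    d = X.shape[1]
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)   # squared pairwise distances
    norm = (4.0 * np.pi * sigma ** 2) ** (-d / 2.0)             # Gaussian normalizing constant
    return norm * np.exp(-sq / (4.0 * sigma ** 2)).mean()

def renyi_quadratic_entropy(X, sigma=1.0):
    # H2(X) = -log V(X), where V(X) is the information potential of the samples.
    return -np.log(pairwise_gaussian_mean(X, X, sigma))

def cauchy_schwarz_divergence(X, Y, sigma=1.0):
    # Quadratic (Cauchy-Schwarz) approximation to the Kullback-Leibler divergence.
    v_x = pairwise_gaussian_mean(X, X, sigma)   # information potential of X
    v_y = pairwise_gaussian_mean(Y, Y, sigma)   # information potential of Y
    v_c = pairwise_gaussian_mean(X, Y, sigma)   # cross-information potential
    return np.log(v_x) + np.log(v_y) - 2.0 * np.log(v_c)

def euclidean_divergence(X, Y, sigma=1.0):
    # Quadratic (Euclidean-distance) approximation: the integral of (f - g)^2.
    return (pairwise_gaussian_mean(X, X, sigma)
            + pairwise_gaussian_mean(Y, Y, sigma)
            - 2.0 * pairwise_gaussian_mean(X, Y, sigma))

# Toy usage: two Gaussian clouds; both divergences grow as the clouds separate.
rng = np.random.default_rng(0)
A = rng.normal(0.0, 1.0, size=(200, 2))
B = rng.normal(3.0, 1.0, size=(200, 2))
print(renyi_quadratic_entropy(A, sigma=1.0))
print(cauchy_schwarz_divergence(A, B, sigma=1.0), euclidean_divergence(A, B, sigma=1.0))

With a fixed kernel width, larger divergence values simply reflect greater separation between the two sample sets, which is the property the abstract exploits for blind source separation and for feature extraction in classification.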
Pages: 61 - 77
Number of pages: 16
Related papers
50 results in total
  • [31] MIRACLE: Multimedia information retrieval by analysing content and learning from examples
    Lei, ZB
    Ganapathy, SK
    Safranek, RJ
    FOURTH IEEE WORKSHOP ON APPLICATIONS OF COMPUTER VISION - WACV'98, PROCEEDINGS, 1998, : 272 - 273
  • [33] TRACK-TO-TRACK ASSOCIATION USING INFORMATION THEORETIC CRITERIA
    Hussein, Islam I.
    Roscoe, Christopher W. T.
    Wilkins, Matthew P.
    Schumacher, Paul W., Jr.
    ASTRODYNAMICS 2015, 2016, 156 : 203 - 212
  • [34] INFORMATION-THEORETIC CRITERIA FOR THE DESIGN OF COMPRESSIVE SUBSPACE CLASSIFIERS
    Nokleby, Matthew
    Rodrigues, Miguel
    Calderbank, Robert
    2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014
  • [35] Improvement of source number estimation based on information theoretic criteria
    Ye, Zhong-Fu
    Xiang, Li
    Xu, Xu
    Dianbo Kexue Xuebao/Chinese Journal of Radio Science, 2007, 22 (04) : 593 - 598
  • [36] On the selection of predictors by using greedy algorithms and information theoretic criteria
    Li, Fangyao
    Triggs, Christopher M.
    Giurcaneanu, Ciprian Doru
    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, 2023, 65 (02) : 77 - 100
  • [37] INFORMATION-THEORETIC STABILITY AND EVOLUTION CRITERIA IN IRREVERSIBLE THERMODYNAMICS
    PFAFFELHUBER, E
    JOURNAL OF STATISTICAL PHYSICS, 1977, 16 (01) : 69 - 90
  • [38] Cognitive radio sensing information-theoretic criteria based
    Haddad, Majed
    Hayar, Aawatif Menouni
    Fetoui, Mohamed Hedi
    Debbah, Merouane
    2007 2ND INTERNATIONAL CONFERENCE ON COGNITIVE RADIO ORIENTED WIRELESS NETWORKS AND COMMUNICATIONS, 2007, : 241 - 244
  • [39] Ultrawideband channel modeling on the basis of information-theoretic criteria
    Schuster, Ulrich G.
    Bolcskei, Helmut
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2007, 6 (07) : 2464 - 2475
  • [40] Unifying cost and information in information-theoretic competitive learning
    Kamimura, R
    NEURAL NETWORKS, 2005, 18 (5-6) : 711 - 718