Learning from Examples with Information Theoretic Criteria

Cited by: 0
Authors
Jose C. Principe
Dongxin Xu
Qun Zhao
John W. Fisher
Institutions
[1] University of Florida,Computational NeuroEngineering Laboratory
Keywords
Entropy; Support Vector Machine; Mutual Information; Synthetic Aperture Radar; Independent Component Analysis
DOI
Not available
Abstract
This paper discusses a framework for learning based on information theoretic criteria. A novel algorithm based on Renyi's quadratic entropy is used to train, directly from a data set, linear or nonlinear mappers for entropy maximization or minimization. We provide an intriguing analogy between the computation and an information potential measuring the interactions among the data samples. We also propose two approximations to the Kullback-Leibler divergence based on quadratic distances (Cauchy-Schwarz inequality and Euclidean distance). These distances can still be computed using the information potential. We test the newly proposed distances in blind source separation (unsupervised learning) and in feature extraction for classification (supervised learning). In blind source separation our algorithm is capable of separating instantaneously mixed sources, and for classification the performance of our classifier is comparable to that of support vector machines (SVMs).
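The abstract's key quantities, the information potential (a Parzen-window estimate of E[p(X)] built from pairwise sample interactions), Rényi's quadratic entropy, and the Cauchy-Schwarz quadratic distance, can be sketched in a few lines. This is a minimal illustrative sketch assuming isotropic Gaussian Parzen kernels (for which pairwise interactions reduce to a Gaussian of width sigma*sqrt(2)); the function names and the kernel-width parameter are ours, not the paper's notation.

```python
import numpy as np

def _gaussian(d2, sigma):
    # Gaussian kernel evaluated at squared distance d2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def _as_2d(x):
    # coerce samples to shape (N, d)
    x = np.asarray(x, dtype=float)
    return x.reshape(len(x), -1)

def information_potential(x, sigma=1.0):
    """Parzen estimate of V(X) = E[p(X)]: the mean pairwise 'interaction'
    among samples; convolving two Gaussian kernels of width sigma gives
    an effective kernel of width sigma*sqrt(2)."""
    x = _as_2d(x)
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    return _gaussian(d2, sigma * np.sqrt(2.0)).mean()

def renyi_quadratic_entropy(x, sigma=1.0):
    # H_2(X) = -log V(X); maximizing entropy = minimizing the potential
    return -np.log(information_potential(x, sigma))

def cross_information_potential(x, y, sigma=1.0):
    # interactions between samples of two data sets
    x, y = _as_2d(x), _as_2d(y)
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return _gaussian(d2, sigma * np.sqrt(2.0)).mean()

def cauchy_schwarz_distance(x, y, sigma=1.0):
    """Quadratic distance from the Cauchy-Schwarz inequality:
    D_CS = -log( V(x,y)^2 / (V(x) V(y)) ) >= 0, zero iff the
    estimated densities coincide."""
    vxy = cross_information_potential(x, y, sigma)
    vx = information_potential(x, sigma)
    vy = information_potential(y, sigma)
    return -np.log(vxy ** 2 / (vx * vy))
```

Because every quantity is a sum over sample pairs, gradients with respect to a mapper's output flow through the same pairwise terms, which is what lets the criteria train linear or nonlinear mappers directly from data.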
Pages: 61 - 77
Page count: 16
Related Papers
50 entries in total
  • [1] Learning from examples with information theoretic criteria
    Principe, JC
    Xu, DX
    Zhao, Q
    Fisher, JW
    JOURNAL OF VLSI SIGNAL PROCESSING SYSTEMS FOR SIGNAL IMAGE AND VIDEO TECHNOLOGY, 2000, 26 (1-2): : 61 - 77
  • [2] Improved lower bounds for learning from noisy examples: An information-theoretic approach
    Gentile, C
    Helmbold, DP
    INFORMATION AND COMPUTATION, 2001, 166 (02) : 133 - 155
  • [3] Adaptive-Size Dictionary Learning Using Information Theoretic Criteria
    Dumitrescu, Bogdan
    Giurcaneanu, Ciprian Doru
    ALGORITHMS, 2019, 12 (09)
  • [4] Size Adaptation of Separable Dictionary Learning with Information-Theoretic Criteria
    Baltoiu, Andra
    Dumitrescu, Bogdan
    2019 22ND INTERNATIONAL CONFERENCE ON CONTROL SYSTEMS AND COMPUTER SCIENCE (CSCS), 2019, : 7 - 11
  • [5] Information Theoretic Criteria for Community Detection
    Branting, L. Karl
    ADVANCES IN SOCIAL NETWORK MINING AND ANALYSIS, 2010, 5498 : 114 - 130
  • [6] DETECTION OF SIGNALS BY INFORMATION THEORETIC CRITERIA
    WAX, M
    KAILATH, T
    IEEE TRANSACTIONS ON ACOUSTICS SPEECH AND SIGNAL PROCESSING, 1985, 33 (02): : 387 - 392
  • [7] On the value of partial information for learning from examples
    Ratsaby, J
    Maiorov, V
    JOURNAL OF COMPLEXITY, 1997, 13 (04) : 509 - 544
  • [8] Learning information extraction patterns from examples
    Lect Notes Artif Intell, (246):
  • [9] Learning from examples with quadratic mutual information
    Xu, DX
    Principe, JC
    NEURAL NETWORKS FOR SIGNAL PROCESSING VIII, 1998, : 155 - 164
  • [10] Image segmentation using information theoretic criteria
    Hibbard, LS
    MEDICAL IMAGING 2003: IMAGE PROCESSING, PTS 1-3, 2003, 5032 : 1639 - 1649