Learning from Examples with Information Theoretic Criteria

Cited by: 0
Authors:
Jose C. Principe
Dongxin Xu
Qun Zhao
John W. Fisher
Affiliations:
[1] University of Florida, Computational NeuroEngineering Laboratory
Keywords:
Entropy; Support Vector Machine; Mutual Information; Synthetic Aperture Radar; Independent Component Analysis;
DOI: not available
Abstract:
This paper discusses a framework for learning based on information theoretic criteria. A novel algorithm based on Renyi's quadratic entropy is used to train, directly from a data set, linear or nonlinear mappers for entropy maximization or minimization. We provide an intriguing analogy between the computation and an information potential measuring the interactions among the data samples. We also propose two approximations to the Kullback-Leibler divergence based on quadratic distances (Cauchy-Schwarz inequality and Euclidean distance). These distances can still be computed using the information potential. We test the newly proposed distances in blind source separation (unsupervised learning) and in feature extraction for classification (supervised learning). In blind source separation our algorithm is capable of separating instantaneously mixed sources, and for classification the performance of our classifier is comparable to that of support vector machines (SVMs).
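The abstract's "information potential" has a concrete nonparametric form: with a Gaussian Parzen-window estimate of the density, Renyi's quadratic entropy H2 = -log ∫ f(x)² dx reduces to the log of a double sum of pairwise Gaussian kernel interactions between samples. The sketch below illustrates that estimator under those assumptions; the kernel width `sigma` is a free smoothing parameter, and the paper's exact implementation details may differ.

```python
import numpy as np

def renyi_quadratic_entropy(samples, sigma=1.0):
    """Estimate Renyi's quadratic entropy H2 = -log(V) from samples.

    V is the 'information potential': the average pairwise interaction
    of Gaussian kernels placed on the samples (Parzen window estimate).
    Convolving two Gaussians of width sigma yields one of width
    sigma*sqrt(2), which gives the closed-form double sum below.
    """
    x = np.asarray(samples, dtype=float)
    if x.ndim == 1:
        x = x[:, None]                      # treat 1-D input as (N, 1)
    n, d = x.shape
    # Pairwise squared distances between all samples.
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    # Normalization of the Gaussian kernel with variance 2*sigma^2.
    norm = (4.0 * np.pi * sigma**2) ** (-d / 2.0)
    # Information potential: mean of all N^2 pairwise kernel evaluations.
    V = norm * np.mean(np.exp(-sq_dists / (4.0 * sigma**2)))
    return -np.log(V)
```

Because wider (higher-entropy) sample clouds have larger pairwise distances, their kernel interactions shrink, V decreases, and -log V grows, matching the entropy-maximization/minimization use described in the abstract.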
Pages: 61-77
Page count: 16