A Reproducing Kernel Hilbert Space Framework for Information-Theoretic Learning

Cited by: 47
Authors
Xu, Jian-Wu [1 ]
Paiva, Antonio R. C. [1 ]
Park, Il [1 ]
Principe, Jose C. [1 ]
Affiliation
[1] Univ Florida, Dept Elect & Comp Engn, Computat NeuroEngn Lab, Gainesville, FL 32611 USA
Funding
US National Science Foundation
Keywords
Cross-information potential; information-theoretic learning (ITL); kernel function; probability density function; reproducing kernel Hilbert space (RKHS);
DOI
10.1109/TSP.2008.2005085
Chinese Library Classification
TM (Electrical Technology); TN (Electronic Technology, Communication Technology)
Discipline Classification Code
0808; 0809
Abstract
This paper provides a functional analysis perspective of information-theoretic learning (ITL) by defining bottom-up a reproducing kernel Hilbert space (RKHS) uniquely determined by the symmetric nonnegative definite kernel function known as the cross-information potential (CIP). The CIP as an integral of the product of two probability density functions characterizes similarity between two stochastic functions. We prove the existence of a one-to-one congruence mapping between the ITL RKHS and the Hilbert space spanned by square integrable probability density functions. Therefore, all the statistical descriptors in the original information-theoretic learning formulation can be rewritten as algebraic computations on deterministic functional vectors in the ITL RKHS, instead of limiting the functional view to the estimators as is commonly done in kernel methods. A connection between the ITL RKHS and kernel approaches interested in quantifying the statistics of the projected data is also established.
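The cross-information potential described in the abstract, the integral of the product of two probability density functions, has a well-known closed-form sample estimator when both densities are approximated with Gaussian Parzen windows: the double sum of Gaussian kernels of width σ√2 over all pairs of samples. The sketch below illustrates that estimator; the function names and the kernel width choice are illustrative assumptions, not code from the paper.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    # 1-D Gaussian kernel G_sigma(u) = exp(-u^2 / (2 sigma^2)) / (sqrt(2 pi) sigma)
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def cross_information_potential(x, y, sigma=1.0):
    """Parzen-window estimate of the CIP, integral of p(t) q(t) dt.

    With Gaussian windows of width sigma on both samples, the integral of
    the product of the two density estimates reduces to a double sum of
    Gaussians of width sigma * sqrt(2) over all pairs (x_i, y_j).
    """
    diffs = x[:, None] - y[None, :]              # all pairwise differences
    return gaussian_kernel(diffs, sigma * np.sqrt(2)).mean()

# The information potential of a single sample is the CIP of the sample
# with itself; larger values indicate a more concentrated density.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=200)
y = rng.normal(0.0, 1.0, size=200)
print(cross_information_potential(x, y, sigma=0.5))
```

In the RKHS view of the abstract, this quantity is the inner product between the two density functions, so it acts as a similarity measure between the underlying distributions.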
Pages: 5891-5902
Page count: 12
Related papers
50 in total
  • [1] A Reproducing Kernel Hilbert Space Framework for Functional Classification
    Sang, Peijun
    Kashlak, Adam B.
    Kong, Linglong
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2023, 32 (03) : 1000 - 1008
  • [2] An Information-Theoretic Framework for Deep Learning
    Jeon, Hong Jun
    Van Roy, Benjamin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [3] Information-Theoretic Dataset Selection for Fast Kernel Learning
    Paiva, Antonio R. C.
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 2088 - 2095
  • [4] A REPRODUCING KERNEL HILBERT SPACE FORMULATION OF THE PRINCIPLE OF RELEVANT INFORMATION
    Giraldo, Luis G. Sanchez
    Principe, Jose C.
    2011 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2011,
  • [5] Distributed Learning of Conditional Quantiles in the Reproducing Kernel Hilbert Space
    Lian, Heng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [6] SPARSE LEARNING OF PARSIMONIOUS REPRODUCING KERNEL HILBERT SPACE MODELS
    Peifer, Maria
    Chamon, Luiz F. O.
    Paternain, Santiago
    Ribeiro, Alejandro
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3292 - 3296
  • [7] An Example of a Reproducing Kernel Hilbert Space
    Tutaj, Edward
    COMPLEX ANALYSIS AND OPERATOR THEORY, 2019, 13 (01) : 193 - 221
  • [9] A Reproducing Kernel Hilbert Space Framework for Spike Train Signal Processing
    Paiva, Antonio R. C.
    Park, Il
    Principe, Jose C.
    NEURAL COMPUTATION, 2009, 21 (02) : 424 - 449
  • [10] Information-Theoretic Transfer Learning Framework for Bayesian Optimisation
    Ramachandran, Anil
    Gupta, Sunil
    Rana, Santu
    Venkatesh, Svetha
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2018, PT II, 2019, 11052 : 827 - 842