Kernel learning and optimization with Hilbert-Schmidt independence criterion

Cited: 33
Authors
Wang, Tinghua [1 ,2 ]
Li, Wei [1 ]
Affiliations
[1] Gannan Normal Univ, Sch Math & Comp Sci, Ganzhou 341000, Peoples R China
[2] Univ Technol Sydney, Fac Engn & Informat Technol, Ctr Artificial Intelligence, Decis Syst & E Serv Intelligence Lab, Broadway, NSW 2007, Australia
Funding
National Natural Science Foundation of China
Keywords
Kernel method; Kernel learning; Hilbert-Schmidt independence criterion (HSIC); Statistical dependence; Gaussian kernel optimization; Classification; Feature selection; Dependence
DOI
10.1007/s13042-017-0675-7
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Measures of statistical dependence between random variables have been successfully applied in many machine learning tasks, such as independent component analysis, feature selection, clustering and dimensionality reduction. This success rests on the fact that many existing learning tasks can be cast as problems of dependence maximization (or minimization). Motivated by this, we present a unifying view of kernel learning via statistical dependence estimation. The key idea is that a good kernel should maximize the statistical dependence between the kernel and the class labels. The dependence is measured by the Hilbert-Schmidt independence criterion (HSIC), which computes the Hilbert-Schmidt norm of the cross-covariance operator between samples mapped into the corresponding Hilbert spaces and is traditionally used to measure the statistical dependence between random variables. As a special case of kernel learning, we propose a Gaussian kernel optimization method for classification that maximizes the HSIC, where two forms of Gaussian kernel (spherical and ellipsoidal) are considered. Extensive experiments on real-world data sets from the UCI benchmark repository validate the superiority of the proposed approach in terms of both prediction accuracy and computational efficiency.
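To make the criterion concrete, the sketch below illustrates the idea the abstract describes; it is not the authors' published code. It uses the standard biased empirical estimator HSIC(K, L) = tr(KHLH)/(n-1)^2 of Gretton et al., a delta kernel on the class labels, and a search over candidate Gaussian kernel widths. The toy data, the candidate grid, and all function names are illustrative assumptions.

```python
import numpy as np

def hsic(K, L):
    # Biased empirical HSIC estimate (Gretton et al., 2005):
    # HSIC(K, L) = trace(K H L H) / (n - 1)^2, with centering matrix
    # H = I - (1/n) * 1 1^T.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def gaussian_kernel(X, widths):
    # Gaussian kernel with width parameter(s). A spherical kernel uses one
    # shared width; an ellipsoidal kernel supplies one width per dimension
    # (broadcasting handles both cases).
    Xs = X / widths
    sq = np.sum(Xs ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Xs @ Xs.T
    return np.exp(-0.5 * np.maximum(d2, 0.0))  # clamp tiny negative round-off

def label_kernel(y):
    # Delta kernel on class labels: l(y_i, y_j) = 1 iff y_i == y_j.
    return (y[:, None] == y[None, :]).astype(float)

# Toy data; the candidate width grid is an illustrative assumption.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = (X[:, 0] > 0).astype(int)
L = label_kernel(y)
best = max([0.1, 0.5, 1.0, 2.0, 5.0],
           key=lambda s: hsic(gaussian_kernel(X, s), L))
print("HSIC-selected spherical width:", best)
```

A gradient-based search over the widths would optimize the same objective; the grid here only keeps the sketch short.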
Pages: 1707-1717
Number of pages: 11
Related Papers (50 records in total)
  • [1] Kernel Learning with Hilbert-Schmidt Independence Criterion
    Wang, Tinghua; Li, Wei; He, Xianwen
    PATTERN RECOGNITION (CCPR 2016), PT I, 2016, 662: 720-730
  • [2] Kernel learning and optimization with Hilbert-Schmidt independence criterion
    Wang, Tinghua; Li, Wei
    International Journal of Machine Learning and Cybernetics, 2018, 9: 1707-1717
  • [3] On Kernel Parameter Selection in Hilbert-Schmidt Independence Criterion
    Sugiyama, Masashi; Yamada, Makoto
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2012, E95D(10): 2564-2567
  • [4] Robust Learning with the Hilbert-Schmidt Independence Criterion
    Greenfeld, Daniel; Shalit, Uri
    25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019
  • [5] Robust Learning with the Hilbert-Schmidt Independence Criterion
    Greenfeld, Daniel; Shalit, Uri
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020
  • [6] Learning with Hilbert-Schmidt independence criterion: A review and new perspectives
    Wang, Tinghua; Dai, Xiaolu; Liu, Yuze
    KNOWLEDGE-BASED SYSTEMS, 2021, 234
  • [7] Two-Stage Fuzzy Multiple Kernel Learning Based on Hilbert-Schmidt Independence Criterion
    Wang, Tinghua; Lu, Jie; Zhang, Guangquan
    IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2018, 26(6): 3703-3714
  • [8] Sequence Alignment with the Hilbert-Schmidt Independence Criterion
    Campbell, Jordan; Lewis, J. P.; Seol, Yeongho
    PROCEEDINGS CVMP 2018: THE 15TH ACM SIGGRAPH EUROPEAN CONFERENCE ON VISUAL MEDIA PRODUCTION, 2018
  • [9] Sensitivity maps of the Hilbert-Schmidt independence criterion
    Perez-Suay, Adrian; Camps-Valls, Gustau
    APPLIED SOFT COMPUTING, 2018, 70: 1054-1063
  • [10] Nyström M-Hilbert-Schmidt Independence Criterion
    Kalinke, Florian; Szabó, Zoltán
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2023, 216: 1005-1015