Kernel learning and optimization with Hilbert–Schmidt independence criterion

Cited by: 0
Authors
Tinghua Wang
Wei Li
Affiliations
[1] Gannan Normal University,School of Mathematics and Computer Science
[2] University of Technology Sydney,Decision Systems and e
Keywords
Kernel method; Kernel learning; Hilbert–Schmidt independence criterion (HSIC); Statistical dependence; Gaussian kernel optimization; Classification;
DOI
Not available
Abstract
Measures of statistical dependence between random variables have been successfully applied in many machine learning tasks, such as independent component analysis, feature selection, clustering and dimensionality reduction. The success is based on the fact that many existing learning tasks can be cast into problems of dependence maximization (or minimization). Motivated by this, we present a unifying view of kernel learning via statistical dependence estimation. The key idea is that good kernels should maximize the statistical dependence between the kernels and the class labels. The dependence is measured by the Hilbert–Schmidt independence criterion (HSIC), which is based on computing the Hilbert–Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces and is traditionally used to measure the statistical dependence between random variables. As a special case of kernel learning, we propose a Gaussian kernel optimization method for classification by maximizing the HSIC, where two forms of Gaussian kernels (spherical kernel and ellipsoidal kernel) are considered. Extensive experiments on real-world data sets from UCI benchmark repository validate the superiority of the proposed approach in terms of both prediction accuracy and computational efficiency.
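The kernel selection idea described in the abstract can be illustrated with the standard biased empirical HSIC estimator, HSIC(K, L) = tr(KHLH)/(n−1)², where H = I − 11ᵀ/n is the centering matrix. The sketch below is illustrative only, not the paper's implementation: it covers the spherical Gaussian kernel case (the ellipsoidal case would use one width per feature dimension), and the toy data, candidate widths, and function names are all assumptions.

```python
import numpy as np

def gaussian_kernel(X, sigma):
    # Spherical Gaussian (RBF) kernel from pairwise squared distances
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC: tr(K H L H) / (n - 1)^2, with H = I - 11^T / n
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy two-class data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)),
               rng.normal(3.0, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# Label kernel: 1 if two samples share a class, else 0
L = (y[:, None] == y[None, :]).astype(float)

# Pick the kernel width that maximizes dependence between kernel and labels
widths = [0.1, 0.5, 1.0, 2.0, 5.0]
scores = {s: hsic(gaussian_kernel(X, s), L) for s in widths}
best = max(scores, key=scores.get)
```

In this sketch the "goodness" of each candidate kernel matrix is scored purely by its HSIC with the label kernel, which is the dependence-maximization principle the abstract describes; a full method would also need a search or gradient scheme over the width parameters.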
Pages: 1707–1717
Page count: 10
Related papers
50 in total
  • [1] Kernel learning and optimization with Hilbert-Schmidt independence criterion
    Wang, Tinghua
    Li, Wei
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2018, 9 (10) : 1707 - 1717
  • [2] Kernel Learning with Hilbert-Schmidt Independence Criterion
    Wang, Tinghua
    Li, Wei
    He, Xianwen
    [J]. PATTERN RECOGNITION (CCPR 2016), PT I, 2016, 662 : 720 - 730
  • [3] On Kernel Parameter Selection in Hilbert-Schmidt Independence Criterion
    Sugiyama, Masashi
    Yamada, Makoto
    [J]. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2012, E95D (10): : 2564 - 2567
  • [4] Robust Learning with the Hilbert-Schmidt Independence Criterion
    Greenfeld, Daniel
    Shalit, Uri
    [J]. 25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019,
  • [5] Robust Learning with the Hilbert-Schmidt Independence Criterion
    Greenfeld, Daniel
    Shalit, Uri
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [6] Two-Stage Fuzzy Multiple Kernel Learning Based on Hilbert-Schmidt Independence Criterion
    Wang, Tinghua
    Lu, Jie
    Zhang, Guangquan
    [J]. IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2018, 26 (06) : 3703 - 3714
  • [7] Learning with Hilbert-Schmidt independence criterion: A review and new perspectives
    Wang, Tinghua
    Dai, Xiaolu
    Liu, Yuze
    [J]. KNOWLEDGE-BASED SYSTEMS, 2021, 234
  • [8] Hilbert-Schmidt Independence Criterion Regularization Kernel Framework on Symmetric Positive Definite Manifolds
    Liu, Xi
    Zhan, Zengrong
    Niu, Guo
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2021, 2021
  • [9] Sequence Alignment with the Hilbert-Schmidt Independence Criterion
    Campbell, Jordan
    Lewis, J. P.
    Seol, Yeongho
    [J]. PROCEEDINGS CVMP 2018: THE 15TH ACM SIGGRAPH EUROPEAN CONFERENCE ON VISUAL MEDIA PRODUCTION, 2018,