Robust Learning with the Hilbert-Schmidt Independence Criterion

Cited by: 0
Authors
Greenfeld, Daniel [1]
Shalit, Uri [1]
Affiliations
[1] Technion Israel Inst Technol, Haifa, Israel
Funding
Israel Science Foundation
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We investigate the use of a non-parametric independence measure, the Hilbert-Schmidt Independence Criterion (HSIC), as a loss-function for learning robust regression and classification models. This loss-function encourages learning models where the distribution of the residuals between the label and the model prediction is statistically independent of the distribution of the instances themselves. This loss-function was first proposed by Mooij et al. (2009) in the context of learning causal graphs. We adapt it to the task of learning for unsupervised covariate shift: learning on a source domain without access to any instances or labels from the unknown target domain, but with the assumption that p(y|x) (the conditional probability of labels given instances) remains the same in the target domain. We show that the proposed loss is expected to give rise to models that generalize well on a class of target domains characterised by the complexity of their description within a reproducing kernel Hilbert space. Experiments on unsupervised covariate shift tasks demonstrate that models learned with the proposed loss-function outperform models learned with standard loss functions, achieving state-of-the-art results on a challenging cell-microscopy unsupervised covariate shift task.
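To make the idea in the abstract concrete, the sketch below shows a biased empirical HSIC estimator used as a loss between inputs and model residuals. It is a minimal illustration only, assuming PyTorch and Gaussian kernels; the function name hsic_loss and the bandwidths sigma_x and sigma_r are illustrative choices, not the paper's exact implementation.

```python
import torch

def hsic_loss(x, residuals, sigma_x=1.0, sigma_r=1.0):
    """Biased empirical HSIC between inputs x (n, d) and residuals (n, 1).

    Minimizing this value pushes the residual distribution toward
    statistical independence from the input distribution.
    Bandwidths sigma_x / sigma_r are illustrative hyper-parameters.
    """
    n = x.shape[0]
    # Gaussian kernel matrices on the inputs and on the residuals.
    K = torch.exp(-torch.cdist(x, x).pow(2) / (2 * sigma_x ** 2))
    L = torch.exp(-torch.cdist(residuals, residuals).pow(2) / (2 * sigma_r ** 2))
    # Centering matrix H = I - (1/n) * 11^T.
    H = torch.eye(n, device=x.device) - torch.ones(n, n, device=x.device) / n
    # Biased HSIC estimator: trace(K H L H) / (n - 1)^2 (Gretton et al., 2005).
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2

# Hypothetical usage inside a training step:
# residuals = y - model(x)        # (n, 1)
# loss = hsic_loss(x, residuals)  # minimize to decouple residuals from x
# loss.backward()
```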
Pages: 10
Related Papers
50 results in total
  • [1] Robust Learning with the Hilbert-Schmidt Independence Criterion
    Greenfeld, Daniel
    Shalit, Uri
    [J]. 25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019,
  • [2] Kernel Learning with Hilbert-Schmidt Independence Criterion
    Wang, Tinghua
    Li, Wei
    He, Xianwen
    [J]. PATTERN RECOGNITION (CCPR 2016), PT I, 2016, 662 : 720 - 730
  • [3] Kernel learning and optimization with Hilbert-Schmidt independence criterion
    Wang, Tinghua
    Li, Wei
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2018, 9 (10) : 1707 - 1717
  • [4] Learning with Hilbert-Schmidt independence criterion: A review and new perspectives
    Wang, Tinghua
    Dai, Xiaolu
    Liu, Yuze
    [J]. KNOWLEDGE-BASED SYSTEMS, 2021, 234
  • [5] Sequence Alignment with the Hilbert-Schmidt Independence Criterion
    Campbell, Jordan
    Lewis, J. P.
    Seol, Yeongho
    [J]. PROCEEDINGS CVMP 2018: THE 15TH ACM SIGGRAPH EUROPEAN CONFERENCE ON VISUAL MEDIA PRODUCTION, 2018,
  • [6] Sensitivity maps of the Hilbert-Schmidt independence criterion
    Perez-Suay, Adrian
    Camps-Valls, Gustau
    [J]. APPLIED SOFT COMPUTING, 2018, 70 : 1054 - 1063
  • [7] Nyström M-Hilbert-Schmidt Independence Criterion
    Kalinke, Florian
    Szabo, Zoltan
    [J]. UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2023, 216 : 1005 - 1015
  • [8] Sparse Hilbert-Schmidt Independence Criterion Regression
    Poignard, Benjamin
    Yamada, Makoto
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 538 - 547
  • [9] Extending Hilbert-Schmidt Independence Criterion for Testing Conditional Independence
    Zhang, Bingyuan
    Suzuki, Joe
    [J]. ENTROPY, 2023, 25 (03)
  • [10] On Kernel Parameter Selection in Hilbert-Schmidt Independence Criterion
    Sugiyama, Masashi
    Yamada, Makoto
    [J]. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2012, E95D (10) : 2564 - 2567