Robust Learning with the Hilbert-Schmidt Independence Criterion

Cited: 0
Authors
Greenfeld, Daniel [1]
Shalit, Uri [1]
Affiliations
[1] Technion Israel Inst Technol, Haifa, Israel
Funding
Israel Science Foundation;
Keywords
DOI
None
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We investigate the use of a non-parametric independence measure, the Hilbert-Schmidt Independence Criterion (HSIC), as a loss function for learning robust regression and classification models. This loss function encourages learning models where the distribution of the residuals between the label and the model prediction is statistically independent of the distribution of the instances themselves. This loss function was first proposed by Mooij et al. (2009) in the context of learning causal graphs. We adapt it to the task of learning for unsupervised covariate shift: learning on a source domain without access to any instances or labels from the unknown target domain, but with the assumption that p(y|x) (the conditional probability of labels given instances) remains the same in the target domain. We show that the proposed loss is expected to give rise to models that generalize well on a class of target domains characterised by the complexity of their description within a reproducing kernel Hilbert space. Experiments on unsupervised covariate shift tasks demonstrate that models learned with the proposed loss function outperform models learned with standard loss functions, achieving state-of-the-art results on a challenging cell-microscopy unsupervised covariate shift task.
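The abstract's central idea, penalizing statistical dependence between a model's residuals and its inputs, can be illustrated with the standard (biased) empirical HSIC estimator, HSIC = tr(K H L H) / n^2, where K and L are kernel matrices on instances and residuals and H is the centering matrix. The sketch below is a minimal NumPy illustration of that estimator, not the authors' implementation; the Gaussian kernel and the bandwidth `sigma` are illustrative choices.

```python
import numpy as np

def gaussian_kernel(a, sigma=1.0):
    # Pairwise squared distances, then an RBF kernel matrix.
    sq = np.sum(a ** 2, axis=1, keepdims=True)
    d2 = sq + sq.T - 2.0 * a @ a.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, r, sigma=1.0):
    # Biased empirical HSIC estimate: trace(K H L H) / n^2,
    # with K, L kernel matrices on instances x and residuals r,
    # and H = I - (1/n) * ones the centering matrix.
    n = x.shape[0]
    K = gaussian_kernel(x, sigma)
    L = gaussian_kernel(r, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

# Usage: HSIC between a model's residuals and the instances.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 3))
w = np.array([1.0, -2.0, 0.5])            # hypothetical true weights
y = x @ w + 0.1 * rng.normal(size=100)
preds = x @ w                             # a hypothetical fitted model
residuals = (y - preds).reshape(-1, 1)
print(hsic(x, residuals))  # small: these residuals are independent of x
```

In a learning setting this quantity (or a differentiable variant of it) would be minimized over model parameters, in place of or alongside a standard squared-error loss.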
Pages: 10
Related Papers
50 records in total
  • [31] DIRECTION ESTIMATION IN SINGLE-INDEX REGRESSIONS VIA HILBERT-SCHMIDT INDEPENDENCE CRITERION
    Zhang, Nan
    Yin, Xiangrong
    [J]. STATISTICA SINICA, 2015, 25 (02) : 743 - 758
  • [32] Hilbert-Schmidt Independence Criterion Regularization Kernel Framework on Symmetric Positive Definite Manifolds
    Liu, Xi
    Zhan, Zengrong
    Niu, Guo
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2021, 2021
  • [34] Filter-based unsupervised feature selection using Hilbert-Schmidt independence criterion
    Liaghat, Samaneh
    Mansoori, Eghbal G.
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2019, 10 (09) : 2313 - 2328
  • [35] Identification of membrane protein types via multivariate information fusion with Hilbert-Schmidt Independence Criterion
    Wang, Hao
    Ding, Yijie
    Tang, Jijun
    Guo, Fei
    [J]. NEUROCOMPUTING, 2020, 383 : 257 - 269
  • [36] Fast and Scalable Feature Selection for Gene Expression Data Using Hilbert-Schmidt Independence Criterion
    Gangeh, Mehrdad J.
    Zarkoob, Hadi
    Ghodsi, Ali
    [J]. IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2017, 14 (01) : 167 - 181
  • [37] REFLEXIVITY CRITERION FOR CONTRACTIONS WITH HILBERT-SCHMIDT DEFECT OPERATOR
    KAPUSTIN, VV
    [J]. DOKLADY AKADEMII NAUK SSSR, 1991, 318 (06): : 1308 - 1312
  • [38] Hilbert-Schmidt Independence Criterion Lasso Feature Selection in Parkinson's Disease Detection System
    Wiharto, Wiharto
    Sucipto, Ahmad
    Salamah, Umi
    [J]. INTERNATIONAL JOURNAL OF FUZZY LOGIC AND INTELLIGENT SYSTEMS, 2023, 23 (04) : 482 - 499
  • [39] High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion
    Chen, Xin
    Deng, Chang
    He, Shuaida
    Wu, Runxiong
    Zhang, Jia
    [J]. STATISTICS AND COMPUTING, 2024, 34 (02)
  • [40] Automatic Network Pruning via Hilbert-Schmidt Independence Criterion Lasso under Information Bottleneck Principle
    Guo, Song
    Zhang, Lei
    Zheng, Xiawu
    Wang, Yan
    Li, Yuchao
    Chao, Fei
    Wu, Chenglin
    Zhang, Shengchuan
    Ji, Rongrong
    [J]. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 17412 - 17423