Extending Hilbert-Schmidt Independence Criterion for Testing Conditional Independence

Cited by: 0
Authors
Zhang, Bingyuan [1]
Suzuki, Joe [1]
Affiliations
[1] Osaka Univ, Grad Sch Engineer Sci, Toyonaka 5600043, Japan
Funding
Japan Science and Technology Agency (JST);
Keywords
conditional independence test; dependence measure; local bootstrap;
DOI
10.3390/e25030425
Chinese Library Classification
O4 [Physics];
Subject Classification Code
0702;
Abstract
The Conditional Independence (CI) test is a fundamental problem in statistics. Many nonparametric CI tests have been developed, but they share a common weakness: performance degrades when the conditioning set is high-dimensional. In this paper, we consider a nonparametric CI test based on a kernel test statistic that can be viewed as an extension of the Hilbert-Schmidt Independence Criterion (HSIC). We propose a local bootstrap method to generate samples from the null distribution H_0: X ⊥ Y | Z. The experimental results show that the proposed method yields a significant performance improvement over previous methods. In particular, it remains accurate as the dimension of the conditioning set grows, and it can be computed efficiently as both the sample size and the dimension of the conditioning set increase.
Pages: 13
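
The abstract describes a kernel-based CI statistic extending HSIC together with a local bootstrap that draws samples from the null H_0: X ⊥ Y | Z. The Python sketch below is only a rough illustration of that general idea, not the authors' procedure: it uses a plain HSIC statistic between X and Y with Gaussian kernels and a kernel-weighted local bootstrap over Z. The kernel choice, median-heuristic bandwidth, resampling rule, and all function names are assumptions made for illustration.

```python
# Illustrative sketch only (assumptions throughout): HSIC statistic between X and Y,
# null distribution approximated by a local bootstrap that resamples Y among
# observations with similar Z values.
import numpy as np


def rbf_gram(A, sigma=None):
    """Gaussian (RBF) Gram matrix; bandwidth defaults to a median heuristic."""
    sq = np.sum(A**2, axis=1, keepdims=True)
    d2 = np.maximum(sq + sq.T - 2.0 * A @ A.T, 0.0)
    if sigma is None:
        med = np.median(d2[d2 > 0]) if np.any(d2 > 0) else 1.0
        sigma = np.sqrt(0.5 * med)
    return np.exp(-d2 / (2.0 * sigma**2))


def hsic_stat(K, L):
    """Biased HSIC estimate Tr(K H L H) / n^2 with centering matrix H."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2


def local_bootstrap_ci_test(X, Y, Z, n_boot=200, seed=0):
    """Approximate p-value for H0: X independent of Y given Z (sketch).

    Each bootstrap draw replaces Y_i by a Y value sampled from points whose Z is
    close to Z_i (kernel-weighted), which keeps the Y|Z relation while breaking
    any residual X-Y dependence, mimicking the null."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    Kx, Ky, Kz = rbf_gram(X), rbf_gram(Y), rbf_gram(Z)
    observed = hsic_stat(Kx, Ky)
    # Row-normalized kernel weights on Z define each point's local resampling law.
    W = Kz / Kz.sum(axis=1, keepdims=True)
    null_stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = np.array([rng.choice(n, p=W[i]) for i in range(n)])
        null_stats[b] = hsic_stat(Kx, Ky[np.ix_(idx, idx)])
    return (1.0 + np.sum(null_stats >= observed)) / (1.0 + n_boot)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    Z = rng.normal(size=(200, 3))                   # conditioning variables
    X = Z[:, :1] + 0.3 * rng.normal(size=(200, 1))  # X depends on Z only
    Y = Z[:, :1] + 0.3 * rng.normal(size=(200, 1))  # Y depends on Z only
    print("p-value under H0 (CI holds):", local_bootstrap_ci_test(X, Y, Z))
```

The design point of the local bootstrap is that resampling Y from Z-neighbors preserves the conditional law of Y given Z while severing any direct X-Y link, so the bootstrap replicates approximate the distribution of the test statistic under the null.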