On the Interventional Kullback-Leibler Divergence

Cited by: 0
Authors
Wildberger, Jonas [1 ]
Guo, Siyuan [1 ,2 ]
Bhattacharyya, Arnab [3 ]
Schoelkopf, Bernhard [1 ]
Affiliations
[1] Max Planck Inst Intelligent Syst, Max Planck Ring 4, D-72076 Tubingen, Germany
[2] Univ Cambridge, Cambridge, England
[3] Natl Univ Singapore, Sch Comp, Singapore, Singapore
Keywords
causal distance; causal discovery; multi-environment learning
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Modern machine learning approaches excel in static settings where a large amount of i.i.d. training data are available for a given task. In a dynamic environment, though, an intelligent agent needs to be able to transfer knowledge and re-use learned components across domains. It has been argued that this may be possible through causal models, aiming to mirror the modularity of the real world in terms of independent causal mechanisms. However, the true causal structure underlying a given set of data is generally not identifiable, so it is desirable to have means to quantify differences between models (e.g., between the ground truth and an estimate), on both the observational and interventional level. In the present work, we introduce the Interventional Kullback-Leibler (IKL) divergence to quantify both structural and distributional differences between models based on a finite set of multi-environment distributions generated by interventions from the ground truth. Since we generally cannot quantify all differences between causal models for every finite set of interventional distributions, we propose a sufficient condition on the intervention targets to identify subsets of observed variables on which the models provably agree or disagree.
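The abstract does not give the formal definition of the IKL divergence. As a rough, hypothetical illustration of the underlying idea of comparing two causal models across a finite set of interventional distributions, the sketch below averages per-intervention KL divergences between discrete distributions of an observed variable. The function names, the averaging scheme, and the toy data are assumptions for illustration, not the paper's construction:

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """KL divergence KL(p || q) between two discrete distributions,
    with a small epsilon to guard against log(0)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def interventional_kl(dists_m1, dists_m2):
    """Average KL divergence between two models' distributions over the
    finite set of interventions they share (a simplified stand-in for
    the IKL idea: compare models intervention by intervention)."""
    shared = dists_m1.keys() & dists_m2.keys()
    return sum(kl(dists_m1[i], dists_m2[i]) for i in shared) / len(shared)

# Toy example: distributions of one binary observed variable under
# two hard interventions, for a "ground truth" model and an estimate.
ground_truth = {"do(X=0)": [0.7, 0.3], "do(X=1)": [0.2, 0.8]}
estimate     = {"do(X=0)": [0.6, 0.4], "do(X=1)": [0.2, 0.8]}

score = interventional_kl(ground_truth, estimate)  # > 0: models disagree under do(X=0)
```

A divergence of zero over the chosen interventions only certifies agreement on those interventional distributions; as the abstract notes, a finite set of interventions generally cannot expose all differences between two causal models.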
Pages: 328-349 (22 pages)