On the Interventional Kullback-Leibler Divergence

Cited by: 0
Authors
Wildberger, Jonas [1 ]
Guo, Siyuan [1 ,2 ]
Bhattacharyya, Arnab [3 ]
Schoelkopf, Bernhard [1 ]
Affiliations
[1] Max Planck Inst Intelligent Syst, Max Planck Ring 4, D-72076 Tübingen, Germany
[2] Univ Cambridge, Cambridge, England
[3] Natl Univ Singapore, Sch Comp, Singapore, Singapore
Keywords
causal distance; causal discovery; multi-environment learning;
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Modern machine learning approaches excel in static settings where a large amount of i.i.d. training data is available for a given task. In a dynamic environment, though, an intelligent agent needs to be able to transfer knowledge and re-use learned components across domains. It has been argued that this may be possible through causal models, which aim to mirror the modularity of the real world in terms of independent causal mechanisms. However, the true causal structure underlying a given set of data is generally not identifiable, so it is desirable to have means of quantifying differences between models (e.g., between the ground truth and an estimate) on both the observational and the interventional level. In the present work, we introduce the Interventional Kullback-Leibler (IKL) divergence to quantify both structural and distributional differences between models, based on a finite set of multi-environment distributions generated by interventions from the ground truth. Since we generally cannot quantify all differences between causal models for every finite set of interventional distributions, we propose a sufficient condition on the intervention targets to identify subsets of observed variables on which the models provably agree or disagree.
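The abstract describes comparing causal models via divergences between their interventional distributions. The following minimal Python sketch only illustrates that idea on two toy linear-Gaussian SCMs with a single mechanism Y = a*X + noise, aggregating closed-form Gaussian KL divergences over a finite set of do(X = x) interventions. The model parametrization, function names, and the simple averaging rule are assumptions made for illustration; they are not the paper's exact IKL definition.

# Illustrative sketch (assumed setup, not the authors' exact construction):
# average the KL divergences between the interventional distributions of Y
# induced by two candidate linear-Gaussian SCMs over X -> Y.
import numpy as np

def kl_gaussian(mu_p, var_p, mu_q, var_q):
    """Closed-form KL( N(mu_p, var_p) || N(mu_q, var_q) ) for univariate Gaussians."""
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def interventional_y_params(a, noise_var, x_value):
    """Under do(X = x_value) in the SCM Y = a*X + N(0, noise_var),
    Y is Gaussian with mean a*x_value and variance noise_var."""
    return a * x_value, noise_var

def ikl_style_score(model_p, model_q, intervention_values):
    """Average KL divergence between the interventional distributions of Y
    of two SCMs, over a finite set of do(X = x) interventions (assumed aggregation)."""
    kls = []
    for x in intervention_values:
        mu_p, var_p = interventional_y_params(*model_p, x)
        mu_q, var_q = interventional_y_params(*model_q, x)
        kls.append(kl_gaussian(mu_p, var_p, mu_q, var_q))
    return float(np.mean(kls))

if __name__ == "__main__":
    ground_truth = (1.0, 0.5)   # hypothetical ground truth: Y = 1.0*X + N(0, 0.5)
    estimate = (0.8, 0.5)       # hypothetical estimate with a misspecified mechanism
    print(ikl_style_score(ground_truth, estimate, intervention_values=[-2.0, 0.0, 2.0]))

A score of zero would indicate that the two models induce identical interventional distributions of Y for the chosen intervention set; larger values reflect growing disagreement between the mechanisms.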
Pages: 328-349
Number of pages: 22