Learning Causal Effects via Weighted Empirical Risk Minimization

Cited: 0
Authors
Jung, Yonghan [1 ]
Tian, Jin [2 ]
Bareinboim, Elias [3 ]
Affiliations
[1] Purdue Univ, Dept Comp Sci, W Lafayette, IN 47907 USA
[2] Iowa State Univ, Dept Comp Sci, Ames, IA 50011 USA
[3] Columbia Univ, Dept Comp Sci, New York, NY 10027 USA
Keywords
INFERENCE
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Learning causal effects from data is a fundamental problem across the sciences. Determining the identifiability of a target effect from a combination of the observational distribution and the causal graph underlying a phenomenon is well understood in theory. In practice, however, it remains a challenge to apply the identification theory to estimate the identified causal functionals from finite samples. Although a plethora of effective estimators has been developed under the setting known as the back-door (also called conditional ignorability), there still exists no systematic way of estimating arbitrary causal functionals that is both computationally and statistically attractive. This paper aims to bridge this gap, from causal identification to causal estimation. We note that estimating functionals from limited samples based on the empirical risk minimization (ERM) principle has been pervasive in the machine learning literature, and these methods have been extended to causal inference under the back-door setting. In this paper, we develop a learning framework that marries these two families of methods, benefiting from the generality of the causal identification theory and the effectiveness of the estimators produced based on the principle of ERM. Specifically, we develop a sound and complete algorithm that generates causal functionals in the form of weighted distributions that are amenable to ERM optimization. We then provide a practical procedure for learning causal effects from finite samples and a causal graph. Finally, experimental results support the effectiveness of our approach.
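To make the "weighted distribution" idea in the abstract concrete, the following is a minimal sketch (not the paper's algorithm) of the back-door special case it mentions: the adjustment is cast as a weighted empirical risk minimization problem, with inverse-propensity weights playing the role of the re-weighting. The synthetic data, variable names, and the use of scikit-learn here are illustrative assumptions only.

```python
# Minimal sketch: back-door adjustment as weighted ERM (illustrative only,
# not the paper's general algorithm). Weights w = 1 / P(X = x | Z) map the
# observational distribution to the interventional (weighted) distribution.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic data: Z -> X, Z -> Y, X -> Y, so Z satisfies the back-door criterion.
Z = rng.normal(size=n)
p_x = 1 / (1 + np.exp(-1.5 * Z))            # treatment assignment depends on Z
X = rng.binomial(1, p_x)
Y = 2.0 * X + 1.0 * Z + rng.normal(size=n)  # true causal effect of X on Y is 2.0

# Step 1: estimate the propensity score P(X = 1 | Z).
prop_model = LogisticRegression().fit(Z.reshape(-1, 1), X)
e_z = prop_model.predict_proba(Z.reshape(-1, 1))[:, 1]

# Step 2: inverse-propensity weights, one per sample.
w = X / e_z + (1 - X) / (1 - e_z)

# Step 3: weighted ERM over the re-weighted sample; with a constant model per
# treatment arm the minimizer of the weighted squared loss is a weighted mean.
mu1 = np.sum(w * X * Y) / np.sum(w * X)              # estimate of E[Y | do(X = 1)]
mu0 = np.sum(w * (1 - X) * Y) / np.sum(w * (1 - X))  # estimate of E[Y | do(X = 0)]

print(f"Estimated average treatment effect: {mu1 - mu0:.3f} (ground truth: 2.0)")
```

In this sketch the weighted mean is the ERM solution for a constant predictor under weighted squared loss; the paper's framework generalizes the derivation of such weights beyond the back-door setting, driven by the causal graph.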
Pages: 13