Learning Causal Effects via Weighted Empirical Risk Minimization

Cited: 0
Authors:
Jung, Yonghan [1]
Tian, Jin [2]
Bareinboim, Elias [3]
Affiliations:
[1] Purdue Univ, Dept Comp Sci, W Lafayette, IN 47907 USA
[2] Iowa State Univ, Dept Comp Sci, Ames, IA 50011 USA
[3] Columbia Univ, Dept Comp Sci, New York, NY 10027 USA
Keywords:
INFERENCE;
DOI: none
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
Learning causal effects from data is a fundamental problem across the sciences. Determining the identifiability of a target effect from a combination of the observational distribution and the causal graph underlying a phenomenon is well understood in theory. In practice, however, it remains a challenge to apply the identification theory to estimate the identified causal functionals from finite samples. Although a plethora of effective estimators have been developed under the setting known as the back-door (also called conditional ignorability), there still exists no systematic way of estimating arbitrary causal functionals that is both computationally and statistically attractive. This paper aims to bridge this gap, from causal identification to causal estimation. We note that estimating functionals from limited samples based on the empirical risk minimization (ERM) principle has been pervasive in the machine learning literature, and these methods have been extended to causal inference under the back-door setting. In this paper, we develop a learning framework that marries these two families of methods, benefiting from the generality of the causal identification theory and the effectiveness of ERM-based estimators. Specifically, we develop a sound and complete algorithm that generates causal functionals in the form of weighted distributions that are amenable to ERM optimization. We then provide a practical procedure for learning causal effects from finite samples and a causal graph. Finally, experimental results support the effectiveness of our approach.
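To make the "weighted distribution" idea concrete, here is a minimal sketch of weighted ERM in the back-door setting, the special case the abstract cites as well studied. The simulated graph (confounder Z, treatment X, outcome Y), the true propensity `e`, and the target quantity E[Y | do(X=1)] are illustrative assumptions, not taken from the paper; the paper's algorithm derives such weights for arbitrary identifiable functionals, whereas this sketch hard-codes the classical inverse-propensity weights.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Simulated back-door setting: Z confounds both treatment X and outcome Y.
Z = rng.binomial(1, 0.5, n)
e = 0.3 + 0.4 * Z                       # true propensity P(X = 1 | Z)
X = rng.binomial(1, e)
Y = 2.0 * X + Z + rng.normal(0.0, 0.5, n)

# Weighted ERM: minimize sum_i w_i * (Y_i - theta)^2 over treated units,
# with inverse-propensity weights w_i = 1 / e(Z_i).  For squared loss and a
# constant model, the minimizer is the weighted mean (a Hajek-style
# estimate of E[Y | do(X = 1)]).
w = (X == 1) / e
theta_hat = np.sum(w * Y) / np.sum(w)

print(theta_hat)  # should be close to the true value 2.5 (= 2 + E[Z])
```

In this toy model E[Y | do(X=1)] = 2 + E[Z] = 2.5; the same weighted-loss template accommodates richer hypothesis classes (regressors, neural networks) by swapping the constant model for a parameterized one.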
Pages: 13