Estimating individual treatment effect: generalization bounds and algorithms

Cited: 0
Authors
Shalit, Uri [1 ]
Johansson, Fredrik D. [2 ]
Sontag, David [2 ,3 ]
Affiliations
[1] NYU, CIMS, 550 1St Ave, New York, NY 10003 USA
[2] MIT, IMES, Cambridge, MA 02142 USA
[3] MIT, CSAIL, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Funding
National Science Foundation (USA);
Keywords
CAUSAL INFERENCE;
DOI
None available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
There is intense interest in applying machine learning to problems of causal inference in fields such as healthcare, economics and education. In particular, individual-level causal inference has important applications such as precision medicine. We give a new theoretical analysis and family of algorithms for predicting individual treatment effect (ITE) from observational data, under the assumption known as strong ignorability. The algorithms learn a "balanced" representation such that the induced treated and control distributions look similar, and we give a novel and intuitive generalization-error bound showing the expected ITE estimation error of a representation is bounded by a sum of the standard generalization-error of that representation and the distance between the treated and control distributions induced by the representation. We use Integral Probability Metrics to measure distances between distributions, deriving explicit bounds for the Wasserstein and Maximum Mean Discrepancy (MMD) distances. Experiments on real and simulated data show the new algorithms match or outperform the state-of-the-art.
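The bound described in the abstract motivates a training objective of the form "factual prediction loss + weight × distributional distance between treated and control representations." The sketch below is a minimal NumPy illustration of that idea, not the authors' implementation: the function names (`rbf_mmd2`, `balanced_objective`), the RBF kernel choice, and the trade-off weight `alpha` are all assumptions for illustration; the paper derives bounds for both MMD and Wasserstein IPMs.

```python
import numpy as np

def rbf_mmd2(X, Y, sigma=1.0):
    # Biased squared-MMD estimator between samples X and Y under an
    # RBF kernel; one possible IPM for measuring representation imbalance.
    def gram(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    return gram(X, X).mean() + gram(Y, Y).mean() - 2.0 * gram(X, Y).mean()

def balanced_objective(phi, t, y_pred, y_true, alpha=1.0):
    # Factual squared-error loss plus alpha times the discrepancy, in
    # representation space phi, between treated (t == 1) and control
    # (t == 0) units -- the two terms appearing in the stated bound.
    factual_loss = ((y_pred - y_true) ** 2).mean()
    imbalance = rbf_mmd2(phi[t == 1], phi[t == 0])
    return factual_loss + alpha * imbalance
```

In practice `phi` would be the output of a learned representation network and the objective would be minimized jointly over the representation and the outcome predictors; here it is evaluated on fixed arrays only to show the structure of the penalty.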
Pages: 10
Related papers
50 records in total
  • [1] Propositional lower bounds: Generalization and algorithms
    Cadoli, M
    Palopoli, L
    Scarcello, F
    LOGICS IN ARTIFICIAL INTELLIGENCE, 1998, 1489 : 355 - 367
  • [2] Generalization Bounds for Uniformly Stable Algorithms
    Feldman, Vitaly
    Vondrak, Jan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [3] Generalization Bounds for Some Ordinal Regression Algorithms
    Agarwal, Shivani
    ALGORITHMIC LEARNING THEORY, PROCEEDINGS, 2008, 5254 : 7 - 21
  • [4] Generalization Error Bounds for Noisy, Iterative Algorithms
    Pensia, Ankit
    Jog, Varun
    Loh, Po-Ling
    2018 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2018, : 546 - 550
  • [5] Infinite Kernel Learning: Generalization Bounds and Algorithms
    Liu, Yong
    Liao, Shizhong
    Lin, Hailun
    Yue, Yinliang
    Wang, Weiping
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2280 - 2286
  • [6] Generalization error bounds for Bayesian mixture algorithms
    Meir, R
    Zhang, T
    JOURNAL OF MACHINE LEARNING RESEARCH, 2004, 4 (05) : 839 - 860
  • [7] Generalization Bounds for Estimating Causal Effects of Continuous Treatments
    Wang, Xin
    Lyu, Shengfei
    Wu, Xingyu
    Wu, Tianhao
    Chen, Huanhuan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [8] Generalization Bounds for Ranking Algorithms via Algorithmic Stability
    Agarwal, Shivani
    Niyogi, Partha
    JOURNAL OF MACHINE LEARNING RESEARCH, 2009, 10 : 441 - 474
  • [9] Learning Additive Noise Channels: Generalization Bounds and Algorithms
    Weinberger, Nir
    2020 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2020, : 2586 - 2591
  • [10] Generalization Error Bounds for Optimization Algorithms via Stability
    Meng, Qi
    Wang, Yue
    Chen, Wei
    Wang, Taifeng
    Ma, Zhi-Ming
    Liu, Tie-Yan
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2336 - 2342