CCMI : Classifier based Conditional Mutual Information Estimation

Cited by: 0
Authors
Mukherjee, Sudipto [1 ]
Asnani, Himanshu [1 ]
Kannan, Sreeram [1 ]
Affiliations
[1] University of Washington, Electrical & Computer Engineering, Seattle, WA 98195, USA
Funding
US National Science Foundation (NSF)
Keywords
NETWORKS
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Conditional Mutual Information (CMI) is a measure of conditional dependence between random variables X and Y, given another random variable Z. It can be used to quantify conditional dependence among variables in many data-driven inference problems such as graphical models, causal learning, feature selection and time-series analysis. While k-nearest neighbor (kNN) based estimators as well as kernel-based methods have been widely used for CMI estimation, they suffer severely from the curse of dimensionality. In this paper, we leverage advances in classifiers and generative models to design methods for CMI estimation. Specifically, we introduce an estimator for KL-Divergence based on the likelihood ratio by training a classifier to distinguish the observed joint distribution from the product distribution. We then show how to construct several CMI estimators using this basic divergence estimator by drawing ideas from conditional generative models. We demonstrate that the estimates from our proposed approaches do not degrade in performance with increasing dimension and obtain significant improvement over the widely used KSG estimator. Finally, as an application of accurate CMI estimation, we use our best estimator for conditional independence testing and outperform the state-of-the-art tester on both simulated and real datasets.
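To make the classifier-based likelihood-ratio idea in the abstract concrete, here is a minimal sketch, not the authors' released implementation: it estimates mutual information by training a binary classifier to separate joint samples from permuted (product-of-marginals) samples, reads off the log-odds as an approximate log likelihood ratio, and builds a CMI estimate from the chain-rule difference I(X;Y|Z) = I(X;(Y,Z)) - I(X;Z). The use of scikit-learn's MLPClassifier, the within-sample permutation, and all function names are illustrative assumptions.

```python
# Sketch of classifier-based MI/CMI estimation in the spirit of CCMI.
# Assumptions (not from the paper's code): sklearn MLPClassifier as the
# classifier, a simple permutation of b to approximate the product
# distribution, and the difference-of-MIs construction for CMI.

import numpy as np
from sklearn.neural_network import MLPClassifier

def classifier_mi(a, b):
    """Estimate I(A;B) = KL(p(a,b) || p(a)p(b)) with a binary classifier.

    Joint samples (a_i, b_i) get label 1; samples with b independently
    permuted (approximating p(a)p(b)) get label 0. For balanced classes,
    the classifier's log-odds approximate the log likelihood ratio, whose
    mean under the joint is the KL divergence.
    """
    rng = np.random.default_rng(0)
    n = a.shape[0]
    joint = np.hstack([a, b])
    prod = np.hstack([a, b[rng.permutation(n)]])  # shuffle b to break dependence
    X = np.vstack([joint, prod])
    y = np.concatenate([np.ones(n), np.zeros(n)])
    clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500)
    clf.fit(X, y)
    # For clarity this sketch evaluates on the training samples; a held-out
    # split would be used in practice.
    p = np.clip(clf.predict_proba(joint)[:, 1], 1e-6, 1 - 1e-6)
    return float(np.mean(np.log(p / (1 - p))))  # E_joint[log ratio] ~ KL

def classifier_cmi(x, y, z):
    """I(X;Y|Z) via the chain-rule difference I(X;(Y,Z)) - I(X;Z)."""
    return classifier_mi(x, np.hstack([y, z])) - classifier_mi(x, z)

# Toy usage: X and Y depend on each other only through Z,
# so the CMI estimate should be close to 0.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    z = rng.normal(size=(5000, 1))
    x = z + 0.3 * rng.normal(size=(5000, 1))
    y = z + 0.3 * rng.normal(size=(5000, 1))
    print("CMI estimate:", classifier_cmi(x, y, z))
```

The paper itself goes further, e.g. using conditional generative models rather than a raw permutation to sample from the product distribution, which is what gives its estimators their robustness in higher dimensions.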
Pages: 1083-1093 (11 pages)