CCMI : Classifier based Conditional Mutual Information Estimation

Cited by: 0
Authors
Mukherjee, Sudipto [1 ]
Asnani, Himanshu [1 ]
Kannan, Sreeram [1 ]
Affiliations
[1] Univ Washington, Elect & Comp Engn, Seattle, WA 98195 USA
Funding
National Science Foundation (NSF);
Keywords
NETWORKS;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Conditional Mutual Information (CMI) is a measure of conditional dependence between random variables X and Y, given another random variable Z. It can be used to quantify conditional dependence among variables in many data-driven inference problems such as graphical models, causal learning, feature selection and time-series analysis. While k-nearest-neighbor (kNN) based estimators as well as kernel-based methods have been widely used for CMI estimation, they suffer severely from the curse of dimensionality. In this paper, we leverage advances in classifiers and generative models to design methods for CMI estimation. Specifically, we introduce an estimator for KL-Divergence based on the likelihood ratio by training a classifier to distinguish the observed joint distribution from the product distribution. We then show how to construct several CMI estimators using this basic divergence estimator by drawing ideas from conditional generative models. We demonstrate that the estimates from our proposed approaches do not degrade in performance with increasing dimension and obtain significant improvement over the widely used KSG estimator. Finally, as an application of accurate CMI estimation, we use our best estimator for conditional independence testing and achieve superior performance to the state-of-the-art tester on both simulated and real data-sets.
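The core idea in the abstract — estimating a KL-divergence (and hence mutual information) from the likelihood ratio of a classifier trained to distinguish joint samples from product-of-marginals samples — can be sketched as follows for the unconditional case, where I(X;Y) = KL(p(x,y) || p(x)p(y)). This is an illustrative reconstruction, not the authors' implementation: the quadratic feature map, the plain logistic-regression classifier, and the training settings are all assumptions, chosen because for jointly Gaussian (X, Y) the true log density ratio is a quadratic form, so this small classifier can represent it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
rho = 0.8  # correlation; ground truth I(X;Y) = -0.5 * log(1 - rho^2)

# Correlated Gaussian pair (X, Y)
x = rng.normal(size=n)
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=n)

joint = np.column_stack([x, y])                  # samples from p(x, y)
prod = np.column_stack([x, rng.permutation(y)])  # shuffle y -> samples from p(x) p(y)

def feats(a):
    """Quadratic features (assumed), sufficient for a Gaussian log density ratio."""
    u, v = a[:, 0], a[:, 1]
    return np.column_stack([np.ones_like(u), u, v, u * v, u**2, v**2])

# Binary classification: label 1 = joint sample, label 0 = product sample
F = np.vstack([feats(joint), feats(prod)])
t = np.concatenate([np.ones(n), np.zeros(n)])

# Logistic regression by batch gradient descent (no external dependencies)
w = np.zeros(F.shape[1])
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w -= 0.3 * F.T @ (p - t) / len(t)

# With balanced classes, the classifier logit estimates log p(x,y)/(p(x)p(y)).
# Averaging it over joint samples gives a plug-in estimate of KL = I(X;Y).
mi_est = (feats(joint) @ w).mean()
print(f"estimated I(X;Y) = {mi_est:.3f}, "
      f"true = {-0.5 * np.log(1 - rho**2):.3f}")
```

The same divergence estimator extends to CMI by comparing p(x, y, z) against p(x|z) p(y|z) p(z); the paper's contribution is constructing samplers for that conditional product distribution with conditional generative models, which this sketch omits.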
Pages: 1083-1093
Page count: 11
Related papers
50 records in total
  • [1] A double layer Bayesian classifier using conditional mutual information
    Sun, Jiangwen
    Wang, Chongjun
    Chen, Shifu
    DYNAMICS OF CONTINUOUS DISCRETE AND IMPULSIVE SYSTEMS-SERIES B-APPLICATIONS & ALGORITHMS, 2007, 14 : 213 - 217
  • [2] Adaptive Label Smoothing for Classifier-based Mutual Information Neural Estimation
    Wang, Xu
    Al-Bashabsheh, Ali
    Zhao, Chao
    Chan, Chung
    2021 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2021, : 1035 - 1040
  • [3] Markov chain order estimation with conditional mutual information
    Papapetrou, M.
    Kugiumtzis, D.
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2013, 392 (07) : 1593 - 1601
  • [4] Mutual information and conditional mean estimation in Poisson channels
    Guo, Dongning
    Shamai, Shlomo
    Verdu, Sergio
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2008, 54 (05) : 1837 - 1849
  • [5] Estimation of Mutual Information and Conditional Entropy for Surveillance Optimization
    Le, Duc H.
    Reynolds, Albert C.
    SPE JOURNAL, 2014, 19 (04): : 648 - 661
  • [6] Mutual information and conditional mean estimation in Poisson channels
    Guo, DN
    Verdú, S
    Shamai, S
    2004 IEEE INFORMATION THEORY WORKSHOP, PROCEEDINGS, 2004, : 265 - 270
  • [7] Conditional Mutual Information based Feature Selection
    Cheng, Hongrong
    Qin, Zhiguang
    Qian, Weizhong
    Liu, Wei
    KAM: 2008 INTERNATIONAL SYMPOSIUM ON KNOWLEDGE ACQUISITION AND MODELING, PROCEEDINGS, 2008, : 103 - 107
  • [8] Conditional Mutual Information Estimation for Mixed, Discrete and Continuous Data
    Mesner, Octavio Cesar
    Shalizi, Cosma Rohilla
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2021, 67 (01) : 464 - 484
  • [9] A Novel Mutual Information based Ant Colony Classifier
    Yu, Hang
    Qian, Xiaoxiao
    Yu, Yang
    Cheng, Jiujun
    Yu, Ying
    Gao, Shangce
    PROCEEDINGS OF 2017 IEEE INTERNATIONAL CONFERENCE ON PROGRESS IN INFORMATICS AND COMPUTING (PIC 2017), 2017, : 61 - 65
  • [10] Feature selection based on weighted conditional mutual information
    Zhou, Hongfang
    Wang, Xiqian
    Zhang, Yao
    APPLIED COMPUTING AND INFORMATICS, 2024, 20 (1/2) : 55 - 68