CCMI: Classifier based Conditional Mutual Information Estimation

Cited by: 0
Authors
Mukherjee, Sudipto [1]
Asnani, Himanshu [1]
Kannan, Sreeram [1]
Affiliation
[1] Univ Washington, Elect & Comp Engn, Seattle, WA 98195 USA
Funding
U.S. National Science Foundation
Keywords
NETWORKS
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Conditional Mutual Information (CMI) is a measure of conditional dependence between random variables X and Y, given another random variable Z. It can be used to quantify conditional dependence among variables in many data-driven inference problems such as graphical models, causal learning, feature selection and time-series analysis. While k-nearest neighbor (kNN) based estimators as well as kernel-based methods have been widely used for CMI estimation, they suffer severely from the curse of dimensionality. In this paper, we leverage advances in classifiers and generative models to design methods for CMI estimation. Specifically, we introduce an estimator for KL-Divergence based on the likelihood ratio by training a classifier to distinguish the observed joint distribution from the product distribution. We then show how to construct several CMI estimators using this basic divergence estimator by drawing ideas from conditional generative models. We demonstrate that the estimates from our proposed approaches do not degrade in performance with increasing dimension and obtain significant improvement over the widely used KSG estimator. Finally, as an application of accurate CMI estimation, we use our best estimator for conditional independence testing and achieve superior performance compared to the state-of-the-art tester on both simulated and real data-sets.
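The classifier-based divergence step described in the abstract can be sketched in a few lines. The code below is a minimal illustration, not the authors' implementation: it estimates (unconditional) mutual information by training a binary classifier to separate joint samples from product samples and averaging the log of the likelihood ratio implied by the classifier's predicted probabilities; the conditional estimators in the paper build on this step using conditional generative models. The function name estimate_mi_classifier, the use of scikit-learn's MLPClassifier, and the permutation trick for generating product samples are assumptions made for illustration.

```python
# Minimal sketch of a classifier-based divergence/MI estimate (assumed setup,
# not the paper's code): distinguish joint samples from product samples and
# read off the likelihood ratio from the classifier's probabilities.
import numpy as np
from sklearn.neural_network import MLPClassifier

def estimate_mi_classifier(x, y, hidden=(64, 64), seed=0):
    """Estimate I(X; Y) in nats from samples x of shape (n, dx) and y of shape (n, dy)."""
    rng = np.random.default_rng(seed)

    # "Joint" samples: observed (x, y) pairs drawn from p(x, y).
    joint = np.hstack([x, y])
    # "Product" samples: pair each x with an independently permuted y,
    # approximating draws from p(x) p(y).
    product = np.hstack([x, y[rng.permutation(len(y))]])

    data = np.vstack([joint, product])
    labels = np.concatenate([np.ones(len(joint), dtype=int),
                             np.zeros(len(product), dtype=int)])

    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=500, random_state=seed)
    clf.fit(data, labels)

    # The classifier's probability ratio p / (1 - p) approximates the likelihood
    # ratio p(x, y) / (p(x) p(y)); averaging its log over joint samples gives a
    # plug-in estimate of the KL divergence, i.e. the mutual information.
    p = np.clip(clf.predict_proba(joint)[:, 1], 1e-6, 1 - 1e-6)
    return float(np.mean(np.log(p / (1 - p))))

# Example: Y = X + noise with noise variance 0.25, so I(X; Y) = 0.5 * ln(5) ~ 0.80 nats.
rng = np.random.default_rng(1)
x = rng.normal(size=(2000, 1))
y = x + 0.5 * rng.normal(size=(2000, 1))
print(estimate_mi_classifier(x, y))
```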
Pages: 1083 - 1093
Number of pages: 11
Related Papers (50 in total)
  • [21] A New Singly Connected Network Classifier based on Mutual Information
    Thomas, Clifford S.
    Howie, Catherine A.
    Smith, Leslie S.
    INTELLIGENT DATA ANALYSIS, 2005, 9 (02) : 189 - 205
  • [22] A Fair Classifier Using Mutual Information
    Cho, Jaewoong
    Hwang, Gyeongjo
    Suh, Changho
    2020 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2020, : 2521 - 2526
  • [23] Conditional mutual information based feature selection for classification task
    Novovicova, Jana
    Somol, Petr
    Haindl, Michal
    Pudil, Pavel
    PROGRESS IN PATTERN RECOGNITION, IMAGE ANALYSIS AND APPLICATIONS, PROCEEDINGS, 2007, 4756 : 417 - 426
  • [24] FEATURE SELECTION ALGORITHM BASED ON CONDITIONAL DYNAMIC MUTUAL INFORMATION
    Wang Liping
    INTERNATIONAL JOURNAL ON SMART SENSING AND INTELLIGENT SYSTEMS, 2015, 8 (01) : 316 - 337
  • [25] Time-efficient estimation of conditional mutual information for variable selection in classification
    Todorov, Diman
    Setchi, Rossi
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2014, 72 : 105 - 127
  • [26] Markov chain order estimation with parametric significance tests of conditional mutual information
    Papapetrou, Maria
    Kugiumtzis, Dimitris
    SIMULATION MODELLING PRACTICE AND THEORY, 2016, 61 : 1 - 13
  • [27] Reconstructing Gene Regulation Network based on Conditional Mutual Information
    Yang, Bei
    Xu, Yaohui
    PROCEEDINGS OF THE 2017 INTERNATIONAL CONFERENCE ON MECHANICAL, ELECTRONIC, CONTROL AND AUTOMATION ENGINEERING (MECAE 2017), 2017, 61 : 39 - 42
  • [28] CONDITIONAL DYNAMIC MUTUAL INFORMATION-BASED FEATURE SELECTION
    Liu, Huawen
    Mo, Yuchang
    Zhao, Jianmin
    COMPUTING AND INFORMATICS, 2012, 31 (06) : 1193 - 1216
  • [29] Conditional independence testing based on a nearest-neighbor estimator of conditional mutual information
    Runge, Jakob
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018, 84
  • [30] Density estimation based on pointwise mutual information
    Inoue, Akimitsu
    ECONOMICS BULLETIN, 2016, 36 (02) : 1138+