A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables

Citations: 7
Authors
Marrelec, Guillaume [1 ]
Messe, Arnaud [2 ]
Bellec, Pierre [3 ]
Affiliations
[1] UPMC Univ Paris 06, Univ Sorbonne, CNRS,INSERM, LIB, F-75013 Paris, France
[2] Univ Hamburg, Univ Med Ctr Hamburg Eppendorf, Dept Computat Neurosci, Hamburg, Germany
[3] Univ Montreal, Ctr Rech Inst Univ Geriatrie Montreal, Dept Informat & Rech Operationnelle, Montreal, PQ, Canada
Source
PLOS ONE | 2015, Vol. 10, Issue 9
Keywords
BRAIN; INTEGRATION; ORGANIZATION; DISTANCE; MATRICES
DOI
10.1371/journal.pone.0137278
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
The use of mutual information as a similarity measure in agglomerative hierarchical clustering (AHC) raises an important issue: some correction needs to be applied for the dimensionality of variables. In this work, we formulate the decision of merging dependent multivariate normal variables in an AHC procedure as a Bayesian model comparison. We found that the Bayesian formulation naturally shrinks the empirical covariance matrix towards a matrix set a priori (e.g., the identity), provides an automated stopping rule, and corrects for dimensionality using a term that scales up the measure as a function of the dimensionality of the variables. Moreover, the resulting log Bayes factor is asymptotically proportional to the plug-in estimate of mutual information, with an additive correction for dimensionality in agreement with the Bayesian information criterion. We investigated the behavior of these Bayesian alternatives to mutual information (in both exact and asymptotic forms) on simulated and real data. A first encouraging result was obtained on simulations: hierarchical clustering based on the log Bayes factor outperformed off-the-shelf clustering techniques, as well as raw and normalized mutual information, in terms of classification accuracy. On a toy example, the Bayesian approaches led to results similar to those of mutual information clustering techniques, with the advantage of automated thresholding. On real functional magnetic resonance imaging (fMRI) datasets measuring brain activity, the Bayesian approach identified clusters consistent with the established outcome of standard procedures. In this application, normalized mutual information behaved highly atypically, systematically favoring very large clusters. These initial experiments suggest that the proposed Bayesian alternatives to mutual information are a useful new tool for hierarchical clustering.
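The asymptotic relation stated in the abstract can be sketched in code. Under a multivariate normal model, the plug-in estimate of the mutual information between two blocks of variables follows from the log-determinants of sub-blocks of the sample covariance matrix, and a BIC-style penalty of (pq/2) log n (p and q the block dimensions, n the number of samples, pq the number of cross-covariance parameters freed by a merge) plays the role of the dimensionality correction. This is a minimal illustrative sketch of that approximation, not the paper's exact Bayesian implementation; the function names and the precise form of the score are assumptions.

```python
import numpy as np

def plugin_gaussian_mi(S, idx_x, idx_y):
    """Plug-in mutual information between two blocks of a multivariate
    normal vector, from the sample covariance matrix S:
    I(X;Y) = 0.5 * (log|S_X| + log|S_Y| - log|S_XY|)."""
    Sx = S[np.ix_(idx_x, idx_x)]
    Sy = S[np.ix_(idx_y, idx_y)]
    Sxy = S[np.ix_(idx_x + idx_y, idx_x + idx_y)]
    # slogdet returns (sign, log|det|); we only need the log-determinant.
    return 0.5 * (np.linalg.slogdet(Sx)[1]
                  + np.linalg.slogdet(Sy)[1]
                  - np.linalg.slogdet(Sxy)[1])

def bic_merge_score(S, n, idx_x, idx_y):
    """BIC-style approximation to the log Bayes factor for merging two
    clusters: n * MI minus a penalty of (p*q/2) * log(n) for the p*q
    cross-covariance parameters of the merged model (assumed form)."""
    p, q = len(idx_x), len(idx_y)
    return n * plugin_gaussian_mi(S, idx_x, idx_y) - 0.5 * p * q * np.log(n)
```

In this sketch, a positive score favors merging the two blocks, which is one way to realize the automated stopping rule mentioned in the abstract: AHC stops when no candidate merge has a positive score.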
Pages: 26