Collapsed Variational Dirichlet Process Mixture Models

Cited by: 0
Authors
Kurihara, Kenichi [1 ]
Welling, Max [2 ]
Teh, Yee Whye [3 ]
Affiliations
[1] Tokyo Inst Technol, Dept Comp Sci, Tokyo, Japan
[2] UC Irvine, Dept Comp Sci, Irvine, CA USA
[3] Natl Univ Singapore, Dept Comp Sci, Singapore, Singapore
Funding
U.S. National Science Foundation;
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Nonparametric Bayesian mixture models, in particular Dirichlet process (DP) mixture models, have shown great promise for density estimation and data clustering. Given the size of today's datasets, computational efficiency becomes an essential ingredient in the applicability of these techniques to real-world data. We study and experimentally compare a number of variational Bayesian (VB) approximations to the DP mixture model. In particular, we consider the standard VB approximation, where parameters are assumed to be independent of cluster assignment variables, and a novel collapsed VB approximation, where mixture weights are marginalized out. For both VB approximations we consider two different ways to approximate the DP: by truncating the stick-breaking construction, and by using a finite mixture model with a symmetric Dirichlet prior.
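The two finite approximations of the DP prior mentioned in the abstract can be sketched in standard, generic notation (a reader's aid, not reproduced from the paper itself). Truncating the stick-breaking construction at level T gives
\[
v_k \sim \mathrm{Beta}(1, \alpha), \quad k = 1, \dots, T-1, \qquad v_T = 1, \qquad
\pi_k = v_k \prod_{j < k} (1 - v_j),
\]
while the finite mixture with a symmetric Dirichlet prior over K components places
\[
(\pi_1, \dots, \pi_K) \sim \mathrm{Dirichlet}(\alpha/K, \dots, \alpha/K),
\]
which approaches the DP mixture as K grows. The collapsed VB approximation referred to above marginalizes these mixture weights out before optimizing the variational bound.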
Pages: 2796-2801
Page count: 6