Distributed Inference for Dirichlet Process Mixture Models

Cited: 0
Authors
Ge, Hong [1]
Chen, Yutian [1]
Wan, Moquan [1]
Ghahramani, Zoubin [1]
Affiliations
[1] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
Funding
Wellcome Trust (UK); Engineering and Physical Sciences Research Council (UK)
Keywords
(none listed)
DOI
(none available)
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Bayesian nonparametric mixture models based on the Dirichlet process (DP) have been widely used for problems such as clustering, density estimation, and topic modelling. These models make weak assumptions about the process that generated the observed data, so their complexity can grow as more data are collected. These theoretical properties often lead to better predictive performance than traditional finite mixture models. However, despite the increasing amount of data available, applications of Bayesian nonparametric mixture models have so far been limited to relatively small data sets. In this paper, we propose an efficient distributed inference algorithm for DP and hierarchical Dirichlet process (HDP) mixture models. The proposed method is based on a variant of the slice sampler for DPs. Because this sampler involves no pre-determined truncation, it introduces no truncation bias and its stationary distribution is the exact posterior. We provide both local thread-level and distributed machine-level parallel implementations, and study the sampler's performance through an extensive set of experiments on image and text data. Compared to existing inference algorithms, the proposed method exhibits state-of-the-art accuracy and strong scalability with up to 512 cores.
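The key idea behind the truncation-free slice sampler mentioned in the abstract is that the number of stick-breaking weights instantiated in each iteration adapts to an auxiliary slice variable, rather than being fixed in advance. The following minimal sketch illustrates that adaptive stick-breaking step; it is an illustration of the general technique, not the authors' implementation, and the function name and `u_min` threshold parameter are hypothetical.

```python
import numpy as np

def stick_breaking_slice(alpha, u_min, rng=None):
    """Lazily sample DP stick-breaking weights: keep breaking sticks
    until the remaining mass drops below the smallest slice variable
    u_min, so the effective truncation level adapts per iteration.
    (Hypothetical helper for illustration, not the paper's code.)"""
    rng = np.random.default_rng() if rng is None else rng
    weights = []
    remaining = 1.0  # mass not yet assigned to any stick
    while remaining > u_min:
        v = rng.beta(1.0, alpha)          # stick proportion ~ Beta(1, alpha)
        weights.append(remaining * v)     # weight of the next component
        remaining *= 1.0 - v              # mass left for later components
    return np.array(weights)

# Only components with weight above the slice threshold need be
# represented, so the sampler remains exact without a fixed truncation.
w = stick_breaking_slice(alpha=2.0, u_min=1e-3, rng=np.random.default_rng(0))
```

In a full slice sampler, `u_min` would be the minimum of per-datum uniform slice variables, and each datum would then be assigned only among the components whose weight exceeds its slice variable.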
Pages: 2276-2284 (9 pages)
Related Papers (50 total)
  • [1] Distributed MCMC Inference in Dirichlet Process Mixture Models Using Julia
    Dinari, Or; Yu, Angel; Freifeld, Oren; Fisher, John W., III
    2019 19th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGRID), 2019: 518-525
  • [2] Fast Bayesian Inference in Dirichlet Process Mixture Models
    Wang, Lianming; Dunson, David B.
    Journal of Computational and Graphical Statistics, 2011, 20(1): 196-216
  • [3] Scalable Estimation of Dirichlet Process Mixture Models on Distributed Data
    Wang, Ruohui; Lin, Dahua
    Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, 2017: 4632-4639
  • [4] Performance Comparison of Julia Distributed Implementations of Dirichlet Process Mixture Models
    Huang, Ruizhu; Xu, Weijia; Wang, Yinzhi; Liverani, Silvia; Stapleton, Ann E.
    2019 IEEE International Conference on Big Data (Big Data), 2019: 3350-3354
  • [5] Adaptive Low-Complexity Sequential Inference for Dirichlet Process Mixture Models
    Tsiligkaridis, Theodoros; Forsythe, Keith W.
    Advances in Neural Information Processing Systems 28 (NIPS 2015), 2015, 28
  • [6] High Dimensional Data Clustering by means of Distributed Dirichlet Process Mixture Models
    Meguelati, Khadidja; Fontez, Benedicte; Hilgert, Nadine; Masseglia, Florent
    2019 IEEE International Conference on Big Data (Big Data), 2019: 890-899
  • [7] Distributed Collapsed Gibbs Sampler for Dirichlet Process Mixture Models in Federated Learning
    Khoufache, Reda; Lebbah, Mustapha; Azzag, Hanene; Goffinet, Etienne; Bouchaffra, Djamel
    Proceedings of the 2024 SIAM International Conference on Data Mining (SDM), 2024: 815-823
  • [8] Estimating mixture of Dirichlet process models
    MacEachern, S. N.; Muller, P.
    Journal of Computational and Graphical Statistics, 1998, 7(2): 223-238
  • [9] A computational approach for full nonparametric Bayesian inference under Dirichlet process mixture models
    Gelfand, A. E.; Kottas, A.
    Journal of Computational and Graphical Statistics, 2002, 11(2): 289-305
  • [10] Mean field inference for the Dirichlet process mixture model
    Zobay, O.
    Electronic Journal of Statistics, 2009, 3: 507-545