Expectation-maximization algorithms for inference in Dirichlet processes mixture

Cited by: 0

Authors
T. Kimura
T. Tokuda
Y. Nakada
T. Nokajima
T. Matsumoto
A. Doucet
Affiliations
[1] Waseda University,Department of Electrical Engineering and Bioscience
[2] University of British Columbia,Departments of Computer Science and Statistics
[3] Aoyama Gakuin University,College of Science and Engineering
Source
Pattern Analysis and Applications, 2013, 16(1): 55-67
Keywords
Clustering; Dirichlet processes; Expectation-maximization; Finite mixture models
DOI
Not available
Abstract
Mixture models are ubiquitous in applied science. In many real-world applications, the number of mixture components needs to be estimated from the data. A popular approach uses information criteria to perform model selection. Another approach, which has become very popular over the past few years, uses Dirichlet process mixture (DPM) models. Both approaches are computationally intensive: information criteria require computing the maximum-likelihood parameter estimates for each candidate model, whereas DPMs are usually trained using Markov chain Monte Carlo (MCMC) or variational Bayes (VB) methods. We propose here original batch and recursive expectation-maximization algorithms to estimate the parameters of DPMs. The performance of our algorithms is demonstrated on several applications, including image segmentation and image classification tasks. Our algorithms are computationally much more efficient than MCMC and VB, and outperform VB on an example.
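The abstract contrasts EM with MCMC and VB for fitting mixtures. As background only (this is not the paper's DPM algorithm, which additionally infers the effective number of components), a minimal batch EM for a one-dimensional Gaussian mixture with a fixed number of components K might look like the sketch below; the function name, quantile-based initialization, and iteration count are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def em_gmm(X, K, n_iter=50):
    """Batch EM for a one-dimensional Gaussian mixture with K components.

    Illustrative sketch only: the paper derives EM updates for Dirichlet
    process mixtures, where the number of components is inferred from the
    data; here K is fixed in advance.
    """
    n = len(X)
    w = np.full(K, 1.0 / K)                        # mixing weights
    mu = np.quantile(X, np.linspace(0.1, 0.9, K))  # spread-out initial means
    var = np.full(K, np.var(X))                    # initial variances
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to
        # w_k * N(x_i | mu_k, var_k), computed in log space for stability.
        diff = X[:, None] - mu[None, :]
        logp = np.log(w) - 0.5 * (np.log(2 * np.pi * var) + diff**2 / var)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form re-estimation from the responsibility-weighted data.
        Nk = r.sum(axis=0)
        w = Nk / n
        mu = (r * X[:, None]).sum(axis=0) / Nk
        var = (r * (X[:, None] - mu)**2).sum(axis=0) / Nk
    return w, mu, var
```

Each iteration alternates an E-step (soft assignment of points to components) with an M-step (closed-form weighted parameter updates); the paper's contribution is extending this style of update to the DPM setting, avoiding the sampling of MCMC and the factorized approximation of VB.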
Pages: 55-67 (12 pages)
Related papers (50 total)
  • [1] Expectation-maximization algorithms for inference in Dirichlet processes mixture
    Kimura, T.
    Tokuda, T.
    Nakada, Y.
    Nokajima, T.
    Matsumoto, T.
    Doucet, A.
    [J]. PATTERN ANALYSIS AND APPLICATIONS, 2013, 16 (01) : 55 - 67
  • [2] Message passing expectation-maximization algorithms
    O'Sullivan, Joseph A.
    [J]. 2005 IEEE/SP 13th Workshop on Statistical Signal Processing (SSP), Vols 1 and 2, 2005, : 781 - 786
  • [3] Parameter Estimation for Gaussian Mixture Processes based on Expectation-Maximization Method
    Xia, Xue
    Zhang, Xuebo
    Chen, Xiaohui
    [J]. PROCEEDINGS OF THE 2016 4TH INTERNATIONAL CONFERENCE ON MACHINERY, MATERIALS AND INFORMATION TECHNOLOGY APPLICATIONS, 2016, 71 : 519 - 523
  • [4] ON THE BEHAVIOR OF THE EXPECTATION-MAXIMIZATION ALGORITHM FOR MIXTURE MODELS
    Barazandeh, Babak
    Razaviyayn, Meisam
    [J]. 2018 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2018), 2018, : 61 - 65
  • [5] Quantum Expectation-Maximization for Gaussian mixture models
    Kerenidis, Iordanis
    Luongo, Alessandro
    Prakash, Anupam
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [6] An Improved Mixture Model of Gaussian Processes and Its Classification Expectation-Maximization Algorithm
    Xie, Yurong
    Wu, Di
    Qiang, Zhe
    [J]. MATHEMATICS, 2023, 11 (10)
  • [7] Accelerating Expectation-Maximization Algorithms with Frequent Updates
    Yin, Jiangtao
    Zhang, Yanfeng
    Gao, Lixin
    [J]. 2012 IEEE INTERNATIONAL CONFERENCE ON CLUSTER COMPUTING (CLUSTER), 2012, : 275 - 283
  • [8] Expectation-Maximization for Learning Determinantal Point Processes
    Gillenwater, Jennifer
    Kulesza, Alex
    Fox, Emily
    Taskar, Ben
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [9] Accelerated distributed expectation-maximization algorithms for the parameter estimation in multivariate Gaussian mixture models
    Guo, Guangbao
    Wang, Qian
    Allison, James
    Qian, Guoqi
    [J]. APPLIED MATHEMATICAL MODELLING, 2025, 137
  • [10] Expectation-Maximization for Adaptive Mixture Models in Graph Optimization
    Pfeifer, Tim
    Protzel, Peter
    [J]. 2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 3151 - 3157