Dirichlet Process Gaussian Mixture Models: Choice of the Base Distribution

Authors
Dilan Görür
Carl Edward Rasmussen
Affiliations
[1] Gatsby Computational Neuroscience Unit, University College London
[2] Department of Engineering, University of Cambridge
[3] Max Planck Institute for Biological Cybernetics
Keywords
Bayesian nonparametrics; Dirichlet processes; Gaussian mixtures
Abstract
In the Bayesian mixture modeling framework it is possible to infer the necessary number of components to model the data, so there is no need to restrict the number of components explicitly. Nonparametric mixture models sidestep the problem of finding the "correct" number of mixture components by assuming infinitely many components. In this paper Dirichlet process mixture (DPM) models are cast as infinite mixture models, and inference using Markov chain Monte Carlo is described. The specification of the priors on the model parameters is often guided by mathematical and practical convenience. The primary goal of this paper is to compare the choice of conjugate and non-conjugate base distributions for a particular class of DPM models that is widely used in applications, the Dirichlet process Gaussian mixture model (DPGMM). We compare the computational efficiency and modeling performance of DPGMMs defined using conjugate and conditionally conjugate base distributions. We show that better density models can result from using a wider class of priors, with no or only a modest increase in computational effort.
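The abstract casts the DPM as an infinite mixture in which only finitely many components are ever occupied by a finite data set. This partition structure can be simulated directly through the Chinese restaurant process representation of the DP prior. The following NumPy sketch (function name and parameters are illustrative, not taken from the paper) shows why no explicit cap on the number of components is needed:

```python
import numpy as np

def crp_assignments(n, alpha, rng):
    """Sample cluster assignments for n points from a Chinese restaurant
    process with concentration alpha -- the partition implied by a DP prior."""
    assignments = [0]        # first point starts the first cluster
    counts = [1]             # occupancy of each existing cluster
    for i in range(1, n):
        # Point i joins existing cluster k with prob counts[k] / (i + alpha),
        # or opens a new cluster with prob alpha / (i + alpha).
        probs = np.array(counts + [alpha], dtype=float)
        probs /= i + alpha
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)   # new cluster created on demand
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments, counts

rng = np.random.default_rng(0)
assignments, counts = crp_assignments(200, alpha=1.0, rng=rng)
print(len(counts))  # occupied components: typically a handful, far fewer than 200
```

In a full DPGMM sampler, each occupied cluster would additionally carry Gaussian parameters drawn from the base distribution, and Gibbs updates would resample the assignments; the conjugacy of the base distribution determines whether those updates have closed form.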
Pages: 653-664 (11 pages)