Slice sampling mixture models

Cited by: 201
Authors
Kalli, Maria [2 ]
Griffin, Jim E. [1 ]
Walker, Stephen G. [1 ]
Affiliations
[1] Univ Kent, Inst Math Stat & Actuarial Sci, Canterbury, Kent, England
[2] Univ Kent, Ctr Hlth Serv Studies, Canterbury, Kent, England
Keywords
Dirichlet process; Markov chain Monte Carlo; Mixture model; Normalized weights; Slice sampler; Hazard function; Monte Carlo methods; Process hierarchical models; Dirichlet process prior; Priors
DOI
10.1007/s11222-009-9150-y
CLC number
TP301 [Theory, Methods]
Subject classification code
081202
Abstract
We propose a more efficient version of the slice sampler for Dirichlet process mixture models described by Walker (Commun. Stat., Simul. Comput. 36:45-54, 2007). This new sampler allows for the fitting of infinite mixture models with a wide range of prior specifications. To illustrate this flexibility we consider priors defined through infinite sequences of independent positive random variables. Two applications are considered: density estimation using mixture models and hazard function estimation. In each case we show how the slice-efficient sampler can be applied to make inference in the models. In the mixture case, two submodels are studied in detail. The first one assumes that the positive random variables are Gamma distributed and the second assumes that they are inverse-Gaussian distributed. Both priors have two hyperparameters and we consider their effect on the prior distribution of the number of occupied clusters in a sample. Extensive computational comparisons with alternative "conditional" simulation techniques for mixture models using the standard Dirichlet process prior and our new priors are made. The properties of the new priors are illustrated on a density estimation problem.
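To make the idea concrete, the following is a minimal sketch of the slice-sampler scheme the abstract builds on: latent uniform slice variables truncate the infinite stick-breaking mixture to a finite set of components per sweep. This toy version assumes a standard Dirichlet process prior with Normal kernels of known variance `sigma` and a Normal base measure (the paper treats more general priors and kernels); all function and parameter names here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sb_weights(v):
    """Stick-breaking weights w_k = v_k * prod_{j<k}(1 - v_j)."""
    return v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))

def slice_sample_dp_mixture(y, alpha=1.0, n_iter=200, sigma=1.0, tau=3.0):
    """Toy slice sampler for a DP mixture of N(mu_k, sigma^2) kernels
    with mu_k ~ N(0, tau^2).  Only finitely many sticks are instantiated
    per sweep: enough for the leftover mass to fall below min_i u_i."""
    n = len(y)
    z = np.zeros(n, dtype=int)        # component allocations
    mu = np.array([y.mean()])         # component means (grow as needed)
    v = rng.beta(1.0, alpha, size=1)  # stick-breaking fractions
    for _ in range(n_iter):
        w = sb_weights(v)
        # 1. latent slice variables: u_i | z_i, w ~ Uniform(0, w_{z_i})
        u = rng.uniform(0.0, w[z])
        # 2. instantiate new sticks until leftover mass < min(u)
        while 1.0 - w.sum() > u.min():
            v = np.append(v, rng.beta(1.0, alpha))
            mu = np.append(mu, rng.normal(0.0, tau))
            w = sb_weights(v)
        # 3. allocations drawn from the finite set {k : w_k > u_i}
        for i in range(n):
            ok = np.flatnonzero(w > u[i])
            logp = -0.5 * ((y[i] - mu[ok]) / sigma) ** 2
            p = np.exp(logp - logp.max())
            z[i] = ok[rng.choice(len(ok), p=p / p.sum())]
        # 4. conjugate updates for component means and stick fractions
        for k in range(len(mu)):
            nk = (z == k).sum()
            prec = nk / sigma**2 + 1.0 / tau**2
            mu[k] = rng.normal(y[z == k].sum() / sigma**2 / prec,
                               1.0 / np.sqrt(prec))
            v[k] = rng.beta(1.0 + nk, alpha + (z > k).sum())
    return z, mu

y = np.concatenate([rng.normal(-2.0, 1.0, 60), rng.normal(3.0, 1.0, 60)])
z, mu = slice_sample_dp_mixture(y)
print("occupied clusters:", len(np.unique(z)))
```

The key property is in step 2: because each `u_i` is strictly positive, the set of components with `w_k > u_i` is almost surely finite, so the infinite mixture is handled exactly without a fixed truncation.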
Pages: 93-105
Page count: 13
Related papers
50 in total
  • [31] Estimation and classification for finite mixture models under ranked set sampling
    Hatefi, Armin
    Jozani, Mohammad Jafari
    Ziou, Djemel
    [J]. STATISTICA SINICA, 2014, 24 (02) : 675 - 698
  • [32] Unsupervised learning for finite mixture models via modified Gibbs sampling
    Liu, Weifeng
    Han, Chongzhao
    Shi, Yong
    [J]. Hsi-An Chiao Tung Ta Hsueh/Journal of Xi'an Jiaotong University, 2009, 43 (02): : 15 - 19
  • [33] A Bayesian sampling framework for asymmetric generalized Gaussian mixture models learning
    Vemuri, Ravi Teja
    Azam, Muhammad
    Bouguila, Nizar
    Patterson, Zachary
    [J]. NEURAL COMPUTING & APPLICATIONS, 2022, 34 (17): : 14123 - 14134
  • [34] Gaussian mixture models for the optimal sparse sampling of offshore wind resource
    Marcille, Robin
    Thiebaut, Maxime
    Tandeo, Pierre
    Filipot, Jean-Francois
    [J]. WIND ENERGY SCIENCE, 2023, 8 (05) : 771 - 786
  • [35] Reunderstanding Slice Sampling as Parallel MCMC
    Tran, Khoa T.
    Ninness, Brett
    [J]. 2015 IEEE CONFERENCE ON CONTROL AND APPLICATIONS (CCA 2015), 2015, : 1197 - 1202
  • [36] Slice Sampling for Lattice Gaussian Distribution
    Wang, Zheng
    Ling, Cong
    [J]. 2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2019, : 2589 - 2593
  • [37] Geometric Convergence of Elliptical Slice Sampling
    Natarovskii, Viacheslav
    Rudolf, Daniel
    Sprungk, Bjoern
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [38] Pseudo-Marginal Slice Sampling
    Murray, Iain
    Graham, Matthew M.
    [J]. ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51 : 911 - 919
  • [39] Decomposed slice sampling for factorized distributions
    Wang, Jiachun
    Sun, Shiliang
    [J]. PATTERN RECOGNITION, 2020, 97
  • [40] Slice Sampling Particle Belief Propagation
    Mueller, Oliver
    Yang, Michael Ying
    Rosenhahn, Bodo
    [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2013, : 1129 - 1136