Bayesian model selection in finite mixtures by marginal density decompositions

Cited by: 66
Authors
Ishwaran, H
James, LF
Sun, JY
Affiliations
[1] Cleveland Clin Fdn, Dept Biostat & Epidemiol WB4, Cleveland, OH 44195 USA
[2] Johns Hopkins Univ, Dept Math Sci, Baltimore, MD 21218 USA
[3] Case Western Reserve Univ, Dept Stat, Cleveland, OH 44106 USA
Keywords
blocked Gibbs sampler; Dirichlet prior; generalized weighted Chinese restaurant; identification; partition; uniformly exponentially consistent test; weighted Bayes factor;
DOI
10.1198/016214501753382255
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
We consider the problem of estimating the number of components d and the unknown mixing distribution in a finite mixture model, where d is bounded by some fixed finite number N. Our approach places a prior over the space of mixing distributions with at most N components. By decomposing the resulting marginal density under this prior, we discover a weighted Bayes factor method for consistently estimating d that can be implemented by an i.i.d. generalized weighted Chinese restaurant (GWCR) Monte Carlo algorithm. We also discuss a Gibbs sampling method (the blocked Gibbs sampler) for estimating both d and the mixing distribution. We show that the resulting posterior is consistent and achieves the frequentist optimal O_p(n^{-1/4}) rate of estimation. We compare the performance of the new GWCR model selection procedure with that of the Akaike information criterion (AIC) and the Bayes information criterion (BIC) implemented through an EM algorithm. Applications of our methods to five real datasets and to simulations are considered.
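The paper's GWCR and blocked Gibbs algorithms are not reproduced here, but the baseline it benchmarks against is straightforward to illustrate: fit a finite Gaussian mixture by EM for each candidate number of components d up to the bound N, then pick the d minimizing AIC or BIC. Below is a minimal sketch of that baseline, assuming scikit-learn is available; the synthetic data, variable names, and the choice N = 5 are this example's, not the paper's.

```python
# Minimal sketch of the AIC/BIC-via-EM baseline the paper compares against
# (not the authors' GWCR procedure). Data and N = 5 are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Illustrative data: a two-component normal mixture
data = np.concatenate([rng.normal(-2.0, 1.0, 150),
                       rng.normal(3.0, 1.0, 100)]).reshape(-1, 1)

N = 5  # upper bound on the number of components
fits = [GaussianMixture(n_components=d, n_init=5, random_state=0).fit(data)
        for d in range(1, N + 1)]

bic = [m.bic(data) for m in fits]
aic = [m.aic(data) for m in fits]
d_bic = int(np.argmin(bic)) + 1  # estimated d under BIC
d_aic = int(np.argmin(aic)) + 1  # estimated d under AIC
print(f"BIC selects d = {d_bic}; AIC selects d = {d_aic}")
```

BIC penalizes model size more heavily than AIC, so it tends to select fewer components; contrasting such penalized-likelihood choices with the weighted Bayes factor is the kind of comparison the paper reports.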
Pages: 1316-1332
Page count: 17