On posterior contraction of parameters and interpretability in Bayesian mixture modeling

Cited by: 15
Authors
Guha, Aritra [1 ]
Ho, Nhat [2 ]
Nguyen, Xuanlong [3 ]
Affiliations
[1] Duke Univ, Dept Stat Sci, Durham, NC 27708 USA
[2] Univ Texas Austin, Dept Stat & Data Sci, Austin, TX 78712 USA
[3] Univ Michigan, Dept Stat, Ann Arbor, MI 48109 USA
Keywords
Mixture models; Wasserstein distance; Bayesian nonparametrics; Bayesian inference; misspecified models; post-processing algorithm; density estimation; convergence rates; unknown number; strong identifiability; Dirichlet mixtures; finite mixtures; mixing measures; Pitman-Yor; components; deconvolution
DOI
10.3150/20-BEJ1275
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
We study posterior contraction behaviors for parameters of interest in the context of Bayesian mixture modeling, where the number of mixing components is unknown while the model itself may or may not be correctly specified. Two representative types of prior specification are considered: one explicitly places a prior distribution on the number of mixture components, while the other places a nonparametric prior on the space of mixing distributions. The former is shown to yield an optimal rate of posterior contraction on the model parameters under minimal conditions, while the latter can be utilized to consistently recover the unknown number of mixture components, with the help of a fast probabilistic post-processing procedure. We then turn to the study of these Bayesian procedures in the realistic setting of model misspecification. It is shown that the modeling choice of kernel density functions plays perhaps the most impactful role in determining the posterior contraction rates under misspecification. Drawing on the concrete posterior contraction rates established in this paper, we highlight some of the interesting tradeoffs between model expressiveness and interpretability that a statistical modeler must negotiate in the rich world of mixture modeling.
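For readers scanning this record, the rates referred to in the abstract are stated for mixing measures under Wasserstein distances. As a reminder of the standard notation (assumed here, not quoted from the paper), for discrete mixing measures G = \sum_i p_i \delta_{\theta_i} and G' = \sum_j p'_j \delta_{\theta'_j}, the order-r Wasserstein distance and the generic form of a posterior contraction statement read

    W_r(G, G') \;=\; \Big( \inf_{q \in \mathcal{C}(p,\, p')} \sum_{i,j} q_{ij}\, \lVert \theta_i - \theta'_j \rVert^r \Big)^{1/r},

    \Pi\big( G : W_r(G, G_0) \ge M_n \varepsilon_n \,\big|\, X_1, \dots, X_n \big) \;\longrightarrow\; 0 \quad \text{in } P_{G_0}\text{-probability},

where \mathcal{C}(p, p') denotes the set of couplings of the weight vectors p and p', G_0 is the data-generating mixing measure, \varepsilon_n \to 0 is the contraction rate, and M_n \to \infty is any slowly growing sequence. In the misspecified case the target G_0 is replaced by a suitable projection of the truth onto the model, which the paper makes precise.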
Pages: 2159-2188
Number of pages: 30
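The "fast probabilistic post-processing procedure" mentioned in the abstract converts an over-fitted posterior draw of the mixing measure into an estimate of the number of components. The Python sketch below illustrates the generic idea behind such procedures (pool atoms that fall close together, then discard components with negligible weight); the function merge_truncate, its thresholds merge_radius and weight_floor, and the toy input are hypothetical illustrations, not the paper's exact algorithm or its theoretically calibrated tuning.

    import numpy as np

    def merge_truncate(atoms, weights, merge_radius, weight_floor):
        """Illustrative post-processing of one posterior draw of a mixing measure.

        atoms:        (k, d) array of component locations (atoms of the mixing measure)
        weights:      (k,) array of mixing weights summing to 1
        merge_radius: atoms closer than this are pooled into a single component
        weight_floor: pooled components lighter than this are discarded

        Returns the pooled atoms, renormalized weights, and the implied
        estimate of the number of mixture components.
        """
        order = np.argsort(weights)[::-1]          # visit heavy atoms first
        kept_atoms, kept_weights = [], []
        for i in order:
            a, w = atoms[i], weights[i]
            for j, b in enumerate(kept_atoms):
                if np.linalg.norm(a - b) <= merge_radius:
                    kept_weights[j] += w           # absorb into an existing heavy atom
                    break
            else:
                kept_atoms.append(a)               # no nearby heavy atom: keep as new one
                kept_weights.append(w)
        kept_atoms = np.array(kept_atoms)
        kept_weights = np.array(kept_weights)
        keep = kept_weights >= weight_floor        # truncate negligible components
        weights_out = kept_weights[keep]
        weights_out = weights_out / weights_out.sum()
        return kept_atoms[keep], weights_out, len(weights_out)

    # Example: an over-fitted draw that splits 2 well-separated components into 5 atoms
    atoms = np.array([[0.0], [0.05], [3.0], [2.97], [10.0]])
    weights = np.array([0.40, 0.15, 0.30, 0.13, 0.02])
    _, _, k_hat = merge_truncate(atoms, weights, merge_radius=0.5, weight_floor=0.05)
    print(k_hat)  # -> 2

Applying such a rule to each posterior draw yields a posterior distribution over the number of components; the paper's contribution is a procedure of this flavor with guarantees of consistency, for which the reader should consult the article itself.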