A Hybrid Scan Gibbs Sampler for Bayesian Models with Latent Variables

Cited by: 1
Authors
Backlund, Grant [1 ]
Hobert, James P. [1 ]
Jung, Yeun Ji [2 ]
Khare, Kshitij [1 ]
Affiliations
[1] Univ Florida, Dept Stat, Gainesville, FL 32611 USA
[2] JP Morgan Chase & Co, Model Governance Grp, New York, NY 10172 USA
Keywords
Geometric ergodicity; heavy-tailed errors; linear mixed model; Markov chain Monte Carlo; sandwich algorithm; shrinkage prior; CHAIN MONTE-CARLO; DATA AUGMENTATION ALGORITHM; MARKOV-CHAINS; DISTRIBUTIONS; REGRESSION; SCHEMES; BOUNDS;
DOI
10.1214/20-STS788
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Gibbs sampling is a widely used Markov chain Monte Carlo algorithm for analyzing intractable posterior distributions associated with Bayesian hierarchical models. There are two standard versions of the Gibbs sampler: the systematic scan (SS) version, in which every variable is updated at each iteration, and the random scan (RS) version, in which a single, randomly selected variable is updated at each iteration. The literature comparing the theoretical properties of SS and RS Gibbs samplers is reviewed, and an alternative hybrid scan Gibbs sampler is introduced that is particularly well suited to Bayesian models with latent variables. The word "hybrid" reflects the fact that the scan used within this algorithm has both systematic and random elements: at each iteration, the entire set of latent variables is updated, along with a randomly chosen block of the remaining variables. The hybrid scan (HS) Gibbs sampler has important advantages over the two standard scan Gibbs samplers. First, the HS algorithm is often easier to analyze from a theoretical standpoint. In particular, it can be much easier to establish the geometric ergodicity of an HS Gibbs Markov chain than to do the same for the corresponding SS and RS versions. Second, the sandwich methodology developed in Ann. Statist. 36 (2008) 532-554, which is also reviewed, can be applied to the HS Gibbs algorithm (but not to the standard scan Gibbs samplers). It is shown that, under weak regularity conditions, adding sandwich steps to the HS Gibbs sampler always results in a theoretically superior algorithm. Three Bayesian hierarchical models of varying complexity are used to illustrate the results. One is a simple location-scale model for data from the Student's t distribution, which serves as a pedagogical tool. The other two are sophisticated yet practical Bayesian regression models.
Pages: 379-399
Number of pages: 21
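
To make the hybrid scan described in the abstract concrete, here is a minimal sketch of an HS Gibbs sampler for the pedagogical Student's t location-scale example. The model specification (the usual normal/gamma scale-mixture representation, with a flat prior on mu and a 1/sigma^2 prior on sigma^2), the two-block random scan over (mu, sigma^2), and all function and variable names are illustrative assumptions rather than the paper's exact construction; the sandwich step is omitted.

```python
import numpy as np


def hybrid_scan_gibbs(y, nu, n_iter=5000, mu=0.0, sig2=1.0, seed=0):
    """Illustrative hybrid scan Gibbs sampler for the Student's t
    location-scale model written as a normal/gamma scale mixture:

        y_i | mu, sig2, w_i ~ N(mu, sig2 / w_i),   w_i ~ Gamma(nu/2, nu/2),

    with the (assumed) improper prior pi(mu, sig2) proportional to 1/sig2.
    Each iteration updates the entire latent vector w (systematic part) and
    then exactly one of the blocks mu or sig2, chosen at random (random part).
    """
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    n = y.size
    draws = np.empty((n_iter, 2))

    for t in range(n_iter):
        # Systematic part: refresh every latent weight w_i from its
        # conditional Gamma((nu + 1)/2, rate = (nu + (y_i - mu)^2 / sig2) / 2).
        rate = (nu + (y - mu) ** 2 / sig2) / 2.0
        w = rng.gamma(shape=(nu + 1.0) / 2.0, scale=1.0 / rate)

        # Random part: update one of the two remaining blocks, chosen
        # uniformly at random, from its full conditional.
        if rng.random() < 0.5:
            # mu | w, sig2, y ~ N( sum(w*y)/sum(w), sig2/sum(w) )
            sw = w.sum()
            mu = rng.normal(loc=np.dot(w, y) / sw, scale=np.sqrt(sig2 / sw))
        else:
            # sig2 | w, mu, y ~ Inverse-Gamma( n/2, sum(w*(y - mu)^2)/2 )
            b = np.dot(w, (y - mu) ** 2) / 2.0
            sig2 = 1.0 / rng.gamma(shape=n / 2.0, scale=1.0 / b)

        draws[t] = (mu, sig2)

    return draws


# Example on simulated heavy-tailed data (illustrative only).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = 5.0 + 2.0 * rng.standard_t(df=4, size=200)
    draws = hybrid_scan_gibbs(y, nu=4.0, n_iter=5000)
    print("posterior means (mu, sig2):", draws[1000:].mean(axis=0))
```

In the sandwich variant discussed in the abstract, an additional inexpensive move on the latent weights w, invariant with respect to their marginal distribution, would be inserted between the systematic and random parts of each iteration; it is left out of this sketch for brevity.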