Approximate Gibbs sampler for efficient inference of hierarchical Bayesian models for grouped count data

Times Cited: 0
Authors
Yu, Jin-Zhu [1 ,2 ]
Baroud, Hiba [3 ,4 ]
Affiliations
[1] Univ Texas Arlington, Dept Civil Engn, Arlington, TX 76019 USA
[2] Univ Texas Arlington, Dept Ind Mfg & Syst Engn, Arlington, TX 76019 USA
[3] Vanderbilt Univ, Dept Civil & Environm Engn, Nashville, TN USA
[4] Vanderbilt Univ, Dept Comp Sci, Nashville, TN USA
Funding
U.S. National Science Foundation;
Keywords
Conditional conjugacy; approximate MCMC; Gaussian approximation; intractable likelihood; CHAIN MONTE-CARLO; PREDICTION; MCMC;
DOI
10.1080/00949655.2024.2364843
Chinese Library Classification
TP39 [Computer Applications];
Subject Classification Codes
081203; 0835;
Abstract
Hierarchical Bayesian Poisson regression models (HBPRMs) provide a flexible approach to modelling the relationship between predictors and count response variables. Applying HBPRMs to large-scale datasets requires efficient inference algorithms because of the high computational cost of inferring many model parameters via random sampling. Although Markov chain Monte Carlo (MCMC) algorithms are widely used for Bayesian inference, sampling with this class of algorithms is time-consuming for applications involving large-scale data and time-sensitive decision-making, partly due to the non-conjugacy of many models. To overcome this limitation, this research develops an approximate Gibbs sampler (AGS) that efficiently learns HBPRMs while maintaining inference accuracy. In the proposed sampler, the data likelihood is approximated with a Gaussian distribution so that the conditional posterior of the coefficients has a closed-form solution. Numerical experiments on real and synthetic datasets with small and large counts demonstrate the superior performance of AGS compared with a state-of-the-art sampling algorithm, especially for large datasets.
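The core idea described in the abstract — replacing the non-conjugate Poisson likelihood with a Gaussian surrogate so that the coefficient block can be drawn in closed form — can be sketched as follows. This is an illustrative approximation only (log-count pseudo-data with count-dependent precisions), not necessarily the exact Gaussian approximation used in the paper; the function name, prior variance `tau2`, and the single-block setup are assumptions for the sketch.

```python
import numpy as np

def approx_gibbs_poisson(X, y, n_iter=1000, tau2=10.0, seed=0):
    """Illustrative approximate-Gibbs step for Poisson regression.

    The exact likelihood y_i ~ Poisson(exp(x_i @ beta)) is non-conjugate
    with a Gaussian prior on beta. Here the log-counts are treated as
    approximately Gaussian,
        z_i = log(y_i + 0.5) ~ N(x_i @ beta, 1 / (y_i + 0.5)),
    which makes the conditional posterior of beta multivariate normal,
    so each sweep reduces to one closed-form Gaussian draw. In a full
    HBPRM this draw would be alternated with draws of the group-level
    hyperparameters; only the coefficient block is shown here.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    z = np.log(y + 0.5)              # Gaussian pseudo-response
    w = y + 0.5                      # approximate observation precisions
    prec = X.T @ (w[:, None] * X) + np.eye(p) / tau2   # posterior precision
    cov = np.linalg.inv(prec)
    mean = cov @ (X.T @ (w * z))     # closed-form posterior mean
    chol = np.linalg.cholesky(cov)
    # Draw n_iter samples from N(mean, cov).
    return mean + (chol @ rng.standard_normal((p, n_iter))).T
```

Because the surrogate likelihood is Gaussian, the expensive accept/reject machinery of generic MCMC disappears; the cost per sweep is dominated by one p-by-p factorization, which is what makes this style of sampler attractive for large grouped-count datasets.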
Pages: 3043-3062 (20 pages)