Approximate Gibbs sampler for efficient inference of hierarchical Bayesian models for grouped count data

Cited by: 0
Authors
Yu, Jin-Zhu [1 ,2 ]
Baroud, Hiba [3 ,4 ]
Affiliations
[1] Univ Texas Arlington, Dept Civil Engn, Arlington, TX 76019 USA
[2] Univ Texas Arlington, Dept Ind Mfg & Syst Engn, Arlington, TX 76019 USA
[3] Vanderbilt Univ, Dept Civil & Environm Engn, Nashville, TN USA
[4] Vanderbilt Univ, Dept Comp Sci, Nashville, TN USA
Funding
National Science Foundation (USA);
Keywords
Conditional conjugacy; approximate MCMC; Gaussian approximation; intractable likelihood; Markov chain Monte Carlo; prediction; MCMC;
DOI
10.1080/00949655.2024.2364843
Chinese Library Classification (CLC)
TP39 [applications of computers];
Discipline codes
081203; 0835;
Abstract
Hierarchical Bayesian Poisson regression models (HBPRMs) provide a flexible approach to modelling the relationship between predictors and count response variables. Applying HBPRMs to large-scale datasets requires efficient inference algorithms because inferring many model parameters by random sampling is computationally expensive. Although Markov chain Monte Carlo (MCMC) algorithms are widely used for Bayesian inference, sampling with this class of algorithms is time-consuming for applications involving large-scale data and time-sensitive decision-making, partly because many models lack conjugacy. To overcome this limitation, this research develops an approximate Gibbs sampler (AGS) that learns HBPRMs efficiently while maintaining inference accuracy. In the proposed sampler, the data likelihood is approximated by a Gaussian distribution so that the conditional posterior of the coefficients has a closed-form solution. Numerical experiments on real and synthetic datasets with both small and large counts demonstrate the superior performance of AGS relative to a state-of-the-art sampling algorithm, especially for large datasets.
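The approximation strategy described in the abstract can be sketched in a few lines: replace each Poisson observation with a Gaussian pseudo-observation so that, under a normal prior, the conditional posterior of the regression coefficients is itself Gaussian and can be sampled in closed form. The sketch below is illustrative only; the working response z_i = log(y_i + 0.5) with precision y_i + 0.5, the simple two-level hierarchy (coefficients beta with a N(mu, tau2*I) prior and a hypermean mu), and all parameter values are assumptions for demonstration, not the paper's exact formulation.

```python
import numpy as np

def approx_gibbs_poisson(X, y, n_iter=3000, burn=500, tau2=1.0, s2=100.0, seed=0):
    """Two-block Gibbs sampler with a Gaussian likelihood approximation.

    Each count y_i ~ Poisson(exp(x_i' beta)) is replaced by a Gaussian
    pseudo-observation z_i = log(y_i + 0.5) with precision w_i = y_i + 0.5
    (an assumed working-response form), so both conditional posteriors
    below are available in closed form.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    z = np.log(y + 0.5)                     # Gaussian working response
    w = y + 0.5                             # per-observation precision
    XtWX = (X * w[:, None]).T @ X           # precomputed sufficient statistics
    XtWz = X.T @ (w * z)
    mu = np.zeros(p)                        # hypermean of the coefficients
    samples = np.empty((n_iter, p))
    for t in range(n_iter):
        # beta | mu, z: Gaussian conditional under the N(mu, tau2*I) prior
        prec = np.eye(p) / tau2 + XtWX
        cov = np.linalg.inv(prec)
        mean = cov @ (mu / tau2 + XtWz)
        beta = rng.multivariate_normal(mean, cov)
        # mu | beta: conjugate normal update under the N(0, s2) hyperprior
        post_prec = 1.0 / s2 + 1.0 / tau2
        mu = (beta / tau2) / post_prec + rng.standard_normal(p) / np.sqrt(post_prec)
        samples[t] = beta
    return samples[burn:]

# Usage on synthetic counts: recover the generating coefficients.
rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
true_beta = np.array([2.0, 0.4])
y = rng.poisson(np.exp(X @ true_beta))
draws = approx_gibbs_poisson(X, y)
est = draws.mean(axis=0)
```

Because every conditional is Gaussian, no Metropolis accept/reject step or likelihood evaluation per iteration is needed, which is the source of the speed-up the abstract claims for large datasets.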
Pages: 3043-3062 (20 pages)