Approximate Gibbs sampler for efficient inference of hierarchical Bayesian models for grouped count data

Cited: 0
Authors
Yu, Jin-Zhu [1 ,2 ]
Baroud, Hiba [3 ,4 ]
Affiliations
[1] Univ Texas Arlington, Dept Civil Engn, Arlington, TX 76019 USA
[2] Univ Texas Arlington, Dept Ind Mfg & Syst Engn, Arlington, TX 76019 USA
[3] Vanderbilt Univ, Dept Civil & Environm Engn, Nashville, TN USA
[4] Vanderbilt Univ, Dept Comp Sci, Nashville, TN USA
Funding
US National Science Foundation;
Keywords
Conditional conjugacy; approximate MCMC; Gaussian approximation; intractable likelihood; CHAIN MONTE-CARLO; PREDICTION; MCMC;
DOI
10.1080/00949655.2024.2364843
Chinese Library Classification
TP39 [Computer Applications];
Subject Classification Codes
081203 ; 0835 ;
Abstract
Hierarchical Bayesian Poisson regression models (HBPRMs) provide a flexible approach to modelling the relationship between predictors and count response variables. Applying HBPRMs to large-scale datasets requires efficient inference algorithms because of the high computational cost of inferring many model parameters by random sampling. Although Markov Chain Monte Carlo (MCMC) algorithms have been widely used for Bayesian inference, sampling with this class of algorithms is time-consuming for applications with large-scale data and time-sensitive decision-making, partially due to the non-conjugacy of many models. To overcome this limitation, this research develops an approximate Gibbs sampler (AGS) to efficiently learn HBPRMs while maintaining inference accuracy. In the proposed sampler, the data likelihood is approximated with a Gaussian distribution so that the conditional posterior of the coefficients has a closed-form solution. Numerical experiments on real and synthetic datasets with small and large counts demonstrate the superior performance of AGS relative to a state-of-the-art sampling algorithm, especially for large datasets.
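The abstract's central idea, replacing the non-conjugate Poisson likelihood with a Gaussian approximation so the regression coefficients get a closed-form (multivariate normal) conditional posterior, can be illustrated with a minimal sketch. This is not the paper's exact algorithm; it assumes a standard IRLS-style working-response approximation (working response z = Xβ + (y − μ)/μ with precision weights μ) and a single-level model y_i ~ Poisson(exp(x_iᵀβ)) with prior β ~ N(0, τ²I). The function name `approximate_gibbs_poisson` is hypothetical.

```python
import numpy as np

def approximate_gibbs_poisson(X, y, n_iter=1000, tau2=100.0, seed=0):
    """Illustrative approximate Gibbs sampler for Poisson regression.

    At each iteration the Poisson likelihood is locally approximated by a
    Gaussian via the working response z = X @ beta + (y - mu) / mu with
    precision weights mu, making the conditional for beta multivariate
    normal (conjugate update). No Metropolis correction is applied, so the
    sampler is approximate, mirroring the AGS idea in the abstract.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    samples = np.empty((n_iter, p))
    for t in range(n_iter):
        mu = np.clip(np.exp(X @ beta), 1e-8, 1e8)  # current Poisson means
        z = X @ beta + (y - mu) / mu               # Gaussian working response
        W = mu                                     # per-observation precision
        # Conjugate normal update: posterior precision = X' W X + I / tau2
        prec = X.T @ (W[:, None] * X) + np.eye(p) / tau2
        cov = np.linalg.inv(prec)
        mean = cov @ (X.T @ (W * z))
        beta = rng.multivariate_normal(mean, cov)  # closed-form draw
        samples[t] = beta
    return samples
```

Because each draw is a direct sample from a Gaussian rather than an accept/reject step, the per-iteration cost is dominated by one p × p solve, which is what makes this style of sampler attractive for large datasets.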
Pages: 3043-3062
Page count: 20
Related Papers
50 records
  • [31] Efficient Hierarchical Bayesian Inference for Spatio-temporal Regression Models in Neuroimaging
    Hashemi, Ali
    Gao, Yijing
    Cai, Chang
    Ghosh, Sanjay
    Mueller, Klaus-Robert
    Nagarajan, Srikantan S.
    Haufe, Stefan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [32] Classical and Bayesian Inference for Income Distributions using Grouped Data
    Eckernkemper, Tobias
    Gribisch, Bastian
    OXFORD BULLETIN OF ECONOMICS AND STATISTICS, 2021, 83 (01) : 32 - 65
  • [33] Bayesian bootstrap analysis of doubly censored data using Gibbs sampler
    Kim, Y
    Lee, J
    Kim, J
    STATISTICA SINICA, 2005, 15 (04) : 969 - 980
  • [34] Unsupervised Grouped Axial Data Modeling via Hierarchical Bayesian Nonparametric Models With Watson Distributions
    Fan, Wentao
    Yang, Lin
    Bouguila, Nizar
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (12) : 9654 - 9668
  • [35] Hierarchical Bayesian Models for Small Area Estimation under Overdispersed Count Data
    Wulandari, Ita
    Notodiputro, Khairil Anwar
    Fitrianto, Anwar
    Kurnia, Anang
    ENGINEERING LETTERS, 2023, 31 (04) : 1333 - 1342
  • [36] Exact Bayesian inference for normal hierarchical models
    Everson, PJ
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2001, 68 (03) : 223 - 241
  • [37] Bayesian Inference in Threshold Models Using Gibbs Sampling
    Sorensen, DA
    Andersen, S
    Gianola, D
    Korsgaard, I
    GENETICS SELECTION EVOLUTION, 1995, 27 (03) : 229 - 249
  • [38] Discovering Inductive Bias with Gibbs Priors: A Diagnostic Tool for Approximate Bayesian Inference
    Rendsburg, Luca
    Kristiadi, Agustinus
    Hennig, Philipp
    von Luxburg, Ulrike
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [39] Efficient approximate inference in Bayesian networks with continuous variables
    Li, Chenzhao
    Mahadevan, Sankaran
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2018, 169 : 269 - 280
  • [40] Grouped Spherical Data Modeling Through Hierarchical Nonparametric Bayesian Models and Its Application to fMRI Data Analysis
    Fan, Wentao
    Yang, Lin
    Bouguila, Nizar
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 5566 - 5576