Bayesian inference for finite mixtures of generalized linear models with random effects

Cited: 0
Authors
Peter J. Lenk
Wayne S. DeSarbo
Affiliations
[1] University of Michigan Business School,
[2] Pennsylvania State University
Source
Psychometrika | 2000, Vol. 65
Keywords
Bayesian inference; consumer behavior; finite mixtures; generalized linear models; heterogeneity; latent class analysis; Markov chain Monte Carlo
DOI: not available
Abstract
We present a hierarchical Bayes approach to modeling parameter heterogeneity in generalized linear models. The model assumes that there are relevant subpopulations and that, within each subpopulation, the individual-level regression coefficients follow a multivariate normal distribution. Because class membership is not known a priori, the heterogeneity in the regression coefficients becomes a finite mixture of normal distributions. This approach combines the flexibility of semiparametric latent class models, which assume common parameters within each subpopulation, with the parsimony of random effects models, which assume normal distributions for the regression parameters. The number of subpopulations is selected to maximize the posterior probability of the model being true. Simulations document the performance of the methodology on synthetic data with known heterogeneity and a known number of subpopulations. An application concerning preferences for various aspects of personal computers is presented.
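The generative structure described in the abstract can be sketched as a short simulation: class membership is drawn from the mixing proportions, each individual's regression coefficients are drawn from a class-specific multivariate normal, and responses follow a generalized linear model. The class count, mixing weights, means, covariance, and the choice of a logistic link below are illustrative assumptions for the sketch, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings (assumed, not from the paper):
# S latent classes; within class s, individual coefficients
# beta_i ~ N(mu_s, Sigma); binary responses via a logistic link.
S = 2
n_subjects, n_obs, p = 60, 25, 2
weights = np.array([0.6, 0.4])               # mixing proportions
mus = np.array([[2.0, -1.0], [-2.0, 1.0]])   # class-specific means
Sigma = 0.25 * np.eye(p)                     # within-class covariance

# Latent class membership for each subject, then individual-level
# coefficients drawn from that subject's class-specific normal.
z = rng.choice(S, size=n_subjects, p=weights)
betas = np.array([rng.multivariate_normal(mus[s], Sigma) for s in z])

# Generate binary responses through the logistic link of a GLM.
X = rng.normal(size=(n_subjects, n_obs, p))
eta = np.einsum("nop,np->no", X, betas)      # linear predictors
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

# Crude check: the two classes have distinct coefficient centroids,
# which the paper's MCMC procedure would attempt to recover.
centroids = np.array([betas[z == s].mean(axis=0) for s in range(S)])
print(centroids)
```

A posterior sampler for the full model would additionally draw the memberships, class means, covariances, and mixing proportions from their conditional distributions; this sketch only shows the data-generating side that such a sampler inverts.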
Pages: 93-119
Number of pages: 26
Related papers (50 in total)
  • [1] Bayesian inference for finite mixtures of generalized linear models with random effects
    Lenk, PJ
    DeSarbo, WS
    [J]. PSYCHOMETRIKA, 2000, 65 (01) : 93 - 119
  • [2] Bayesian inference for generalized linear mixed models
    Fong, Youyi
    Rue, Havard
    Wakefield, Jon
    [J]. BIOSTATISTICS, 2010, 11 (03) : 397 - 412
  • [3] Bayesian inference for sparse generalized linear models
    Seeger, Matthias
    Gerwinn, Sebastian
    Bethge, Matthias
    [J]. MACHINE LEARNING: ECML 2007, PROCEEDINGS, 2007, 4701 : 298 - +
  • [4] Bayesian inference for generalized linear models for spiking neurons
    Gerwinn, Sebastian
    Macke, Jakob H.
    Bethge, Matthias
    [J]. FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2010, 4
  • [5] Unified Bayesian Inference Framework for Generalized Linear Models
    Meng, Xiangming
    Wu, Sheng
    Zhu, Jiang
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2018, 25 (03) : 398 - 402
  • [6] Differentially Private Bayesian Inference for Generalized Linear Models
    Kulkarni, Tejas
    Jalko, Joonas
    Koskela, Antti
    Kaski, Samuel
    Honkela, Antti
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [7] Generalized linear mixed models with Gaussian mixture random effects: Inference and application
    Pan, Lanfeng
    Li, Yehua
    He, Kevin
    Li, Yanming
    Li, Yi
    [J]. JOURNAL OF MULTIVARIATE ANALYSIS, 2020, 175
  • [8] Bayesian inference for linear dynamic models with Dirichlet process mixtures
    Caron, Francois
    Davy, Manuel
    Doucet, Arnaud
    Duflos, Emmanuel
    Vanheeghe, Philippe
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2008, 56 (01) : 71 - 84
  • [9] Bayesian Modeling of Random Effects Covariance Matrix for Generalized Linear Mixed Models
    Lee, Keunbaik
    [J]. COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS, 2013, 20 (03) : 235 - 240
  • [10] Approximate Bayesian Inference in Spatial Generalized Linear Mixed Models
    Eidsvik, Jo
    Martino, Sara
    Rue, Havard
    [J]. SCANDINAVIAN JOURNAL OF STATISTICS, 2009, 36 (01) : 1 - 22