What are the advantages of MCMC based inference in latent variable models?

Cited by: 25
Author
Paap, R. [1]
Affiliation
[1] Erasmus Univ, Rotterdam Inst Business Econ Studies, Rotterdam, Netherlands
Keywords
Bayesian analysis; Gibbs sampler; Metropolis-Hastings sampler; data augmentation
DOI
10.1111/1467-9574.00060
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Recent developments in Markov chain Monte Carlo (MCMC) methods have increased the popularity of Bayesian inference in many fields of research in economics, such as marketing research and financial econometrics. Gibbs sampling combined with data augmentation allows inference in statistical/econometric models with many unobserved variables. The likelihood functions of these models may contain many integrals, which often makes a standard classical analysis difficult or even infeasible. The advantage of the Bayesian approach using MCMC is that one only has to consider the likelihood function conditional on the unobserved variables. In many cases this implies that Bayesian parameter estimation is faster than classical maximum likelihood estimation. In this paper we illustrate the computational advantages of Bayesian estimation using MCMC in several popular latent variable models.
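The abstract's key idea — conditioning on the unobserved variables so that each Gibbs step is a standard draw — can be sketched with the classic data-augmentation scheme for the binary probit model (in the style of Albert and Chib). This is not code from the paper; it is a minimal illustration on simulated data, with the latent utilities z playing the role of the "unobserved variables" and a flat prior assumed on the coefficients:

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

# Simulate probit data: y_i = 1{x_i' beta + eps_i > 0}, eps_i ~ N(0, 1)
n, beta_true = 500, np.array([0.5, -1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

# Gibbs sampler with data augmentation (flat prior on beta)
XtX_inv = np.linalg.inv(X.T @ X)
chol = np.linalg.cholesky(XtX_inv)
beta, draws = np.zeros(2), []
for it in range(2000):
    # Step 1: draw latent z_i | beta, y_i from a truncated normal;
    # bounds are in standardized units, so lower bound 0 becomes -mu.
    mu = X @ beta
    lo = np.where(y == 1, -mu, -np.inf)   # z_i > 0 when y_i = 1
    hi = np.where(y == 1, np.inf, -mu)    # z_i <= 0 when y_i = 0
    z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
    # Step 2: draw beta | z from its conditional normal -- a standard
    # linear-regression update, since z is Gaussian given beta.
    beta = XtX_inv @ (X.T @ z) + chol @ rng.normal(size=2)
    if it >= 500:                          # discard burn-in draws
        draws.append(beta)

post_mean = np.mean(draws, axis=0)
```

Conditional on z, the probit likelihood reduces to that of a Gaussian linear regression, so no integral over the latent variables ever needs to be evaluated; each sweep consists only of truncated-normal and normal draws.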
Pages: 2-22 (21 pages)
Related Articles
50 items in total
  • [21] Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
    Gal, Yarin; van der Wilk, Mark; Rasmussen, Carl E.
    Advances in Neural Information Processing Systems 27 (NIPS 2014), 2014
  • [22] Pseudo-marginal Bayesian inference for Gaussian process latent variable models
    Gadd, C.; Wade, S.; Shah, A. A.
    Machine Learning, 2021, 110(6): 1105-1143
  • [24] Generalised Gaussian Process Latent Variable Models (GPLVM) with Stochastic Variational Inference
    Lalchand, Vidhi; Ravuri, Aditya; Lawrence, Neil D.
    International Conference on Artificial Intelligence and Statistics, Vol. 151, 2022
  • [25] Mean-field variational approximate Bayesian inference for latent variable models
    Consonni, Guido; Marin, Jean-Michel
    Computational Statistics & Data Analysis, 2007, 52(2): 790-798
  • [26] Inference for dynamic and latent variable models via iterated, perturbed Bayes maps
    Ionides, Edward L.; Nguyen, Dao; Atchade, Yves; Stoev, Stilian; King, Aaron A.
    Proceedings of the National Academy of Sciences of the United States of America, 2015, 112(3): 719-724
  • [27] Latent variable models
    Bishop, C. M.
    Learning in Graphical Models, 1998, 89: 371-403
  • [28] A MCMC-method for models with continuous latent responses
    Maris, Gunter; Maris, Eric
    Psychometrika, 2002, 67: 335-350
  • [29] Faster MCMC for Gaussian latent position network models
    Spencer, Neil A.; Junker, Brian W.; Sweet, Tracy M.
    Network Science, 2022, 10(1): 20-45
  • [30] Interpretation and inference in mixture models: Simple MCMC works
    Geweke, John
    Computational Statistics & Data Analysis, 2007, 51(7): 3529-3550