What are the advantages of MCMC based inference in latent variable models?

Cited by: 25
Authors
Paap, R [1 ]
Affiliation
[1] Erasmus Univ, Rotterdam Inst Business Econ Studies, Rotterdam, Netherlands
Keywords
Bayesian analysis; Gibbs sampler; Metropolis-Hastings sampler; data augmentation
DOI
10.1111/1467-9574.00060
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
Recent developments in Markov chain Monte Carlo (MCMC) methods have increased the popularity of Bayesian inference in many fields of research in economics, such as marketing research and financial econometrics. Gibbs sampling in combination with data augmentation allows inference in statistical/econometric models with many unobserved variables. The likelihood functions of these models may contain many integrals, which often makes a standard classical analysis difficult or even infeasible. The advantage of the Bayesian approach using MCMC is that one only has to consider the likelihood function conditional on the unobserved variables. In many cases this implies that Bayesian parameter estimation is faster than classical maximum likelihood estimation. In this paper we illustrate the computational advantages of Bayesian estimation using MCMC in several popular latent variable models.
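The abstract's central point, that conditioning on the augmented latent variables turns an intractable likelihood into simple full conditionals, can be sketched for a binary probit model in the spirit of Albert and Chib's data augmentation. The simulated data, flat prior, and sampler settings below are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

# Simulate probit data: y_i = 1{ x_i' beta + eps_i > 0 }, eps_i ~ N(0, 1).
# (Hypothetical example; parameter values are not from the paper.)
n, beta_true = 500, np.array([0.5, 1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

# Gibbs sampler with data augmentation: alternate between
#   z | beta, y  (independent truncated normals)  and
#   beta | z     (ordinary normal linear-model posterior).
XtX_inv = np.linalg.inv(X.T @ X)  # flat prior on beta for simplicity
beta = np.zeros(2)
draws = []
for it in range(2000):
    mu = X @ beta
    # z_i ~ N(mu_i, 1) truncated to (0, inf) if y_i = 1, (-inf, 0) if y_i = 0;
    # truncnorm takes bounds standardized around loc=0, scale=1.
    lo = np.where(y == 1, -mu, -np.inf)
    hi = np.where(y == 1, np.inf, -mu)
    z = mu + truncnorm.rvs(lo, hi, random_state=rng)
    # beta | z ~ N((X'X)^{-1} X'z, (X'X)^{-1})
    beta_hat = XtX_inv @ X.T @ z
    beta = rng.multivariate_normal(beta_hat, XtX_inv)
    if it >= 500:  # discard burn-in
        draws.append(beta)

draws = np.asarray(draws)
print(draws.mean(axis=0))  # posterior means, close to beta_true
```

Neither conditional requires evaluating the integrated probit likelihood; each step is a standard draw, which is exactly the computational advantage the paper discusses.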
Pages: 2-22 (21 pages)