Generic Inference in Latent Gaussian Process Models

Cited by: 0
Authors
Bonilla, Edwin V. [1 ]
Krauth, Karl [2 ]
Dezfouli, Amir [1 ]
Affiliations
[1] CSIRO's Data61, Machine Learning Res Grp, Sydney, NSW 2015, Australia
[2] Univ Calif Berkeley, Dept Elect Engn & Comp Sci, Berkeley, CA 94720 USA
Keywords
Gaussian processes; black-box likelihoods; nonlinear likelihoods; scalable inference; variational inference;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We develop an automated variational method for inference in models with Gaussian process (GP) priors and general likelihoods. The method supports multiple outputs and multiple latent functions and does not require detailed knowledge of the conditional likelihood, needing only its evaluation as a black-box function. Using a mixture of Gaussians as the variational distribution, we show that the evidence lower bound and its gradients can be estimated efficiently using samples from univariate Gaussian distributions. Furthermore, the method scales to large datasets, which we achieve through an augmented prior via the inducing-variable approach underpinning most sparse GP approximations, along with parallel computation and stochastic optimization. We evaluate our approach quantitatively and qualitatively in experiments on small, medium-scale, and large datasets, showing its competitiveness under different likelihood models and sparsity levels. In the large-scale experiments on airline-delay prediction and handwritten-digit classification, we show that our method is on par with state-of-the-art hard-coded approaches for scalable GP regression and classification.
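The key computational idea in the abstract — estimating the expected-log-likelihood term of the evidence lower bound with samples from univariate Gaussians, while treating the conditional likelihood as a black box — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`elbo_ell_mc`, `gauss_loglik`) and the factorized-marginal assumption (each latent function value has a univariate Gaussian variational marginal with given mean and variance) are assumptions made here for concreteness.

```python
import numpy as np

def elbo_ell_mc(black_box_loglik, y, means, variances, n_samples=100, seed=0):
    """Monte Carlo estimate of the expected log-likelihood term of the ELBO.

    When the variational posterior gives each latent function value a
    univariate Gaussian marginal N(means[n], variances[n]), the expectation
    E_q[log p(y_n | f_n)] reduces to an average of the black-box
    log-likelihood over univariate Gaussian samples.

    black_box_loglik : callable(f, y) -> elementwise log p(y | f); only
                       evaluations are needed, never its internals.
    y, means, variances : arrays of shape (N,)
    """
    rng = np.random.default_rng(seed)
    # Reparameterized samples f_s = mean + sqrt(var) * eps, shape (S, N).
    eps = rng.standard_normal((n_samples, means.shape[0]))
    f = means + np.sqrt(variances) * eps
    # Average over samples (axis 0), then sum over data points.
    return black_box_loglik(f, y).mean(axis=0).sum()

# Example black-box likelihood: Gaussian with fixed noise variance 0.1.
# Any other likelihood (Poisson, Bernoulli, ...) could be plugged in unchanged.
def gauss_loglik(f, y, noise=0.1):
    return -0.5 * np.log(2 * np.pi * noise) - 0.5 * (y - f) ** 2 / noise
```

Because only pointwise evaluations of `black_box_loglik` are required, swapping in a different (possibly non-conjugate) likelihood needs no new derivations, which is the sense in which the inference is "generic".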
Pages: 63
Related Papers (50 total)
  • [1] Generic inference in latent Gaussian process models
    Bonilla, Edwin V.
    Krauth, Karl
    Dezfouli, Amir
[J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2019, 20
  • [2] Fully Bayesian Inference for Latent Variable Gaussian Process Models
    Yerramilli, Suraj
    Iyer, Akshay
    Chen, Wei
    Apley, Daniel W.
[J]. SIAM-ASA JOURNAL ON UNCERTAINTY QUANTIFICATION, 2023, 11 (04): 1357-1381
  • [3] Bayesian covariance estimation and inference in latent Gaussian process models
    Earls, Cecilia
    Hooker, Giles
[J]. STATISTICAL METHODOLOGY, 2014, 18: 79-100
  • [4] Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
    Gal, Yarin
    van der Wilk, Mark
    Rasmussen, Carl E.
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [5] Pseudo-marginal Bayesian inference for Gaussian process latent variable models
    Gadd, C.
    Wade, S.
    Shah, A. A.
    [J]. MACHINE LEARNING, 2021, 110 (06): 1105-1143
  • [6] Generalised Gaussian Process Latent Variable Models (GPLVM) with Stochastic Variational Inference
    Lalchand, Vidhi
    Ravuri, Aditya
    Lawrence, Neil D.
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [7] A Gaussian Process Latent Variable Model for BRDF Inference
    Georgoulis, Stamatios
    Vanweddingen, Vincent
    Proesmans, Marc
    Van Gool, Luc
    [J]. 2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015: 3559-3567
  • [8] Gaussian Mixture Modeling with Gaussian Process Latent Variable Models
    Nickisch, Hannes
    Rasmussen, Carl Edward
    [J]. PATTERN RECOGNITION, 2010, 6376: 272-282
  • [9] Gaussian process latent class choice models
    Sfeir, Georges
    Rodrigues, Filipe
    Abou-Zeid, Maya
    [J]. TRANSPORTATION RESEARCH PART C-EMERGING TECHNOLOGIES, 2022, 136