Generic Inference in Latent Gaussian Process Models

Cited: 0
|
Authors
Bonilla, Edwin V. [1 ]
Krauth, Karl [2 ]
Dezfouli, Amir [1 ]
Affiliations
[1] CSIRO's Data61, Machine Learning Res Grp, Sydney, NSW 2015, Australia
[2] Univ Calif Berkeley, Dept Elect Engn & Comp Sci, Berkeley, CA 94720 USA
Keywords
Gaussian processes; black-box likelihoods; nonlinear likelihoods; scalable inference; variational inference;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We develop an automated variational method for inference in models with Gaussian process (GP) priors and general likelihoods. The method supports multiple outputs and multiple latent functions and does not require detailed knowledge of the conditional likelihood, only its evaluation as a black-box function. Using a mixture of Gaussians as the variational distribution, we show that the evidence lower bound and its gradients can be estimated efficiently using samples from univariate Gaussian distributions. Furthermore, the method scales to large datasets by using an augmented prior via the inducing-variable approach underpinning most sparse GP approximations, along with parallel computation and stochastic optimization. We evaluate our approach quantitatively and qualitatively with experiments on small, medium-scale, and large datasets, showing its competitiveness under different likelihood models and sparsity levels. On the large-scale experiments, involving prediction of airline delays and classification of handwritten digits, we show that our method is on par with state-of-the-art hard-coded approaches for scalable GP regression and classification.
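To make the central idea of the abstract concrete, below is a minimal sketch of a Monte Carlo estimate of the evidence lower bound (ELBO) for a GP model with a black-box likelihood, where the expected log-likelihood factorizes over data points and is therefore estimated with samples from univariate Gaussians. This is a simplified illustration, not the paper's implementation: it assumes a single latent function, a single Gaussian variational distribution with diagonal covariance in place of the mixture, and no inducing-variable sparsification; the RBF kernel, the Bernoulli likelihood, and all names (`elbo_mc`, `black_box_log_lik`, etc.) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = X[:, None] - X[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def black_box_log_lik(y, f):
    """Bernoulli likelihood with logistic link, treated as a black box:
    only point evaluations of log p(y_n | f_n) are ever needed."""
    return -np.log1p(np.exp(-y * f))

def elbo_mc(y, K, m, log_s, n_samples=200):
    """Monte Carlo ELBO for q(f) = N(m, diag(s^2)) under a GP prior N(0, K).

    Because the conditional likelihood factorizes over data points, the
    expected log-likelihood is estimated with *univariate* Gaussian samples
    via the reparameterization f_n = m_n + s_n * eps_n, eps_n ~ N(0, 1).
    """
    n = len(y)
    s = np.exp(log_s)
    eps = rng.standard_normal((n_samples, n))
    f = m[None, :] + s[None, :] * eps                 # (S, N) samples
    exp_loglik = black_box_log_lik(y, f).sum(axis=1).mean()

    # Analytic KL(q || p) between the Gaussian posterior and the GP prior:
    # 0.5 * [tr(K^-1 S) + m^T K^-1 m - N + log|K| - log|S|]
    Kj = K + 1e-8 * np.eye(n)                         # jitter for stability
    L = np.linalg.cholesky(Kj)
    alpha = np.linalg.solve(L, m)                     # alpha@alpha = m^T K^-1 m
    Kinv_diag = np.diag(np.linalg.inv(Kj))
    kl = 0.5 * (Kinv_diag @ s**2 + alpha @ alpha - n
                + 2 * np.log(np.diag(L)).sum() - 2 * log_s.sum())
    return exp_loglik - kl

# Toy 1-D binary classification problem with labels in {-1, +1}.
X = np.linspace(-3.0, 3.0, 20)
y = np.sign(np.sin(X) + 0.1)
K = rbf_kernel(X)
m, log_s = np.zeros(20), np.zeros(20)
print(elbo_mc(y, K, m, log_s))
```

In a full implementation this estimate would be maximized with stochastic gradients over the variational parameters `m` and `log_s` (and kernel hyperparameters), with the KL term computed against the inducing-variable prior rather than the full GP prior.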
Pages: 63