Automated Variational Inference for Gaussian Process Models

Citations: 0
Authors
Nguyen, Trung V. [1,2]
Bonilla, Edwin V. [3]
Affiliations
[1] Australian Natl Univ, Canberra, ACT, Australia
[2] NICTA, Canberra, ACT, Australia
[3] Univ New South Wales, Kensington, NSW, Australia
Funding
Australian Research Council
Keywords
DOI
None available
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We develop an automated variational method for approximate inference in Gaussian process (GP) models whose posteriors are often intractable. Using a mixture of Gaussians as the variational distribution, we show that (i) the variational objective and its gradients can be approximated efficiently via sampling from univariate Gaussian distributions and (ii) the gradients with respect to the GP hyperparameters can be obtained analytically regardless of the model likelihood. We further propose two instances of the variational distribution whose covariance matrices can be parametrized linearly in the number of observations. These results allow gradient-based optimization to be done efficiently in a black-box manner. Our approach is thoroughly verified on five models using six benchmark datasets, performing as well as the exact or hard-coded implementations while running orders of magnitude faster than the alternative MCMC sampling approaches. Our method can be a valuable tool for practitioners and researchers to investigate new models with minimal effort in deriving model-specific inference algorithms.
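The key observation in point (i) of the abstract is that when the likelihood factorizes over observations, each term of the expected log likelihood in the variational objective depends only on the univariate marginal of the variational distribution at that data point, so it can be estimated by sampling scalar Gaussians. The sketch below is an illustrative reconstruction of that idea (not the authors' implementation): it estimates the expected log likelihood under per-point Gaussian marginals for a hypothetical Bernoulli-logit likelihood, using the reparameterization of a Gaussian sample as `mean + std * eps`.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_lik(y, f):
    # Example non-conjugate likelihood: Bernoulli with logit link,
    # log p(y | f) = log sigmoid(y * f) for labels y in {-1, +1}.
    return -np.log1p(np.exp(-y * f))

def expected_log_lik(y, means, variances, num_samples=1000):
    """Monte Carlo estimate of sum_n E_{q(f_n)}[log p(y_n | f_n)].

    Because the likelihood factorizes over observations, each term
    depends only on the univariate marginal q(f_n) = N(mean_n, var_n),
    so only scalar Gaussian samples are needed, never draws from the
    full N-dimensional variational distribution.
    """
    eps = rng.standard_normal((num_samples, len(means)))
    f = means + np.sqrt(variances) * eps  # reparameterized samples
    return log_lik(y, f).sum(axis=1).mean()

# Toy usage: 5 binary observations with standard-normal marginals.
y = np.array([1.0, -1.0, 1.0, 1.0, -1.0])
mu = np.zeros(5)
var = np.ones(5)
print(expected_log_lik(y, mu, var))
```

Because the same estimator works for any likelihood that can be evaluated pointwise, swapping in a different `log_lik` requires no new derivations, which is the "black-box" property the abstract refers to.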
Pages: 9
Related papers (50 in total)
  • [1] Variational Inference for Gaussian Process Models for Survival Analysis
    Kim, Minyoung
    Pavlovic, Vladimir
    [J]. UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2018, : 435 - 445
  • [2] Sparse Variational Inference for Generalized Gaussian Process Models
    Sheth, Rishit
    Wang, Yuyang
    Khardon, Roni
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 1302 - 1311
  • [3] Variational Inference for Gaussian Process Models with Linear Complexity
    Cheng, Ching-An
    Boots, Byron
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [4] Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
    Gal, Yarin
    van der Wilk, Mark
    Rasmussen, Carl E.
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [5] Generalised Gaussian Process Latent Variable Models (GPLVM) with Stochastic Variational Inference
    Lalchand, Vidhi
    Ravuri, Aditya
    Lawrence, Neil D.
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [6] Fast variational inference for Gaussian process models through KL-correction
    King, Nathaniel J.
    Lawrence, Neil D.
    [J]. MACHINE LEARNING: ECML 2006, PROCEEDINGS, 2006, 4212 : 270 - 281
  • [7] Variational Inference for Sparse Gaussian Process Modulated Hawkes Process
    Zhang, Rui
    Walder, Christian
    Rizoiu, Marian-Andrei
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 6803 - 6810
  • [8] Automated Augmented Conjugate Inference for Non-conjugate Gaussian Process Models
    Galy-Fajou, Theo
    Wenzel, Florian
    Opper, Manfred
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 3025 - 3034
  • [9] Structured Variational Inference in Partially Observable Unstable Gaussian Process State Space Models
    Melchior, Silvan
    Curi, Sebastian
    Berkenkamp, Felix
    Krause, Andreas
    [J]. LEARNING FOR DYNAMICS AND CONTROL, VOL 120, 2020, 120 : 147 - 157
  • [10] Natural Gradients in Practice: Non-Conjugate Variational Inference in Gaussian Process Models
    Salimbeni, Hugh
    Eleftheriadis, Stefanos
    Hensman, James
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018, 84