Sparse Orthogonal Variational Inference for Gaussian Processes

Cited by: 0
Authors: Shi, Jiaxin [1]; Titsias, Michalis K. [2]; Mnih, Andriy [2]
Affiliations: [1] Tsinghua Univ, Beijing, Peoples R China; [2] DeepMind, London, England
DOI: not available
Abstract
We introduce a new interpretation of sparse variational approximations for Gaussian processes using inducing points, which can lead to more scalable algorithms than previous methods. It is based on decomposing a Gaussian process as a sum of two independent processes: one spanned by a finite basis of inducing points and the other capturing the remaining variation. We show that this formulation recovers existing approximations and at the same time allows us to obtain tighter lower bounds on the marginal likelihood and new stochastic variational inference algorithms. We demonstrate the efficiency of these algorithms in several Gaussian process models ranging from standard regression to multi-class classification using (deep) convolutional Gaussian processes, and report state-of-the-art results on CIFAR-10 among purely GP-based models.
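The decomposition the abstract describes can be illustrated numerically: the prior covariance of a GP splits into a part spanned by a finite set of inducing points (the Nyström term) plus the covariance of an orthogonal residual process that captures the remaining variation. Below is a minimal NumPy sketch under hedged assumptions — the RBF kernel, input locations, and inducing locations are illustrative choices, not taken from the paper:

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0):
    """Squared-exponential kernel matrix between two 1-D input sets."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

x = np.linspace(0.0, 5.0, 50)  # training inputs (hypothetical)
z = np.linspace(0.0, 5.0, 7)   # inducing inputs (hypothetical)

Knn = rbf(x, x)
Kmm = rbf(z, z) + 1e-8 * np.eye(len(z))  # jitter for numerical stability
Knm = rbf(x, z)

# Covariance of the component lying in the span of the inducing points
# (the Nystrom approximation) ...
Q = Knm @ np.linalg.solve(Kmm, Knm.T)
# ... and of the orthogonal residual process capturing what the inducing
# points miss.
R = Knn - Q

# The two independent components sum back to the full GP prior covariance.
print(np.allclose(Q + R, Knn))  # True
```

With enough well-placed inducing points the residual covariance R shrinks toward zero, which is why sparse approximations that model only the inducing-point component can remain accurate while scaling to large datasets.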
Pages: 10