Sparse Orthogonal Variational Inference for Gaussian Processes

Cited by: 0
Authors
Shi, Jiaxin [1 ]
Titsias, Michalis K. [2 ]
Mnih, Andriy [2 ]
Affiliations
[1] Tsinghua University, Beijing, China
[2] DeepMind, London, UK
Keywords: (none listed)
DOI: not available
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We introduce a new interpretation of sparse variational approximations for Gaussian processes using inducing points, which can lead to more scalable algorithms than previous methods. It is based on decomposing a Gaussian process as a sum of two independent processes: one spanned by a finite basis of inducing points and the other capturing the remaining variation. We show that this formulation recovers existing approximations and at the same time allows us to obtain tighter lower bounds on the marginal likelihood and new stochastic variational inference algorithms. We demonstrate the efficiency of these algorithms in several Gaussian process models ranging from standard regression to multi-class classification using (deep) convolutional Gaussian processes, and report state-of-the-art results on CIFAR-10 among purely GP-based models.
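The decomposition described in the abstract can be sketched in standard sparse-GP notation; the symbols below (inducing inputs Z, inducing variables u = f(Z), kernel k, and K_{ZZ} = k(Z, Z)) are conventional names assumed here rather than taken from the abstract:

    f(\cdot) = f_{\parallel}(\cdot) + f_{\perp}(\cdot),
    f_{\parallel}(x) = k(x, Z)\, K_{ZZ}^{-1} u, \qquad u \sim \mathcal{N}(0, K_{ZZ}),
    f_{\perp} \sim \mathcal{GP}\!\left(0,\; k(x, x') - k(x, Z)\, K_{ZZ}^{-1}\, k(Z, x')\right),

so that f_{\parallel} is the component spanned by the finite inducing-point basis and f_{\perp} is an independent zero-mean process capturing the remaining variation.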
Pages: 10
Related Papers (50 total)
  • [1] Convergence of Sparse Variational Inference in Gaussian Processes Regression
    Burt, David R.; Rasmussen, Carl Edward; van der Wilk, Mark
    Journal of Machine Learning Research, 2020, 21
  • [2] Doubly Sparse Variational Gaussian Processes
    Adam, Vincent; Eleftheriadis, Stefanos; Durrande, Nicolas; Artemev, Artem; Hensman, James
    International Conference on Artificial Intelligence and Statistics, Vol. 108, 2020: 2874-2883
  • [3] Multiview Variational Sparse Gaussian Processes
    Mao, Liang; Sun, Shiliang
    IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(7): 2875-2885
  • [4] Variational Inference for Infinite Mixtures of Sparse Gaussian Processes Through KL-Correction
    Nguyen, T. N. A.; Bouzerdoum, A.; Phung, S. L.
    2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016: 2579-2583
  • [5] Variational Inference for Sparse Spectrum Gaussian Process Regression
    Tan, Linda S. L.; Ong, Victor M. H.; Nott, David J.; Jasra, Ajay
    Statistics and Computing, 2016, 26(6): 1243-1261
  • [6] Sparse Variational Inference for Generalized Gaussian Process Models
    Sheth, Rishit; Wang, Yuyang; Khardon, Roni
    International Conference on Machine Learning, Vol. 37, 2015: 1302-1311
  • [7] Dual Parameterization of Sparse Variational Gaussian Processes
    Adam, Vincent; Chang, Paul E.; Khan, Mohammad Emtiyaz; Solin, Arno
    Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34
  • [8] Variational Inference for Sparse Gaussian Process Modulated Hawkes Process
    Zhang, Rui; Walder, Christian; Rizoiu, Marian-Andrei
    Thirty-Fourth AAAI Conference on Artificial Intelligence, 2020, 34: 6803-6810