Rates of Convergence for Sparse Variational Gaussian Process Regression

Cited: 0
Authors
Burt, David R. [1]
Rasmussen, Carl Edward [1,2]
van der Wilk, Mark [2]
Affiliations
[1] Univ Cambridge, Cambridge, England
[2] PROWLER.io, Cambridge, England
Keywords
APPROXIMATION; MATRIX
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Excellent variational approximations to Gaussian process posteriors have been developed which avoid the O(N^3) scaling with dataset size N. They reduce the computational cost to O(NM^2), with M << N the number of inducing variables, which summarise the process. While the computational cost seems to be linear in N, the true complexity of the algorithm depends on how M must increase to ensure a certain quality of approximation. We show that with high probability the KL divergence can be made arbitrarily small by growing M more slowly than N. A particular case is that for regression with normally distributed inputs in D dimensions with the Squared Exponential kernel, M = O(log^D N) suffices. Our results show that as datasets grow, Gaussian process posteriors can be approximated cheaply, and provide a concrete rule for how to increase M in continual learning scenarios.
Pages: 10
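
As a rough illustration of the complexity claims in the abstract, the following is a minimal NumPy sketch (an illustration under stated assumptions, not code from the paper) of the collapsed variational lower bound of Titsias (2009) that such sparse approximations optimise, evaluated in O(NM^2) time, together with a hypothetical schedule num_inducing that grows M at the M = O(log^D N) rate quoted above. The function names, the constant c in the schedule, and the toy data are illustrative assumptions.

import numpy as np

def se_kernel(X, Z, lengthscale=1.0, variance=1.0):
    # Squared Exponential kernel matrix between the rows of X and Z.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Z**2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sgpr_elbo(X, y, Z, noise=0.1, lengthscale=1.0, variance=1.0):
    # Collapsed variational bound for GP regression with inducing
    # inputs Z (Titsias, 2009), evaluated in O(N M^2) via Cholesky.
    N, M = X.shape[0], Z.shape[0]
    Kmm = se_kernel(Z, Z, lengthscale, variance) + 1e-8 * np.eye(M)
    Kmn = se_kernel(Z, X, lengthscale, variance)
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Kmn) / np.sqrt(noise)   # M x N
    B = A @ A.T + np.eye(M)
    LB = np.linalg.cholesky(B)
    c = np.linalg.solve(LB, A @ y) / np.sqrt(noise)
    # log N(y | 0, Q_nn + noise*I) with Q_nn = Knm Kmm^{-1} Kmn,
    # computed through the matrix inversion lemma.
    log_det = 2.0 * np.sum(np.log(np.diag(LB))) + N * np.log(noise)
    quad = y @ y / noise - c @ c
    log_marginal = -0.5 * (N * np.log(2.0 * np.pi) + log_det + quad)
    # Slack term -tr(Knn - Q_nn)/(2 noise); tr(Knn) = N*variance for SE.
    trace_gap = N * variance - noise * np.sum(A * A)
    return log_marginal - 0.5 * trace_gap / noise

def num_inducing(N, D, c=1.0):
    # Hypothetical growth schedule M = ceil(c * log(N)^D), mirroring
    # the M = O(log^D N) sufficiency result quoted in the abstract.
    return max(1, int(np.ceil(c * np.log(N) ** D)))

# Toy usage: normally distributed 1-D inputs, as in the special case above.
rng = np.random.default_rng(0)
N, D = 2000, 1
X = rng.normal(size=(N, D))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=N)
M = num_inducing(N, D, c=5.0)
Z = X[rng.choice(N, size=M, replace=False)]
print(M, sgpr_elbo(X, y, Z))

The Cholesky-based evaluation never forms the N x N matrix Q_nn; every step costs O(NM^2) or less, which is why the cost quoted in the abstract is linear in N for fixed M.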