Rates of Convergence for Sparse Variational Gaussian Process Regression

Cited by: 0

Authors
Burt, David R. [1]
Rasmussen, Carl Edward [1,2]
van der Wilk, Mark [2]
Affiliations
[1] Univ Cambridge, Cambridge, England
[2] PROWLER.io, Cambridge, England
Keywords
APPROXIMATION; MATRIX
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Excellent variational approximations to Gaussian process posteriors have been developed which avoid the O(N^3) scaling with dataset size N. They reduce the computational cost to O(NM^2), with M << N the number of inducing variables, which summarise the process. While the computational cost seems to be linear in N, the true complexity of the algorithm depends on how M must increase to ensure a certain quality of approximation. We show that with high probability the KL divergence can be made arbitrarily small by growing M more slowly than N. A particular case is that for regression with normally distributed inputs in D dimensions with the Squared Exponential kernel, M = O(log^D N) suffices. Our results show that as datasets grow, Gaussian process posteriors can be approximated cheaply, and provide a concrete rule for how to increase M in continual learning scenarios.
Pages: 10
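The O(NM^2) cost and the approximation quality discussed in the abstract refer to the collapsed variational bound of Titsias (2009), which this paper analyses. Below is a minimal NumPy sketch of that bound, purely illustrative rather than the authors' implementation; the function names (rbf, sgpr_elbo), the fixed hyperparameters, and the random inducing-point initialisation are assumptions made for the example.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.stats import multivariate_normal

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared Exponential kernel matrix between the rows of A and B.
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sgpr_elbo(X, y, Z, noise_var=0.1):
    """Collapsed sparse variational GP regression bound (Titsias, 2009):
    ELBO = log N(y | 0, Qnn + noise*I) - tr(Knn - Qnn) / (2*noise),
    with Qnn = Knm Kmm^{-1} Kmn. The dominant linear-algebra cost is
    O(N M^2); for brevity this sketch still builds the N x N matrix Qnn,
    which a production implementation would avoid.
    """
    N, M = X.shape[0], Z.shape[0]
    Knm = rbf(X, Z)                                # N x M cross-covariance
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(M)             # jitter for stability
    Qnn = Knm @ cho_solve(cho_factor(Kmm), Knm.T)  # Nystrom approx. of Knn
    fit = multivariate_normal.logpdf(
        y, mean=np.zeros(N), cov=Qnn + noise_var * np.eye(N))
    # tr(Knn) = N * variance = N here; the trace term penalises signal the
    # inducing points miss, and driving it to zero is what allows M << N.
    slack = (N - np.trace(Qnn)) / (2.0 * noise_var)
    return fit - slack

rng = np.random.default_rng(0)
N, D, M = 500, 1, 20     # toy sizes; the paper's rule grows M like O(log^D N)
X = rng.normal(size=(N, D))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=N)
Z = X[rng.choice(N, size=M, replace=False)]   # crude inducing-point choice
print(sgpr_elbo(X, y, Z))
```

In practice Z and the kernel hyperparameters are optimised to tighten this bound (for example, GPflow's SGPR model implements it); the paper's result is that, for inputs and kernels like those in the toy above, the KL divergence of the resulting approximation can be kept small while M grows only logarithmically with N.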