Rates of Convergence for Sparse Variational Gaussian Process Regression

Cited by: 0
Authors
Burt, David R. [1 ]
Rasmussen, Carl Edward [1 ,2 ]
van der Wilk, Mark [2 ]
Affiliations
[1] Univ Cambridge, Cambridge, England
[2] PROWLER Io, Cambridge, England
Keywords
APPROXIMATION; MATRIX;
DOI
Not available
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Excellent variational approximations to Gaussian process posteriors have been developed which avoid the O(N^3) scaling with dataset size N. They reduce the computational cost to O(NM^2), with M << N the number of inducing variables, which summarise the process. While the computational cost seems to be linear in N, the true complexity of the algorithm depends on how M must increase to ensure a certain quality of approximation. We show that with high probability the KL divergence can be made arbitrarily small by growing M more slowly than N. A particular case is that for regression with normally distributed inputs in D dimensions with the Squared Exponential kernel, M = O(log^D N) suffices. Our results show that as datasets grow, Gaussian process posteriors can be approximated cheaply, and provide a concrete rule for how to increase M in continual learning scenarios.
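To make the scaling in the abstract concrete, below is a minimal NumPy sketch of the collapsed variational lower bound used in sparse variational GP regression with inducing points (Titsias-style), in which every step costs at most O(NM^2) time. The kernel hyperparameters, jitter constant, noise variance, subset-of-data choice of inducing points, and the constant in the M = O(log^D N) growth rule are illustrative assumptions, not values taken from the paper.

    import numpy as np

    def se_kernel(A, B, lengthscale=1.0, variance=1.0):
        # Squared Exponential kernel matrix k(A, B).
        sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
        return variance * np.exp(-0.5 * sq / lengthscale**2)

    def sgpr_elbo(X, y, Z, noise_var=0.1, lengthscale=1.0, variance=1.0):
        # Collapsed evidence lower bound for sparse variational GP regression.
        # Every step below costs at most O(N M^2) time and O(N M) memory.
        N, M = X.shape[0], Z.shape[0]
        Kmm = se_kernel(Z, Z, lengthscale, variance) + 1e-6 * np.eye(M)  # jitter for stability
        Kmn = se_kernel(Z, X, lengthscale, variance)
        L = np.linalg.cholesky(Kmm)
        A = np.linalg.solve(L, Kmn) / np.sqrt(noise_var)       # M x N
        B = np.eye(M) + A @ A.T                                 # M x M
        LB = np.linalg.cholesky(B)
        c = np.linalg.solve(LB, A @ y) / np.sqrt(noise_var)     # length-M vector
        log_det_half = np.sum(np.log(np.diag(LB)))              # 0.5 * log det(B)
        bound = (-0.5 * N * np.log(2.0 * np.pi * noise_var)
                 - log_det_half
                 - 0.5 * y @ y / noise_var
                 + 0.5 * c @ c)
        # Trace correction -(1/(2 sigma^2)) tr(Knn - Qnn); tr(Knn) = N * variance for the SE kernel.
        bound += -0.5 * N * variance / noise_var + 0.5 * np.sum(A * A)
        return bound

    # Grow M roughly like log(N)^D, mirroring the M = O(log^D N) rule; the constant 3 is arbitrary.
    rng = np.random.default_rng(0)
    D = 2
    for N in (500, 2000, 8000):
        M = int(np.ceil(3.0 * np.log(N) ** D))
        X = rng.normal(size=(N, D))                             # normally distributed inputs
        y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=N)
        Z = X[rng.choice(N, size=M, replace=False)]             # subset-of-data inducing points
        print(f"N={N:5d}  M={M:4d}  ELBO={sgpr_elbo(X, y, Z):10.2f}")

The loop only illustrates how the number of inducing variables M can be grown far more slowly than N while each bound evaluation stays cheap; it is not a reproduction of the paper's experiments.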
Pages: 10