MCMC for Variationally Sparse Gaussian Processes

Cited: 0
Authors
Hensman, James [1 ]
Matthews, Alexander G. de G. [2 ]
Filippone, Maurizio [3 ]
Ghahramani, Zoubin [2 ]
Affiliations
[1] Univ Lancaster, CHICAS, Lancaster, England
[2] Univ Cambridge, Cambridge, England
[3] EURECOM, Biot, France
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
CLASSIFICATION;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has gone into attacking three issues with GP models: how to compute efficiently when the number of data points is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate covariance function parameter posteriors. This paper addresses all three simultaneously, using a variational approximation to the posterior which is sparse in support of the function but otherwise free-form. The result is a Hybrid Monte Carlo sampling scheme which allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs. Code to replicate each experiment in this paper is available at github.com/sparseMCMC.
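As a rough illustration of the kind of scheme the abstract describes, the sketch below runs Hybrid Monte Carlo jointly over the inducing-point outputs u and a log-lengthscale of a toy sparse GP classifier. Everything here (the toy data, the logistic likelihood, the mean-field link from u to f, and the finite-difference gradients) is a hypothetical simplification chosen for readability, not the authors' implementation; the real code lives at github.com/sparseMCMC.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D binary classification data (illustrative only)
X = np.linspace(-3.0, 3.0, 40)
y = (np.sin(X) + 0.3 * rng.standard_normal(40) > 0).astype(float)
Z = np.linspace(-3.0, 3.0, 5)         # M = 5 inducing inputs

def rbf(a, b, log_ls):
    """Squared-exponential kernel with lengthscale exp(log_ls)."""
    ls = np.exp(log_ls)
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def log_post(params):
    """Unnormalised log posterior over (u, log_lengthscale)."""
    u, log_ls = params[:-1], params[-1]
    Kzz = rbf(Z, Z, log_ls) + 1e-5 * np.eye(len(Z))   # jitter for stability
    Kxz = rbf(X, Z, log_ls)
    L = np.linalg.cholesky(Kzz)
    alpha = np.linalg.solve(Kzz, u)
    f = Kxz @ alpha                    # sparse-GP mean at the data points
    ll = y @ f - np.sum(np.logaddexp(0.0, f))         # Bernoulli (logistic)
    lp_u = -0.5 * u @ alpha - np.sum(np.log(np.diag(L)))  # GP prior on u
    lp_ls = -0.5 * log_ls ** 2                        # N(0,1) on log-lengthscale
    return ll + lp_u + lp_ls

def grad(params, eps=1e-5):
    """Finite-difference gradient: fine for a toy sketch, too slow for real use."""
    g = np.zeros_like(params)
    for i in range(len(params)):
        d = np.zeros_like(params)
        d[i] = eps
        g[i] = (log_post(params + d) - log_post(params - d)) / (2 * eps)
    return g

def hmc(params, n_samples=200, step=0.05, n_leapfrog=10):
    """Plain HMC with leapfrog integration over the joint (u, log_ls) space."""
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(len(params))
        q, p0 = params.copy(), p.copy()
        p += 0.5 * step * grad(q)                 # half-step momentum
        for _ in range(n_leapfrog - 1):
            q += step * p
            p += step * grad(q)
        q += step * p
        p += 0.5 * step * grad(q)                 # final half-step
        log_accept = (log_post(q) - log_post(params)
                      - 0.5 * p @ p + 0.5 * p0 @ p0)
        if np.log(rng.uniform()) < log_accept:    # Metropolis correction
            params = q
        samples.append(params.copy())
    return np.array(samples)

init = np.zeros(len(Z) + 1)            # inducing outputs u and log-lengthscale
chain = hmc(init)
print(chain.shape)                     # (200, 6)
```

The key point of the paper's construction is visible even in this toy: the chain explores function values (through u) and the covariance hyperparameter in a single joint sampler, with all linear algebra done on the small M x M inducing-point matrices rather than the full N x N Gram matrix.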
Pages: 9
Related papers
50 records
  • [1] Sparse Multimodal Gaussian Processes
    Liu, Qiuyang
    Sun, Shiliang
    INTELLIGENCE SCIENCE AND BIG DATA ENGINEERING, ISCIDE 2017, 2017, 10559 : 28 - 40
  • [2] Federated Sparse Gaussian Processes
    Guo, Xiangyang
    Wu, Daqing
    Ma, Jinwen
    INTELLIGENT COMPUTING METHODOLOGIES, PT III, 2022, 13395 : 267 - 276
  • [3] An MCMC Based EM Algorithm for Mixtures of Gaussian Processes
    Wu, Di
    Chen, Ziyi
    Ma, Jinwen
    ADVANCES IN NEURAL NETWORKS - ISNN 2015, 2015, 9377 : 327 - 334
  • [4] Strong-lensing source reconstruction with variationally optimized Gaussian processes
    Karchev, Konstantin
    Coogan, Adam
    Weniger, Christoph
    MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, 2022, 512 (01) : 661 - 685
  • [5] Offline handwritten Tai Le character recognition using wavelet deep convolution features and ensemble deep variationally sparse Gaussian processes
    Guo, Hai
    Liu, Yifan
    Zhao, Jingying
    Song, Yifan
    SOFT COMPUTING, 2023, 27 (17) : 12439 - 12455
  • [6] Accelerating pseudo-marginal MCMC using Gaussian processes
    Drovandi, Christopher C.
    Moores, Matthew T.
    Boys, Richard J.
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2018, 118 : 1 - 17
  • [7] Sparse on-line Gaussian processes
    Csató, L
    Opper, M
    NEURAL COMPUTATION, 2002, 14 (03) : 641 - 668
  • [8] Doubly Sparse Variational Gaussian Processes
    Adam, Vincent
    Eleftheriadis, Stefanos
    Durrande, Nicolas
    Artemev, Artem
    Hensman, James
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 2874 - 2883
  • [9] Input Dependent Sparse Gaussian Processes
    Jafrasteh, Bahram
    Villacampa-Calvo, Carlos
    Hernandez-Lobato, Daniel
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,