GAUSSIAN PROCESS REGRESSION IN THE FLAT LIMIT

Cited: 1
Authors
Barthelme, Simon [1]
Amblard, Pierre-Olivier [1]
Tremblay, Nicolas [1]
Usevich, Konstantin [1]
Affiliations
[1] CNRS, GIPSA Lab, Grenoble, France
Source
ANNALS OF STATISTICS | 2023, Vol. 51, No. 6
Keywords
Gaussian processes; flat limit; splines; multivariate polynomials; MULTIVARIATE INTERPOLATION; MATRICES;
DOI
10.1214/23-AOS2336
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline codes
020208; 070103; 0714;
Abstract
Gaussian process (GP) regression is a fundamental tool in Bayesian statistics. It is also known as kriging and is the Bayesian counterpart to the frequentist kernel ridge regression. Most of the theoretical work on GP regression has focused on large-n asymptotics, characterising the behaviour of GP regression as the amount of data increases. Fixed-sample analysis is much more difficult outside of simple cases, such as locations on a regular grid.

In this work, we perform a fixed-sample analysis that was first studied in the context of approximation theory by Driscoll and Fornberg (2002), called the "flat limit". In flat-limit asymptotics, the goal is to characterise kernel methods as the length-scale of the kernel function tends to infinity, so that kernels appear flat over the range of the data. Surprisingly, this limit is well-defined and displays interesting behaviour: Driscoll and Fornberg showed that radial basis interpolation converges in the flat limit to polynomial interpolation if the kernel is Gaussian. Subsequent work showed that this holds true in the multivariate setting as well, but that kernels other than the Gaussian may have (polyharmonic) splines as the limit interpolant.

Leveraging recent results on the spectral behaviour of kernel matrices in the flat limit, we study the flat limit of Gaussian process regression. Results show that Gaussian process regression tends in the flat limit to (multivariate) polynomial regression, or (polyharmonic) spline regression, depending on the kernel. Importantly, this holds for both the predictive mean and the predictive variance, so that the posterior predictive distributions become equivalent.

For the proof, we introduce the notion of prediction-equivalence of semiparametric models, which lets us state flat-limit results in a compact and unified manner.
Our results have practical consequences: for instance, they show that optimal GP predictions in the sense of leave-one-out loss may occur at very large length-scales, which would be invisible to current implementations because of numerical difficulties.
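As a toy numerical sketch of the flat limit described in the abstract (illustrative only, not code from the paper; the node locations and test function are made up): with a Gaussian kernel and noiseless observations, a kernel interpolant at a large length-scale already agrees closely with the degree-(n-1) polynomial interpolant through the same points, as Driscoll and Fornberg's result predicts.

```python
import numpy as np

# Toy sketch (not the paper's code): for a Gaussian kernel with a large
# length-scale ell, the kernel interpolant through n points approaches the
# degree-(n-1) polynomial interpolant (the Driscoll-Fornberg flat limit).
x = np.array([-1.0, -1.0 / 3.0, 1.0 / 3.0, 1.0])  # 4 training locations
y = np.sin(2.0 * x)                               # noiseless observations
xs = 0.3                                          # test location
ell = 30.0                                        # "flat" length-scale

# Gaussian (squared-exponential) kernel matrix and cross-covariances.
# The system is nearly flat, hence ill-conditioned, but still solvable
# in double precision at this size.
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2.0 * ell ** 2))
ks = np.exp(-(xs - x) ** 2 / (2.0 * ell ** 2))
gp_val = ks @ np.linalg.solve(K, y)

# Degree-3 polynomial interpolant through the same 4 points.
poly_val = np.polyval(np.polyfit(x, y, deg=len(x) - 1), xs)

print(gp_val, poly_val)  # the two predictions agree closely
```

At moderate length-scales this can be run directly; at truly enormous length-scales the kernel matrix becomes numerically singular, which is exactly the practical difficulty the abstract points to.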
Pages: 2471-2505
Page count: 35
Related articles
(50 items total)
  • [31] Hierarchical Gaussian process mixtures for regression
    Shi, J. Q.
    Murray-Smith, R.
    Titterington, D. M.
    STATISTICS AND COMPUTING, 2005, 15 : 31 - 41
  • [32] Efficient sparsification for Gaussian process regression
    Schreiter, Jens
    Nguyen-Tuong, Duy
    Toussaint, Marc
    NEUROCOMPUTING, 2016, 192 : 29 - 37
  • [33] Gaussian Process Regression on Nested Spaces
    Blanchet-Scalliet, Christophette
    Demory, Bruno
    Gonon, Thierry
    Helbert, Celine
    SIAM-ASA JOURNAL ON UNCERTAINTY QUANTIFICATION, 2023, 11 (02): : 426 - 451
  • [34] Variational Tobit Gaussian Process Regression
    Basson, Marno
    Louw, Tobias M.
    Smith, Theresa R.
    STATISTICS AND COMPUTING, 2023, 33 (03)
  • [35] Dynamic Transfer Gaussian Process Regression
    Wei, Pengfei
    Qu, Xinghua
    Song, Wen
    Ma, Zejun
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 2118 - 2127
  • [36] Implicit Manifold Gaussian Process Regression
    Fichera, Bernardo
    Borovitskiy, Viacheslav
    Krause, Andreas
    Billard, Aude
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [37] Compressed Gaussian Process for Manifold Regression
    Guhaniyogi, Rajarshi
    Dunson, David B.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2016, 17
  • [38] Coded Distributed Gaussian Process Regression
    Zeulin, Nikita
    Galinina, Olga
    Himayat, Nageen
    Andreev, Sergey
    IEEE COMMUNICATIONS LETTERS, 2023, 27 (01) : 372 - 376
  • [39] An Intuitive Tutorial to Gaussian Process Regression
    Wang, Jie
    COMPUTING IN SCIENCE & ENGINEERING, 2023, 25 (04) : 4 - 11
  • [40] Gaussian Process Regression with Measurement Error
    Iba, Yukito
    Akaho, Shotaro
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2010, E93D (10) : 2680 - 2689