GAUSSIAN PROCESS REGRESSION IN THE FLAT LIMIT

Cited: 1
Authors
Barthelme, Simon [1]
Amblard, Pierre-Olivier [1]
Tremblay, Nicolas [1]
Usevich, Konstantin [1]
Affiliations
[1] CNRS, GIPSA-Lab, Grenoble, France
Source
ANNALS OF STATISTICS | 2023, Vol. 51, No. 6
Keywords
Gaussian processes; flat limit; splines; multivariate polynomials; MULTIVARIATE INTERPOLATION; MATRICES;
DOI
10.1214/23-AOS2336
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Gaussian process (GP) regression is a fundamental tool in Bayesian statistics. It is also known as kriging and is the Bayesian counterpart to frequentist kernel ridge regression. Most theoretical work on GP regression has focused on large-n asymptotics, characterising the behaviour of GP regression as the amount of data increases. Fixed-sample analysis is much more difficult outside of simple cases, such as locations on a regular grid.
In this work, we perform a fixed-sample analysis first studied in the context of approximation theory by Driscoll and Fornberg (2002), called the "flat limit". In flat-limit asymptotics, the goal is to characterise kernel methods as the length-scale of the kernel function tends to infinity, so that kernels appear flat over the range of the data. Surprisingly, this limit is well-defined and displays interesting behaviour: Driscoll and Fornberg showed that radial basis interpolation converges in the flat limit to polynomial interpolation if the kernel is Gaussian. Subsequent work showed that this holds true in the multivariate setting as well, but that kernels other than the Gaussian may have (polyharmonic) splines as the limit interpolant.
Leveraging recent results on the spectral behaviour of kernel matrices in the flat limit, we study the flat limit of Gaussian process regression. Results show that Gaussian process regression tends in the flat limit to (multivariate) polynomial regression, or (polyharmonic) spline regression, depending on the kernel. Importantly, this holds for both the predictive mean and the predictive variance, so that the posterior predictive distributions become equivalent.
For the proof, we introduce the notion of prediction-equivalence of semiparametric models, which lets us state flat-limit results in a compact and unified manner.
Our results have practical consequences: for instance, they show that optimal GP predictions in the sense of leave-one-out loss may occur at very large length-scales, which would be invisible to current implementations because of numerical difficulties.
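As a quick numerical illustration of the interpolation result recalled in the abstract (the noise-free GP posterior mean with a Gaussian kernel approaching the polynomial interpolant as the length-scale grows), the sketch below uses made-up 1D data; it is not code from the paper, and the specific points, target function, and length-scales are arbitrary choices.

```python
import numpy as np

# Four 1D training points and a smooth target (arbitrary toy data).
x = np.array([0.0, 0.3, 0.6, 1.0])
y = np.sin(2.0 * x)
xs = 0.45  # prediction location

def gp_interp(xs, x, y, ell):
    """Noise-free GP posterior mean (kernel interpolation) with a Gaussian kernel."""
    K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2.0 * ell**2))
    ks = np.exp(-((xs - x) ** 2) / (2.0 * ell**2))
    return ks @ np.linalg.solve(K, y)

# Degree-3 polynomial interpolant through the same four points.
poly = np.polyval(np.polyfit(x, y, 3), xs)

# The gap to the polynomial interpolant shrinks as the length-scale grows.
for ell in (2.0, 5.0, 20.0):
    print(f"ell={ell:5.1f}  |GP - poly| = {abs(gp_interp(xs, x, y, ell) - poly):.2e}")
```

Pushing the length-scale much further makes the kernel matrix numerically singular in double precision, which is the kind of numerical difficulty the abstract's last paragraph alludes to.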
Pages: 2471-2505 (35 pages)