Gaussian process regression: Optimality, robustness, and relationship with kernel ridge regression

Cited by: 0
Authors
Wang, Wenjia [1 ,2 ]
Jing, Bing-Yi [3 ]
Affiliations
[1] Hong Kong Univ Sci & Technol Guangzhou, Guangzhou, Peoples R China
[2] Hong Kong Univ Sci & Technol, Kowloon, Clear Water Bay, Hong Kong, Peoples R China
[3] Southern Univ Sci & Technol, Dept Stat & Data Sci, Shenzhen, Peoples R China
Keywords
Gaussian process regression; Bayesian machine learning; Kernel ridge regression; Reproducing kernel Hilbert space; Space-filling designs; LATIN-HYPERCUBE DESIGNS; COVARIANCE FUNCTIONS; LINEAR PREDICTIONS; CONVERGENCE-RATES; RANDOM-FIELD; INTERPOLATION; CONTRACTION; BOUNDS; REGULARIZATION; DOMAINS
DOI
Not available
Chinese Library Classification
TP [Automation & Computer Technology]
Discipline code
0812
Abstract
Gaussian process regression is widely used in many fields, such as machine learning, reinforcement learning, and uncertainty quantification. One key component of Gaussian process regression is the unknown correlation function, which must be specified. In this paper, we investigate what happens if the correlation function is misspecified. We derive upper and lower error bounds for Gaussian process regression with possibly misspecified correlation functions. We find that when the sampling scheme is quasi-uniform, the optimal convergence rate can be attained even if the smoothness of the imposed correlation function exceeds that of the true correlation function. We also obtain convergence rates for kernel ridge regression with a misspecified kernel function, where the underlying truth is a deterministic function. Our study reveals a close connection between the convergence rates of Gaussian process regression and kernel ridge regression, which is aligned with the relationship between the sample paths of a Gaussian process and the corresponding reproducing kernel Hilbert space. This work establishes a bridge between Bayesian learning based on Gaussian processes and frequentist kernel methods with reproducing kernel Hilbert spaces.
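The setting the abstract describes can be sketched numerically: fit a Gaussian process whose imposed correlation function is smoother than the one that generated the data, on a quasi-uniform (evenly spaced) design. The sketch below uses scikit-learn's `GaussianProcessRegressor` with Matérn kernels; the specific smoothness values (`nu=1.5` for the truth, `nu=2.5` for the misspecified model), the length scale, and the nugget `alpha` are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Quasi-uniform design: 40 evenly spaced points on [0, 1].
X = np.linspace(0.0, 1.0, 40).reshape(-1, 1)

# "True" process: one draw from a GP with a Matérn kernel of
# smoothness nu = 1.5 (illustrative choice).
true_kernel = Matern(length_scale=0.2, nu=1.5)
K = true_kernel(X) + 1e-10 * np.eye(len(X))  # jitter for stability
y = rng.multivariate_normal(np.zeros(len(X)), K)

# Misspecified model: deliberately smoother Matérn kernel, nu = 2.5.
# optimizer=None keeps the (misspecified) hyperparameters fixed.
gpr = GaussianProcessRegressor(
    kernel=Matern(length_scale=0.2, nu=2.5),
    alpha=1e-6,
    optimizer=None,
)
gpr.fit(X, y)

# Posterior mean on a fine grid; despite the oversmoothed kernel,
# the predictor still tracks the data on the quasi-uniform design.
X_test = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
pred = gpr.predict(X_test)
```

In line with the abstract's result, oversmoothing the correlation function on a quasi-uniform design does not destroy the predictor; an empirical convergence study would repeat this for growing sample sizes and compare the error decay to the rate dictated by the true smoothness.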
Pages: 1-67 (67 pages)
Related papers
50 results
  • [11] Multi-Kernel Correntropy Regression: Robustness, Optimality, and Application on Magnetometer Calibration
    Li, Shilei
    Chen, Yihan
    Lou, Yunjiang
    Shi, Dawei
    Li, Lijing
    Shi, Ling
    [J]. IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2023: 1 - 13
  • [12] Understanding Gaussian process regression using the equivalent kernel
    Sollich, P
    Williams, CKI
    [J]. DETERMINISTIC AND STATISTICAL METHODS IN MACHINE LEARNING, 2005, 3635 : 211 - 228
  • [13] ON THE RELATIONSHIP BETWEEN ONLINE GAUSSIAN PROCESS REGRESSION AND KERNEL LEAST MEAN SQUARES ALGORITHMS
    Van Vaerenbergh, Steven
    Fernandez-Bes, Jesus
    Elvira, Victor
    [J]. 2016 IEEE 26TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2016,
  • [14] Optimizing etching process recipe based on Kernel Ridge Regression
    Chen, Heping
    Leclair, John
    [J]. JOURNAL OF MANUFACTURING PROCESSES, 2021, 61 : 454 - 460
  • [15] FAST AND ACCURATE GAUSSIAN KERNEL RIDGE REGRESSION USING MATRIX DECOMPOSITIONS FOR PRECONDITIONING
    Shabat, Gil
    Choshen, Era
    Ben Or, Dvir
    Carmel, Nadav
    [J]. SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2021, 42 (03) : 1073 - 1095
  • [16] Mathematical interpretations of kernel ridge regression
    Tanaka, Akira
    Imai, Hideyuki
    Kudo, Mineichi
    Miyakoshi, Masaaki
    [J]. COMPUTING ANTICIPATORY SYSTEMS, 2006, 839 : 347 - +
  • [17] Reduced rank kernel ridge regression
    Cawley, GC
    Talbot, NLC
    [J]. NEURAL PROCESSING LETTERS, 2002, 16 (03) : 293 - 302
  • [18] Distributed kernel ridge regression with communications
    Lin, Shao-Bo
    Wang, Di
    Zhou, Ding-Xuan
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [19] GHI forecasting using Gaussian process regression: kernel study
    Tolba, Hanany
    Dkhili, Nouha
    Nou, Julien
    Eynard, Julien
    Thil, Stephane
    Grieu, Stephan
    [J]. IFAC PAPERSONLINE, 2019, 52 (04): : 455 - 460