Convergence rates of Kernel Conjugate Gradient for random design regression

Cited: 25
Authors
Blanchard, Gilles [1 ]
Kraemer, Nicole [2 ]
Affiliations
[1] Univ Potsdam, Math Inst, Karl Liebknecht Str 24-25, D-14476 Potsdam, Germany
[2] Staburo GmbH, Aschauer Str 30a, D-81549 Munich, Germany
Keywords
Nonparametric regression; reproducing kernel Hilbert space; conjugate gradient; partial least squares; minimax convergence rates; ALGORITHMS; OPERATORS
DOI
10.1142/S0219530516400017
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
We prove statistical rates of convergence for kernel-based least squares regression from i.i.d. data using a conjugate gradient (CG) algorithm, where regularization against over-fitting is obtained by early stopping. This method is related to Kernel Partial Least Squares, a regression method that combines supervised dimensionality reduction with least squares projection. Following the setting introduced in earlier related literature, we study so-called "fast convergence rates" depending on the regularity of the target regression function (measured by a source condition in terms of the kernel integral operator) and on the effective dimensionality of the data mapped into the kernel space. We obtain upper bounds, essentially matching known minimax lower bounds, for the L^2 (prediction) norm as well as for the stronger Hilbert norm, if the true regression function belongs to the reproducing kernel Hilbert space. If the latter assumption is not fulfilled, we obtain similar convergence rates for appropriate norms, provided additional unlabeled data are available.
Pages: 763-794
Page count: 32
Related papers
50 records total
  • [1] CONVERGENCE ANALYSIS OF KERNEL CONJUGATE GRADIENT FOR FUNCTIONAL LINEAR REGRESSION
    Gupta, N.
    Sivananthan, S.
    Sriperumbudur, B. K.
    [J]. JOURNAL OF APPLIED AND NUMERICAL ANALYSIS, 2023, 1 : 33 - 47
  • [2] Kernel conjugate gradient methods with random projections
    Lin, Junhong
    Cevher, Volkan
    [J]. APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2021, 55 : 223 - 269
  • [3] RANDOM DESIGN KERNEL REGRESSION ESTIMATOR
    Deshpande, Bhargavi
    Bhat, Sharada V.
    [J]. INTERNATIONAL JOURNAL OF AGRICULTURAL AND STATISTICAL SCIENCES, 2019, 15 (01): : 11 - 17
  • [5] CONVERGENCE RATES OF PROXIMAL GRADIENT METHODS VIA THE CONVEX CONJUGATE
    Gutman, David H.
    Peña, Javier F.
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2019, 29 (01) : 162 - 174
  • [7] Convergence rates for kernel regression in infinite-dimensional spaces
    Chowdhury, Joydeep
    Chaudhuri, Probal
    [J]. ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2020, 72 (02) : 471 - 509
  • [8] Improved convergence rates for some kernel random forest algorithms
    Isidoros, Iakovidis
    Arcozzi, Nicola
    [J]. MATHEMATICS IN ENGINEERING, 2024, 6 (02): : 305 - 338
  • [9] The Kernel Conjugate Gradient Algorithms
    Zhang, Ming
    Wang, Xiaojian
    Chen, Xiaoming
    Zhang, Anxue
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2018, 66 (16) : 4377 - 4387
  • [10] DISTRIBUTED NESTEROV GRADIENT METHODS FOR RANDOM NETWORKS: CONVERGENCE IN PROBABILITY AND CONVERGENCE RATES
    Jakovetic, Dusan
    Xavier, Joao
    Moura, Jose M. F.
    [J]. 2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014,