REGULARIZED KERNEL NETWORKS WITH CONVEX P-LIPSCHITZ LOSS

Cited: 0
Authors
Pan, Weishan [1 ]
Sun, Hongwei [1 ]
Affiliations
[1] Univ Jinan, Sch Math Sci, Jinan 250022, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Learning theory; regularized kernel network; leave-one-out analysis; error bound; learning rate; CONDITIONAL QUANTILES; LEARNING RATES; REGRESSION;
DOI
10.3934/mfc.2023049
CLC number
TP301 [Theory and Methods];
Discipline code
081202;
Abstract
We propose a class of loss functions, called convex p-Lipschitz losses, which includes the hinge loss, the pinball loss, and the least squares loss, among others. For regularized kernel networks and bias-corrected regularized kernel networks with a general convex p-Lipschitz loss, we establish error analysis frameworks by employing the leave-one-out technique [16]. Under a mild source condition, which describes how well the minimizer f* of the generalization error can be approximated within the hypothesis space HK, error bounds and learning rates are derived. Moreover, our proofs also show that the bias correction method can indeed decrease the learning error.
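The abstract names three members of the convex p-Lipschitz loss class and the regularized kernel network setting. The sketch below is a hypothetical illustration, not the authors' code: the three losses are standard, and for the least squares case the regularized kernel network has the usual closed-form (kernel ridge) solution. The Gaussian kernel, bandwidth, sample, and regularization parameter are all assumptions made for the example.

```python
import numpy as np

# Illustrative only (not from the paper): the three losses named in the
# abstract, each convex in the prediction f.

def hinge_loss(y, f):
    """Hinge loss for classification, y in {-1, +1}; 1-Lipschitz in f."""
    return np.maximum(0.0, 1.0 - y * f)

def pinball_loss(y, f, tau=0.5):
    """Pinball (quantile) loss at level tau; 1-Lipschitz in f."""
    r = y - f
    return np.where(r >= 0, tau * r, (tau - 1.0) * r)

def least_squares_loss(y, f):
    """Least squares loss; Lipschitz only on bounded sets (p = 2 growth)."""
    return (y - f) ** 2

def gaussian_kernel(X1, X2, sigma=1.0):
    """Assumed kernel for the example: Gaussian RBF."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# A regularized kernel network fits f(x) = sum_j c_j K(x_j, x) by minimizing
# (1/m) sum_i loss(y_i, f(x_i)) + lam * ||f||_K^2.  With the least squares
# loss the coefficients have the closed form (K + m*lam*I) c = y.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(50)

lam = 1e-3                                   # assumed regularization parameter
K = gaussian_kernel(X, X)
c = np.linalg.solve(K + len(y) * lam * np.eye(len(y)), y)
f_hat = K @ c                                # fitted values on the sample
```

With the hinge or pinball loss there is no closed form; the same regularized objective would instead be minimized numerically (e.g. by subgradient descent over the coefficients c).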
Pages: 12