Optimal learning with Gaussians and correntropy loss

Times Cited: 27
Authors
Lv, Fusheng [1 ]
Fan, Jun [1 ]
Affiliation
[1] Hong Kong Baptist University, Department of Mathematics, Hong Kong, People's Republic of China
Keywords
Convergence rate; correntropy loss; Gaussian kernels; minimax optimality; reproducing kernel Hilbert space; quantile regression; rates; classification; criterion
DOI
10.1142/S0219530519410124
CLC Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
Correntropy-based learning has achieved great success in practice over the past decades. It originated from information-theoretic learning and provides an alternative to the classical least squares method in the presence of non-Gaussian noise. In this paper, we investigate the theoretical properties of learning algorithms generated by Tikhonov regularization schemes associated with Gaussian kernels and the correntropy loss. By choosing an appropriate scale parameter for the Gaussian kernel, we establish polynomial decay of the approximation error under a Sobolev smoothness condition. In addition, we employ a tight upper bound for the uniform covering number of the Gaussian RKHS to improve the estimate of the sample error. Combining these two results, we show that the proposed algorithm with a varying Gaussian kernel achieves the minimax rate of convergence (up to a logarithmic factor) without knowledge of the smoothness level of the regression function.
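
For a concrete picture of the estimator discussed in the abstract, the following is a minimal sketch, assuming a half-quadratic (iteratively reweighted kernel ridge) solver and the Welsch form of the correntropy-induced loss; the function and parameter names (correntropy_krr, bandwidth, sigma, lam) are illustrative assumptions and do not come from the paper.

import numpy as np

def gaussian_kernel(X, Z, bandwidth):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 * bandwidth^2))."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Z**2, axis=1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * bandwidth**2))

def correntropy_krr(X, y, bandwidth=0.5, sigma=1.0, lam=1e-2, n_iter=50):
    """Illustrative sketch: Tikhonov-regularized regression in a Gaussian RKHS with the
    correntropy-induced (Welsch) loss, solved by half-quadratic iteration.

    Objective (up to constants):
        (1/n) * sum_i sigma^2 * (1 - exp(-(y_i - f(x_i))^2 / (2 * sigma^2))) + lam * ||f||_K^2
    with f(x) = sum_j alpha_j * K(x, x_j) by the representer theorem.
    """
    n = len(y)
    K = gaussian_kernel(X, X, bandwidth)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        residual = y - K @ alpha
        # Half-quadratic weights: large residuals get exponentially small weight.
        w = np.exp(-residual**2 / (2.0 * sigma**2))
        # Weighted kernel ridge step: solve (W K + n*lam*I) alpha = W y.
        alpha_new = np.linalg.solve(np.diag(w) @ K + n * lam * np.eye(n), w * y)
        if np.linalg.norm(alpha_new - alpha) < 1e-8:
            alpha = alpha_new
            break
        alpha = alpha_new
    return alpha, K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)
    y[::20] += 5.0  # heavy-tailed contamination; the correntropy loss down-weights these points
    alpha, K = correntropy_krr(X, y)
    print("in-sample RMSE against the clean signal:",
          np.sqrt(np.mean((K @ alpha - np.sin(3 * X[:, 0]))**2)))

The reweighting step is what delivers the robustness to non-Gaussian noise mentioned in the abstract: residuals much larger than the scale parameter sigma receive near-zero weights, so outliers barely influence the fitted kernel expansion, whereas for small residuals the weights approach one and the method behaves like ordinary kernel ridge regression.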
Pages: 107-124 (18 pages)