Regularized least square regression with dependent samples

Cited: 59
Authors
Sun, Hongwei [2 ,3 ]
Wu, Qiang [1 ]
Affiliations
[1] Duke Univ, Dept Stat Sci, Inst Genome Sci & Policy, Durham, NC 27708 USA
[2] Jinan Univ, Sch Sci, Jinan 250022, Peoples R China
[3] Beijing Normal Univ, Sch Math Sci, Beijing 100875, Peoples R China
Keywords
Regularized least square regression; Integral operator; Strong mixing condition; Capacity independent error bounds; SUPPORT VECTOR MACHINES; LEARNING-THEORY; RATES
DOI
10.1007/s10444-008-9099-y
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
In this paper we study the learning performance of regularized least square regression with α-mixing and φ-mixing inputs. The capacity independent error bounds and learning rates are derived by means of an integral operator technique. Even for independent samples our learning rates improve those in the literature. The results are sharp in the sense that when the mixing conditions are strong enough the rates are shown to be close to or the same as those for learning with independent samples. They also reveal interesting phenomena of learning with dependent samples: (i) dependent samples contain less information and lead to worse error bounds than independent samples; (ii) the influence of the dependence between samples on the learning process decreases as the smoothness of the target function increases.
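The scheme studied in the abstract is regularized least square regression in a reproducing kernel Hilbert space, i.e. kernel ridge regression: minimize the empirical squared loss plus λ times the squared RKHS norm, whose solution is obtained by solving (K + λmI)c = y for the coefficient vector. A minimal sketch follows; the Gaussian kernel, the parameter values, and the synthetic i.i.d. data are illustrative assumptions, not taken from the paper (whose focus is dependent, mixing samples).

```python
import numpy as np

# Kernel ridge regression = regularized least square regression in an RKHS.
# Gaussian kernel and synthetic data are illustrative assumptions only.

def gaussian_kernel(X, Y, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma**2))

def krr_fit(X, y, lam=0.1, sigma=1.0):
    """Solve (K + lam * m * I) c = y; f(x) = sum_i c_i K(x, x_i)."""
    m = len(y)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def krr_predict(X_train, c, X_test, sigma=1.0):
    """Evaluate the regularized estimator at new points."""
    return gaussian_kernel(X_test, X_train, sigma) @ c

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(100)

c = krr_fit(X, y, lam=0.01)
y_hat = krr_predict(X, c, X)
print(f"training MSE: {np.mean((y - y_hat) ** 2):.4f}")
```

The regularization parameter λ controls the bias-variance trade-off that drives the learning rates analyzed in the paper; with dependent (mixing) samples, the effective sample size shrinks and λ must be chosen accordingly.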
Pages: 175-189 (15 pages)
Related papers
50 records total
  • [21] The consistency of least-square regularized regression with negative association sequence
    Chen, Fen
    Zou, Bin
    Chen, Na
    [J]. INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2018, 16 (03)
  • [22] Learning rates of least-square regularized regression with polynomial kernels
    Li, BingZheng; Wang, GuoMao (Department of Mathematics, Zhejiang University, Hangzhou, China)
    [J]. Science in China (Series A: Mathematics), 2009, 52 (04) : 687 - 700
  • [23] The performance of semi-supervised Laplacian regularized regression with the least square loss
    Sheng, Baohuai
    Xiang, Daohong
    [J]. INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2017, 15 (02)
  • [24] Learning rates of least-square regularized regression with strongly mixing observation
    Zhang, Yongquan
    Cao, Feilong
    Yan, Canwei
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2012, 3 (04) : 277 - 283
  • [25] Feature selection under regularized orthogonal least square regression with optimal scaling
    Zhang, Rui
    Nie, Feiping
    Li, Xuelong
    [J]. NEUROCOMPUTING, 2018, 273 : 547 - 553
  • [27] On the time complexity of regularized least square
    Gori, Marco
    [J]. NEURAL NETS WIRN11, 2011, 234 : 85 - 96
  • [28] Regularized Partial Least Square Regression for Continuous Decoding in Brain-Computer Interfaces
    Reza Foodeh
    Saeed Ebadollahi
    Mohammad Reza Daliri
    [J]. Neuroinformatics, 2020, 18 : 465 - 477
  • [29] Error analysis for lq-coefficient regularized moving least-square regression
    Guo, Qin
    Ye, Peixin
    [J]. JOURNAL OF INEQUALITIES AND APPLICATIONS, 2018,