Regularized least square regression with dependent samples

Cited by: 59
Authors
Sun, Hongwei [2 ,3 ]
Wu, Qiang [1 ]
Affiliations
[1] Duke Univ, Dept Stat Sci, Inst Genome Sci & Policy, Durham, NC 27708 USA
[2] Jinan Univ, Sch Sci, Jinan 250022, Peoples R China
[3] Beijing Normal Univ, Sch Math Sci, Beijing 100875, Peoples R China
Keywords
Regularized least square regression; Integral operator; Strong mixing condition; Capacity independent error bounds; SUPPORT VECTOR MACHINES; LEARNING-THEORY; RATES;
DOI
10.1007/s10444-008-9099-y
CLC Number
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
In this paper we study the learning performance of regularized least square regression with alpha-mixing and phi-mixing inputs. Capacity-independent error bounds and learning rates are derived by means of an integral operator technique. Even for independent samples our learning rates improve on those in the literature. The results are sharp in the sense that when the mixing conditions are strong enough the rates are shown to be close to or the same as those for learning with independent samples. They also reveal interesting phenomena of learning with dependent samples: (i) dependent samples contain less information and lead to worse error bounds than independent samples; (ii) the influence of the dependence between samples on the learning process decreases as the smoothness of the target function increases.
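The abstract's analysis concerns the standard regularized least square (kernel ridge) estimator in an RKHS, f_{z,lambda} = argmin_f (1/m) sum_i (f(x_i) - y_i)^2 + lambda ||f||_K^2, whose coefficients solve the linear system (K + m*lambda*I) c = y. The paper itself is theoretical; the following is only a minimal numerical sketch of that estimator (Gaussian kernel, parameters `lam` and `sigma` are illustrative choices, not from the paper):

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gaussian (RBF) kernel: K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def rls_fit(X, y, lam=0.1, sigma=1.0):
    # Regularized least squares in an RKHS: solve (K + m*lam*I) c = y,
    # which gives the estimator f_{z,lam}(x) = sum_i c_i K(x_i, x).
    m = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + m * lam * np.eye(m), y)

def rls_predict(X_train, c, X_new, sigma=1.0):
    # Evaluate the fitted function at new points.
    return gaussian_kernel(X_new, X_train, sigma) @ c

# Example: recover a smooth target from noisy (here i.i.d.) samples.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 50)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.05 * rng.standard_normal(50)
c = rls_fit(X, y, lam=1e-3, sigma=0.2)
pred = rls_predict(X, c, X, sigma=0.2)
```

Note that the sampling above is independent for simplicity; the paper's contribution is bounding the error of this same estimator when the samples are alpha-mixing or phi-mixing rather than i.i.d.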
Pages: 175-189 (15 pages)
Related Papers (50 total)
  • [31] Variable Regularized Square Root Recursive Least Square Method
    Dokoupil, Jakub
    Burlak, Vladimir
    [J]. 11TH IFAC/IEEE INTERNATIONAL CONFERENCE ON PROGRAMMABLE DEVICES AND EMBEDDED SYSTEMS (PDES 2012), 2012,
  • [32] Error Analysis of Least-Squares lq-Regularized Regression Learning Algorithm With the Non-Identical and Dependent Samples
    Guo, Qin
    Ye, Peixin
    [J]. IEEE ACCESS, 2018, 6 : 43824 - 43829
  • [33] REGULARIZED LEAST SQUARE ALGORITHM WITH TWO KERNELS
    Sun, Hongwei
    Liu, Ping
    [J]. INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2012, 10 (05)
  • [34] Least Square Regression Learning with Data Dependent Hypothesis and Coefficient Regularization
    Sheng, Bao-Huai
    Ye, Pei-Xin
    [J]. JOURNAL OF COMPUTERS, 2011, 6 (04) : 671 - 675
  • [35] The Application of Partial Least Square Regression in Dimensional Reduction Analysis of Spirit Samples
    Zuo Chen
    Zhong Qiding
    Xie Yihui
    Li Qiu
    [J]. PLS '09: PROCEEDINGS OF THE 6TH INTERNATIONAL CONFERENCE ON PARTIAL LEAST SQUARES AND RELATED METHODS, 2009, : 360 - 364
  • [36] Regularized moving least-square method and regularized improved interpolating moving least-square method with nonsingular moment matrices
    Wang, Qiao
    Zhou, Wei
    Cheng, Yonggang
    Ma, Gang
    Chang, Xiaolin
    Miao, Yu
    Chen, E.
    [J]. APPLIED MATHEMATICS AND COMPUTATION, 2018, 325 : 120 - 145
  • [37] Co-Regularized Least Square Regression for Multi-View Multi-Class Classification
    Lan, Chao
    Deng, Yujie
    Li, Xiaoli
    Huan, Jun
    [J]. 2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 342 - 347
  • [38] Local Regularized Least-Square Dimensionality Reduction
    Jia, Yangqing
    Zhang, Changshui
    [J]. 19TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOLS 1-6, 2008, : 3670 - 3673
  • [39] Regularized Least Square based Identification for Wiener Systems
    Saini, Vikram
    Dewan, Lillie
    [J]. 2016 11TH INTERNATIONAL CONFERENCE ON INDUSTRIAL AND INFORMATION SYSTEMS (ICIIS), 2016, : 861 - 866
  • [40] A Laplacian Regularized Least Square Algorithm for Motion Tomography
    Ouerghi, Meriam
    Zhang, Fumin
    [J]. 2021 AMERICAN CONTROL CONFERENCE (ACC), 2021, : 2017 - 2022