Model selection for regularized least-squares algorithm in learning theory

Citations: 123
Authors
De Vito, E
Caponnetto, A
Rosasco, L
Affiliations
[1] Univ Modena, Dipartimento Matemat, I-41100 Modena, Italy
[2] Ist Nazl Fis Nucl, Sez Genova, I-16146 Genoa, Italy
[3] Univ Genoa, DISI, I-16146 Genoa, Italy
[4] INFM, Sez Genova, I-16146 Genoa, Italy
Keywords
model selection; optimal choice of parameters; regularized least-squares algorithm;
DOI: 10.1007/s10208-004-0134-1
CLC classification: TP301 [Theory and Methods]
Discipline code: 081202
Abstract
We investigate the problem of model selection for learning algorithms depending on a continuous parameter. We propose a model selection procedure based on a worst-case analysis and on a data-independent choice of the parameter. For the regularized least-squares algorithm we bound the generalization error of the solution by a quantity depending on a few known constants, and we show that the corresponding model selection procedure reduces to solving a bias-variance problem. Under suitable smoothness conditions on the regression function, we estimate the optimal parameter as a function of the number of examples and we prove that this choice ensures consistency of the algorithm.
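The procedure summarized above amounts to kernel regularized least-squares with a regularization parameter fixed a priori as a function of the sample size, rather than tuned on the data. The sketch below is a minimal illustration of that idea, assuming a Gaussian kernel and a placeholder decay rate lam_n = n**(-1/2); the paper derives the actual exponent from smoothness assumptions on the regression function, so the rate, data, and function names here are purely illustrative.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix between two point sets (illustrative choice)."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2 * X1 @ X2.T)
    return np.exp(-d2 / (2 * sigma**2))

def rls_fit(X, y, lam):
    """Regularized least-squares in the RKHS: solve (K + n*lam*I) c = y."""
    n = X.shape[0]
    K = gaussian_kernel(X, X)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def rls_predict(X_train, c, X_test):
    """Evaluate the estimator f(x) = sum_i c_i K(x, x_i)."""
    return gaussian_kernel(X_test, X_train) @ c

# Synthetic regression data (for illustration only).
n = 200
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Data-independent parameter choice: lam depends only on n.
# The exponent below is a placeholder; the paper's choice depends on the
# assumed smoothness of the regression function.
lam_n = n ** (-0.5)
c = rls_fit(X, y, lam_n)
X_test = np.linspace(-1, 1, 50)[:, None]
y_hat = rls_predict(X, c, X_test)
```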
Pages: 59-85
Page count: 27
Related papers (50 in total)
  • [1] Model Selection for Regularized Least-Squares Algorithm in Learning Theory
    E. De Vito
    A. Caponnetto
    L. Rosasco
    Foundations of Computational Mathematics, 2005, 5 : 59 - 85
  • [2] Model selection for regularized least-squares classification
    Yang, HH
    Wang, XY
    Wang, Y
    Gao, HH
    ADVANCES IN NATURAL COMPUTATION, PT 1, PROCEEDINGS, 2005, 3610 : 565 - 572
  • [3] A Sparse Regularized Least-Squares Preference Learning Algorithm
    Tsivtsivadze, Evgeni
    Pahikkala, Tapio
    Airola, Antti
    Boberg, Jorma
    Salakoski, Tapio
    TENTH SCANDINAVIAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2008, 173 : 76 - 83
  • [4] Statistical and Heuristic Model Selection in Regularized Least-Squares
    Braga, Igor
    Monard, Maria Carolina
    2013 BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 2013, : 231 - 236
  • [5] The learning rate for regularized least-squares algorithm on the unit sphere
    Cao, Feilong
    Wang, Changmiao
    JOURNAL OF COMPUTATIONAL ANALYSIS AND APPLICATIONS, 2013, 15 (02) : 224 - 236
  • [6] Parallel Feature Selection for Regularized Least-Squares
    Okser, Sebastian
    Airola, Antti
    Aittokallio, Tero
    Salakoski, Tapio
    Pahikkala, Tapio
    APPLIED PARALLEL AND SCIENTIFIC COMPUTING (PARA 2012), 2013, 7782 : 280 - 294
  • [7] Optimal rates for the regularized least-squares algorithm
    Caponnetto, A.
    De Vito, E.
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2007, 7 (03) : 331 - 368
  • [8] ON ACCELERATING THE REGULARIZED ALTERNATING LEAST-SQUARES ALGORITHM FOR TENSORS
    Wang, Xiaofei
    Navasca, Carmeliza
    Kindermann, Stefan
    ELECTRONIC TRANSACTIONS ON NUMERICAL ANALYSIS, 2018, 48 : 1 - 14
  • [9] CONVERGENCE OF A REGULARIZED EUCLIDEAN RESIDUAL ALGORITHM FOR NONLINEAR LEAST-SQUARES
    Bellavia, S.
    Cartis, C.
    Gould, N. I. M.
    Morini, B.
    Toint, Ph. L.
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2010, 48 (01) : 1 - 29