On the time complexity of regularized least square

Cited by: 1
Authors
Gori, Marco [1 ]
Affiliations
[1] Univ Siena, Dipartimento Ingn Informaz, I-53100 Siena, Italy
Source
NEURAL NETS WIRN11, 2011, Vol. 234
Keywords
Computational complexity; condition number; kernel machines; regularized least square;
DOI
10.3233/978-1-60750-972-1-85
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In the general framework of kernel machines, the hinge loss has become more popular than the square loss, partly for computational reasons. Since learning with the square loss reduces to solving a linear system of equations, for very large tasks in which the number of examples is proportional to the input dimension the solution of square-loss regularization costs O(l^3), where l is the number of examples, and it has been claimed that such learning is unaffordable for large-scale problems. However, this is only an upper bound, and in-depth experimental analyses indicate that for linear kernels (or in other cases where the kernel matrix is sparse or admits a decomposition known a priori), regularized least square (RLS) is substantially faster than the support vector machine (SVM) at both training and test time. In this paper, we give theoretical results supporting those experimental findings by proving that there are conditions under which learning with square-loss regularization is Θ(l) even for large input dimensions d with d ≈ l.
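To make the complexity contrast concrete, the following is a minimal NumPy sketch (not taken from the paper; the toy data, the names l, d, lam, X, y, and the lam*l regularization scaling are illustrative assumptions). It solves the same RLS problem twice: in the dual, where a dense solve of the l x l kernel system costs O(l^3), the upper bound cited in the abstract, and in the primal, available for linear kernels, where the cost O(l*d^2 + d^3) grows only linearly in l for fixed d.

    import numpy as np

    # Hypothetical toy problem: l examples in d dimensions (all names are
    # illustrative, not from the paper).
    rng = np.random.default_rng(0)
    l, d, lam = 500, 50, 1e-2
    X = rng.standard_normal((l, d))
    y = rng.standard_normal(l)

    # Dual (kernel) RLS: solve the l x l system (K + lam*l*I) c = y.
    # A dense direct solve costs O(l^3), the bound cited in the abstract.
    K = X @ X.T                                     # linear-kernel Gram matrix
    c = np.linalg.solve(K + lam * l * np.eye(l), y)

    # Primal RLS (possible because the kernel is linear): solve the d x d
    # system (X^T X + lam*l*I) w = X^T y.  Forming X^T X costs O(l*d^2) and
    # the solve O(d^3), so for fixed d the total cost grows linearly in l.
    w = np.linalg.solve(X.T @ X + lam * l * np.eye(d), X.T @ y)

    # Both routes define the same predictor, since w = X^T c.
    assert np.allclose(X @ w, K @ c)

Note that when d ≈ l both direct routes are again cubic, so the Θ(l) result stated in the abstract must rest on further structural conditions (the paper's keywords point to the condition number of the system), not on the primal/dual switch alone.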
Pages: 85-96
Page count: 12
Related Papers
50 in total
  • [1] Variable Regularized Square Root Recursive Least Square Method
    Dokoupil, Jakub
    Burlak, Vladimir
    [J]. 11TH IFAC/IEEE INTERNATIONAL CONFERENCE ON PROGRAMMABLE DEVICES AND EMBEDDED SYSTEMS (PDES 2012), 2012,
  • [2] Least Square Regularized Regression for Multitask Learning
    Xu, Yong-Li
    Chen, Di-Rong
    Li, Han-Xiong
    [J]. ABSTRACT AND APPLIED ANALYSIS, 2013,
  • [3] Least Square Regularized Regression in Sum Space
    Xu, Yong-Li
    Chen, Di-Rong
    Li, Han-Xiong
    Liu, Lu
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2013, 24 (04) : 635 - 646
  • [4] Regularized least square regression with dependent samples
    Sun, Hongwei
    Wu, Qiang
    [J]. ADVANCES IN COMPUTATIONAL MATHEMATICS, 2010, 32 (02) : 175 - 189
  • [5] Regularized Least Square Regression for Functional Data
    Li, Han
    Cao, Ying
    [J]. 2012 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING (ICAISC 2012), 2012, 12 : 166 - 171
  • [6] REGULARIZED LEAST SQUARE ALGORITHM WITH TWO KERNELS
    Sun, Hongwei
    Liu, Ping
    [J]. INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2012, 10 (05)
  • [7] Regularized moving least-square method and regularized improved interpolating moving least-square method with nonsingular moment matrices
    Wang, Qiao
    Zhou, Wei
    Cheng, Yonggang
    Ma, Gang
    Chang, Xiaolin
    Miao, Yu
    Chen, E.
    [J]. APPLIED MATHEMATICS AND COMPUTATION, 2018, 325 : 120 - 145
  • [8] Local Regularized Least-Square Dimensionality Reduction
    Jia, Yangqing
    Zhang, Changshui
    [J]. 19TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOLS 1-6, 2008, : 3670 - 3673
  • [9] REGULARIZED LEAST SQUARE KERNEL REGRESSION FOR STREAMING DATA
    Zheng, Xiaoqing
    Sun, Hongwei
    Wu, Qiang
    [J]. COMMUNICATIONS IN MATHEMATICAL SCIENCES, 2021, 19 (06) : 1533 - 1548