On the time complexity of regularized least square

Cited: 1
Authors
Gori, Marco [1]
Affiliation
[1] Univ Siena, Dipartimento Ingn Informaz, I-53100 Siena, Italy
Source
NEURAL NETS WIRN11 | 2011 / Vol. 234
Keywords
Computational complexity; condition number; kernel machines; regularized least square
DOI
10.3233/978-1-60750-972-1-85
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In the general framework of kernel machines, the hinge loss has become more popular than the square loss, partly for computational reasons. Since learning with the square loss reduces to a linear system of equations, in the case of very large tasks where the number of examples is proportional to the input dimension, solving square-loss regularization costs O(l^3), where l is the number of examples, and it has been claimed that learning is unaffordable for large-scale problems. However, this is only an upper bound, and in-depth experimental analyses indicate that for linear kernels (or in other cases where the kernel matrix is sparse or admits a decomposition known a priori), regularized least square (RLS) is substantially faster than the support vector machine (SVM) at both training and test time. In this paper, we give theoretical results supporting those experimental findings by proving that there are conditions under which learning with square-loss regularization is Θ(l) even for large input dimensions d with d ≃ l.
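The abstract's point about the linear-kernel case can be sketched in code (a minimal illustration under standard RLS assumptions, not the paper's actual construction; the function names and the regularization constant `lam` are illustrative): the dual solve works on an l x l system, which is O(l^3) by a direct method, while for a linear kernel K = X Xᵀ the equivalent primal system is only d x d.

```python
import numpy as np

def rls_dual(K, y, lam):
    """Dual RLS: solve (K + lam*l*I) alpha = y.
    Direct solution of this l x l system is O(l^3)."""
    l = K.shape[0]
    return np.linalg.solve(K + lam * l * np.eye(l), y)

def rls_primal_linear(X, y, lam):
    """Primal RLS for a linear kernel: solve (X^T X + lam*l*I) w = X^T y,
    a d x d system -- much cheaper than the dual when d << l."""
    l, d = X.shape
    return np.linalg.solve(X.T @ X + lam * l * np.eye(d), X.T @ y)

# The two formulations agree for the linear kernel: w = X^T alpha,
# since multiplying the dual system by X^T recovers the primal one.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = X @ rng.standard_normal(5) + 0.01 * rng.standard_normal(200)
alpha = rls_dual(X @ X.T, y, lam=0.1)
w = rls_primal_linear(X, y, lam=0.1)
assert np.allclose(X.T @ alpha, w, atol=1e-6)
```

The paper's stronger claim, that Θ(l) is achievable even when d ≃ l, rests on the theoretical conditions it develops and is not captured by this sketch.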
Pages: 85-96
Page count: 12
Related Papers
50 in total
  • [21] Application of integral operator for regularized least-square regression
    Sun, Hongwei
    Wu, Qiang
    [J]. MATHEMATICAL AND COMPUTER MODELLING, 2009, 49 (1-2) : 276 - 285
  • [22] Estimating Parameter of Influenza Transmission using Regularized Least Square
    Nuraini, N.
    Syukriah, Y.
    Indratno, S. W.
    [J]. SYMPOSIUM ON BIOMATHEMATICS (SYMOMATH 2013), 2014, 1587 : 74 - 77
  • [23] A regularized least square based discriminative projections for feature extraction
    Yang, Wankou
    Sun, Changyin
    Zheng, Wenming
    [J]. NEUROCOMPUTING, 2016, 175 : 198 - 205
  • [24] Indoor Localization via Discriminatively Regularized Least Square Classification
    Ouyang, Robin
    Wong, Albert
    Woo, Kam
    [J]. INTERNATIONAL JOURNAL OF WIRELESS INFORMATION NETWORKS, 2011, 18 (02) : 57 - 72
  • [25] A study on regularized Weighted Least Square Support Vector Classifier
    Yang, Bo
    Shao, Quan-ming
    Pan, Li
    Li, Wen-bin
    [J]. PATTERN RECOGNITION LETTERS, 2018, 108 : 48 - 55
  • [26] Online learning with kernel regularized least mean square algorithms
    Fan, Haijin
    Song, Qing
    Shrestha, Sumit Bam
    [J]. KNOWLEDGE-BASED SYSTEMS, 2014, 59 : 21 - 32
  • [27] Incremental localization algorithm based on regularized iteratively reweighted least square
    Yan, Xiaoyong
    Yang, Zhong
    Liu, Yu
    Xu, Xiaoduo
    Li, Huijun
    [J]. FOUNDATIONS OF COMPUTING AND DECISION SCIENCES, 2016, 41 (03) : 183 - 196
  • [28] Error analysis of regularized least-square regression with Fredholm kernel
    Tao, Yanfang
    Yuan, Peipei
    Song, Biqin
    [J]. NEUROCOMPUTING, 2017, 249 : 237 - 244
  • [29] Learning rates of least-square regularized regression with polynomial kernels
    Li BingZheng
    Wang GuoMao
    [J]. SCIENCE IN CHINA SERIES A-MATHEMATICS, 2009, 52 (04): : 687 - 700