On Expected Error of Randomized Nystrom Kernel Regression

Cited: 0
Authors
Trokicic, Aleksandar [1 ]
Todorovic, Branimir [1 ]
Affiliations
[1] Univ Nis, Fac Sci & Math, Dept Comp Sci, Visegradska 33, Nish 18000, Serbia
Keywords
kernel regression; kernel matrix; Nystrom method; randomized SVD; random features
DOI
10.2298/FIL2011871T
CLC classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Kernel methods are a class of machine learning algorithms that learn and discover patterns in a high-dimensional (possibly infinite-dimensional) feature space obtained by an often nonlinear mapping of the input space. A major problem with kernel methods is their time complexity: for a data set with n input points, a kernel method costs O(n^3), which is intractable for large data sets. The method based on random Nystrom features is an approximation that reduces the time complexity to O(np^2 + p^3), where p is the number of randomly selected input data points. The O(p^3) term comes from the spectral decomposition that must be performed on a p x p Gram matrix, and if p is large even this approximate algorithm is time consuming. In this paper we apply the randomized SVD method instead of the spectral decomposition and further reduce the time complexity. The input parameters of the randomized SVD algorithm are the p x p Gram matrix and a number m < p. In this case the time complexity is O(nm^2 + p^2 m + m^3), and linear regression is performed on the m-dimensional random features. We prove that the error of a predictor learned via this method is, in expectation, almost the same as the error of the kernel predictor. Additionally, we show empirically that this predictor outperforms the one that uses only the Nystrom method.
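The pipeline described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it selects p landmark points, applies a basic randomized range-finder SVD to the p x p landmark Gram matrix to obtain a rank-m approximation, maps all n points to m-dimensional features, and fits ordinary least squares on those features. The RBF kernel, gamma value, and landmark-selection scheme are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_randomized_features(X, landmarks, m, gamma=1.0, seed=None):
    # Rank-m Nystrom features using a randomized SVD of the p x p Gram matrix,
    # costing O(p^2 m + m^3) for the decomposition instead of O(p^3).
    rng = np.random.default_rng(seed)
    p = landmarks.shape[0]
    W = rbf_kernel(landmarks, landmarks, gamma)   # p x p landmark Gram matrix
    # Randomized range finder: sketch W with m Gaussian directions, orthonormalize.
    Omega = rng.standard_normal((p, m))
    Q, _ = np.linalg.qr(W @ Omega)                # p x m orthonormal basis
    B = Q.T @ W                                   # m x p projected matrix
    Ub, s, _ = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub                                    # approximate top-m eigenvectors of W
    s = np.maximum(s, 1e-12)                      # guard against division by ~0
    # Feature map z(x) = S^{-1/2} U^T k(landmarks, x), applied to all n points.
    C = rbf_kernel(X, landmarks, gamma)           # n x p cross-kernel matrix
    return C @ U / np.sqrt(s)                     # n x m feature matrix

# Usage: linear regression on the m-dimensional random features.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
y = np.sin(X).sum(axis=1)
landmarks = X[rng.choice(200, 40, replace=False)]  # p = 40 random landmarks
Z = nystrom_randomized_features(X, landmarks, m=10, gamma=0.5, seed=1)
w, *_ = np.linalg.lstsq(Z, y, rcond=None)          # O(n m^2) least squares
pred = Z @ w
```

With p = 40 landmarks and m = 10, the regression runs on a 200 x 10 feature matrix rather than the full 200 x 200 kernel matrix, matching the O(nm^2 + p^2 m + m^3) cost stated above.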
Pages: 3871-3884 (14 pages)