Randomized Nystrom Features for Fast Regression: An Error Analysis

Cited by: 1
Authors
Trokicic, Aleksandar [1 ]
Todorovic, Branimir [1 ]
Affiliation
[1] Univ Nis, Fac Sci & Math, Nish, Serbia
Keywords
Kernel methods; Nystrom method; Randomized algorithms; Random features; Regression;
DOI
10.1007/978-3-030-21363-3_21
Chinese Library Classification
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
We consider the problem of fast approximate kernel regression. Since kernels can map input features into an infinite-dimensional space, the kernel trick is used to make the algorithms tractable. On large data sets, however, the O(n^2) time complexity is prohibitive, so various approximation methods, such as randomization, are employed. The Nystrom method, based on a random selection of m columns of the kernel matrix, is usually applied; its main advantage is that the time complexity is reduced to O(nm^2 + m^3). The space complexity is likewise reduced to O(nm), because the entire kernel matrix is never computed. The number m << n is both the size of the random subset of the input set and the dimension of the random feature vectors. The Nystrom method can be extended with the randomized SVD so that l > m columns of the kernel matrix, randomly selected without replacement, are used to construct m-dimensional random feature vectors while keeping the time complexity linear in n. The matrix computed in this way is a better approximation of the kernel matrix than the one produced by the plain Nystrom method. We prove that the expected error of the approximate kernel predictor derived via this method is approximately the same as the expected error of the exact kernel predictor. Furthermore, we show empirically that constructing the m-dimensional random feature vectors from l randomly selected columns of the kernel matrix produces a smaller error on a regression problem than using only m randomly selected columns.
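As a concrete illustration of the two constructions described in the abstract, the following Python/NumPy sketch builds the standard m-column Nystrom features and the extended l-column variant. It is not the authors' implementation: a basic randomized range finder stands in for the randomized SVD step, and the Gaussian kernel, the function names, the toy data, and the ridge regularizer lam are all illustrative assumptions.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian kernel block k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X ** 2).sum(axis=1)[:, None]
          + (Y ** 2).sum(axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def nystrom_features(X, m, gamma=1.0, seed=0):
    # standard Nystrom: m landmark columns -> m-dimensional features,
    # O(n m^2 + m^3) time, O(n m) space
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)                # n x m columns of K
    W = C[idx]                                      # m x m landmark block
    vals, vecs = np.linalg.eigh(W)
    vals = np.maximum(vals, 1e-12)                  # guard tiny/negative eigenvalues
    # Phi @ Phi.T == C @ pinv(W) @ C.T, the Nystrom approximation of K
    return C @ (vecs / np.sqrt(vals))

def nystrom_rsvd_features(X, m, l, gamma=1.0, seed=0):
    # extended variant: l > m landmark columns, then a randomized
    # rank-m eigendecomposition of the l x l block; still linear in n
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=l, replace=False)
    C = rbf_kernel(X, X[idx], gamma)                # n x l columns of K
    W = C[idx]                                      # l x l landmark block
    Q, _ = np.linalg.qr(W @ rng.standard_normal((l, m)))  # randomized range finder
    vals, vecs = np.linalg.eigh(Q.T @ W @ Q)        # small m x m eigenproblem
    vals = np.maximum(vals, 1e-12)
    U = Q @ vecs                                    # approximate top-m eigenvectors of W
    return C @ (U / np.sqrt(vals))                  # n x m random features

# toy regression problem; sizes and lam are illustrative
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)

Phi = nystrom_rsvd_features(X, m=50, l=200)
lam = 1e-3                                          # ridge regularizer
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
y_hat = Phi @ w                                     # approximate kernel ridge predictions

Since the resulting features are only m-dimensional, the ridge solve costs O(nm^2 + m^3), matching the complexity quoted in the abstract.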
Pages
249 - 257 (9 pages)