FASTER KERNEL RIDGE REGRESSION USING SKETCHING AND PRECONDITIONING

Cited by: 59
Authors
Avron, Haim [1 ]
Clarkson, Kenneth L. [2 ]
Woodruff, David P. [3 ]
Affiliations
[1] Tel Aviv Univ, Dept Appl Math, IL-6997801 Tel Aviv, Israel
[2] IBM Almaden Res Ctr, San Jose, CA 95120 USA
[3] Carnegie Mellon Univ, Sch Comp Sci, Pittsburgh, PA 15213 USA
Keywords
kernel ridge regression; preconditioning; random features; parallel
DOI
10.1137/16M1105396
CLC classification
O29 [Applied Mathematics]
Subject classification
070104
Abstract
Kernel ridge regression is a simple yet powerful technique for nonparametric regression whose computation amounts to solving a linear system. This system is usually dense and highly ill-conditioned. In addition, the dimensions of the matrix are the same as the number of data points, so direct methods are unrealistic for large-scale datasets. In this paper, we propose a preconditioning technique for accelerating the solution of the aforementioned linear system. The preconditioner is based on random feature maps, such as random Fourier features, which have recently emerged as a powerful technique for speeding up and scaling the training of kernel-based methods, such as kernel ridge regression, by resorting to approximations. However, random feature maps only provide crude approximations to the kernel function, so delivering state-of-the-art results by directly solving the approximated system requires the number of random features to be very large. We show that random feature maps can be much more effective in forming preconditioners, since under certain conditions a not-too-large number of random features is sufficient to yield an effective preconditioner. We empirically evaluate our method and show it is highly effective for datasets of up to one million training examples.
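The approach described in the abstract can be sketched in a few lines: solve the KRR system (K + λI)α = y by conjugate gradient, using random Fourier features Z (so that ZZᵀ ≈ K) to build the preconditioner M = ZZᵀ + λI, applied cheaply via the Woodbury identity. This is a minimal illustration under assumed choices (Gaussian kernel, feature count `s`, and the helper names `rff`, `pcg`, `krr_rff_pcg` are all ours for the sketch), not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # exact Gaussian (RBF) kernel matrix
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def rff(X, s, sigma, rng):
    # random Fourier features (Rahimi-Recht): Z Z^T approximates the Gaussian kernel
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, s))
    b = rng.uniform(0.0, 2.0 * np.pi, size=s)
    return np.sqrt(2.0 / s) * np.cos(X @ W + b)

def pcg(matvec, b, precond, tol=1e-8, maxiter=500):
    # preconditioned conjugate gradient for symmetric positive definite systems
    x = np.zeros_like(b)
    r = b - matvec(x)
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = matvec(p)
        a = rz / (p @ Ap)
        x = x + a * p
        r = r - a * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

def krr_rff_pcg(X, y, lam, sigma, s=100, seed=0):
    # solve (K + lam I) alpha = y, preconditioned by the RFF approximation of K
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    Z = rff(X, s, sigma, rng)
    # Woodbury identity: (Z Z^T + lam I)^{-1} v
    #   = (v - Z (Z^T Z + lam I_s)^{-1} Z^T v) / lam
    C = np.linalg.cholesky(Z.T @ Z + lam * np.eye(s))
    def precond(v):
        t = np.linalg.solve(C.T, np.linalg.solve(C, Z.T @ v))
        return (v - Z @ t) / lam
    return pcg(lambda v: K @ v + lam * v, y, precond)
```

The point the paper makes is that `s` can be much smaller than what a direct "replace K by ZZᵀ and solve" approach would require: a crude ZZᵀ already flattens the spectrum of K + λI enough for CG to converge in few iterations, while the exact matrix-vector product K @ v keeps the solution unbiased.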
Pages: 1116-1138
Page count: 23