Sketching Algorithms and Lower Bounds for Ridge Regression

Cited by: 0
Authors
Kacham, Praneeth [1]
Woodruff, David P. [1]
Affiliation
[1] Carnegie Mellon Univ, Comp Sci Dept, Pittsburgh, PA 15213 USA
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We give a sketching-based iterative algorithm that computes a $1+\epsilon$ approximate solution for the ridge regression problem $\min_x \|Ax-b\|_2^2 + \lambda\|x\|_2^2$, where $A \in \mathbb{R}^{n \times d}$ with $d \geq n$. Our algorithm, for a constant number of iterations (requiring a constant number of passes over the input), improves upon earlier work (Chowdhury et al., 2018) by requiring that the sketching matrix only has a weaker Approximate Matrix Multiplication (AMM) guarantee that depends on $\epsilon$, along with a constant subspace embedding guarantee. The earlier work instead requires that the sketching matrix has a subspace embedding guarantee that depends on $\epsilon$. For example, to produce a $1+\epsilon$ approximate solution in 1 iteration, which requires 2 passes over the input, our algorithm requires the OSNAP embedding to have $m = O(n\sigma^2/(\lambda\epsilon))$ rows with a sparsity parameter $s = O(\log n)$, whereas the earlier algorithm of Chowdhury et al. (2018) with the same number of rows of OSNAP requires a sparsity $s = O(\sqrt{\sigma^2/(\lambda\epsilon)} \cdot \log n)$, where $\sigma = \|A\|_2$ is the spectral norm of the matrix $A$. We also show that this algorithm can be used to give faster algorithms for kernel ridge regression. Finally, we show that the sketch size required for our algorithm is essentially optimal for a natural framework of algorithms for ridge regression, by proving lower bounds on oblivious sketching matrices for AMM. The sketch size lower bounds for AMM may be of independent interest.
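To make the sketch-and-iterate framework concrete, below is a minimal illustrative sketch in Python (NumPy). It is not the paper's algorithm or analysis: a plain CountSketch embedding stands in for OSNAP, and the sketch size m, the iteration count, and the helper names count_sketch and sketched_ridge are assumptions made here for illustration. It shows the general pattern the abstract builds on: factor a sketched Gram matrix $A S^\top S A^\top + \lambda I$ once, then refine with exact residuals, at the cost of one pass over $A$ per iteration.

```python
import numpy as np


def count_sketch(d, m, rng):
    """Sparse CountSketch matrix S in R^{m x d}: each column has a single
    +/-1 entry in a uniformly random row (a stand-in for an OSNAP embedding)."""
    rows = rng.integers(0, m, size=d)
    signs = rng.choice([-1.0, 1.0], size=d)
    S = np.zeros((m, d))
    S[rows, np.arange(d)] = signs
    return S


def sketched_ridge(A, b, lam, m, iters=2, seed=0):
    """Illustrative sketch-and-iterate solver for
    min_x ||A x - b||_2^2 + lam * ||x||_2^2  with A in R^{n x d}, d >= n.

    The sketched Gram matrix A S^T S A^T + lam*I is formed once and used
    to correct exact residuals; each correction costs one pass over A."""
    n, d = A.shape
    rng = np.random.default_rng(seed)
    S = count_sketch(d, m, rng)                 # m x d sparse embedding
    SAt = S @ A.T                               # m x n sketch of A^T
    M = SAt.T @ SAt + lam * np.eye(n)           # A S^T S A^T + lam*I  ~  A A^T + lam*I
    y = np.zeros(n)
    for _ in range(iters):
        residual = b - (A @ (A.T @ y) + lam * y)    # exact residual (one pass over A)
        y = y + np.linalg.solve(M, residual)        # correct via the sketched system
    return A.T @ y                                  # dual-to-primal map: x = A^T y


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d, lam = 50, 500, 1.0
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)
    # Closed-form solution x* = A^T (A A^T + lam*I)^{-1} b for comparison.
    x_exact = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(n), b)
    x_approx = sketched_ridge(A, b, lam, m=400, iters=2)
    print(np.linalg.norm(x_approx - x_exact) / np.linalg.norm(x_exact))
```

The design choice mirrored here is that the expensive factorization is performed only on a sketched matrix, while accuracy comes from cheap corrections computed with exact residuals; how large the sketch must be for this to work is precisely what the paper's AMM-based analysis and lower bounds characterize.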
Pages: 10539-10556
Page count: 18
Related Papers (50 in total)
  • [1] Faster Kernel Ridge Regression Using Sketching and Preconditioning
    Avron, Haim
    Clarkson, Kenneth L.
    Woodruff, David P.
    SIAM Journal on Matrix Analysis and Applications, 2017, 38(4): 1116-1138
  • [2] An Iterative, Sketching-based Framework for Ridge Regression
    Chowdhury, Agniva
    Yang, Jiasen
    Drineas, Petros
    International Conference on Machine Learning, 2018, Vol. 80
  • [3] Bounds on the Biasing Parameter in Ridge Regression
    Kadiyala, K.
    Communications in Statistics Part A - Theory and Methods, 1981, 10(22): 2369-2372
  • [4] Randomized Sketching for Large-Scale Sparse Ridge Regression Problems
    Iyer, Chander
    Carothers, Christopher
    Drineas, Petros
    Proceedings of ScalA 2016: 7th Workshop on Latest Advances in Scalable Algorithms for Large-Scale Systems, 2016: 65-72
  • [5] RidgeSketch: A Fast Sketching Based Solver for Large Scale Ridge Regression
    Gazagnadou, Nidham
    Ibrahim, Mark
    Gower, Robert M.
    SIAM Journal on Matrix Analysis and Applications, 2022, 43(3): 1440-1468
  • [6] Scalable Algorithms for the Sparse Ridge Regression
    Xie, Weijun
    Deng, Xinwei
    SIAM Journal on Optimization, 2020, 30(4): 3359-3386
  • [7] Distributed Sketching for Randomized Optimization: Exact Characterization, Concentration, and Lower Bounds
    Bartan, Burak
    Pilanci, Mert
    IEEE Transactions on Information Theory, 2023, 69(6): 3850-3879
  • [8] Bounds on Minimum Mean Squared Error in Ridge Regression
    Baldwin, K. F.
    Hoerl, A. E.
    Communications in Statistics Part A - Theory and Methods, 1978, 7(13): 1209-1218
  • [9] Coordinate-Free Ridge-Regression Bounds
    Leamer, E. E.
    Journal of the American Statistical Association, 1981, 76(376): 842-849
  • [10] Kalman Filter and Ridge Regression Backpropagation Algorithms
    Neamah, Irtefaa A.
    Redha, Zainab Abdul
    International Journal of Nonlinear Analysis and Applications, 2021, 12(2): 485-493