A UNIFIED VIEW OF DECENTRALIZED ALGORITHMS FOR SPARSE LINEAR REGRESSION

Cited: 0
Authors
Maros, Marie [1 ]
Scutari, Gesualdo [1 ]
Affiliations
[1] Purdue Univ, Sch Ind Engn, W Lafayette, IN 47907 USA
Keywords
Decentralized algorithms; linear convergence; LASSO estimator; mesh networks; statistical error; distributed optimization; gradient methods; convergence; consensus
DOI
10.1109/CAMSAP58249.2023.10403504
Chinese Library Classification
TP39 (applications of computers)
Discipline Codes
081203; 0835
Abstract
We investigate high-dimensional sparse linear regression via the projected LASSO estimator, computed from datasets distributed across a network of agents. In this regime the ambient dimension may scale exponentially with the total sample size. We develop a unified algorithmic framework that encompasses a variety of distributed algorithms, new and old, including primal, primal-dual, and gradient-tracking-based methods, for which we establish both statistical and computational guarantees. Our findings demonstrate that, under standard data-generation models, suitable network connectivity, and appropriate algorithm tuning, the studied schemes converge at a linear rate to statistically optimal estimates. However, the convergence rates' dependence on the ambient dimension differs markedly across methods, ranging from linear scaling to no scaling at all. This is a new phenomenon, distinctive of the high-dimensional setting and revealed by this study.
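The gradient-tracking schemes the abstract refers to admit a compact illustration. Below is a minimal sketch of a proximal gradient-tracking iteration for the decentralized LASSO, not the paper's exact method or tuning: the ring network, lazy-Metropolis mixing matrix `W`, step size, and synthetic data are all assumptions made for the example.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def decentralized_lasso(A_list, b_list, W, lam, alpha, iters=5000):
    """Proximal gradient tracking for min (1/m) sum_i 0.5||A_i x - b_i||^2 + lam||x||_1.
    Each agent i holds only (A_i, b_i); W is a doubly stochastic mixing matrix."""
    m, d = len(A_list), A_list[0].shape[1]
    x = np.zeros((m, d))  # row i = agent i's local estimate
    grads = np.array([A.T @ (A @ xi - b) for A, b, xi in zip(A_list, b_list, x)])
    y = grads.copy()  # trackers of the network-average gradient
    for _ in range(iters):
        # Mix with neighbors, take a tracked-gradient step, then soft-threshold.
        x_new = soft_threshold(W @ x - alpha * y, alpha * lam)
        grads_new = np.array([A.T @ (A @ xi - b)
                              for A, b, xi in zip(A_list, b_list, x_new)])
        y = W @ y + grads_new - grads  # gradient-tracking update
        x, grads = x_new, grads_new
    return x

# Synthetic instance: 4 agents on a ring, sparse ground truth of dimension 10.
rng = np.random.default_rng(0)
m, n_i, d = 4, 25, 10
x_star = np.zeros(d)
x_star[:2] = [1.0, -1.5]
A_list = [rng.standard_normal((n_i, d)) for _ in range(m)]
b_list = [A @ x_star + 0.01 * rng.standard_normal(n_i) for A in A_list]
# Lazy-Metropolis weights on a 4-node ring (symmetric, doubly stochastic).
W = 0.5 * np.eye(m) + 0.25 * (np.roll(np.eye(m), 1, axis=0)
                              + np.roll(np.eye(m), -1, axis=0))
X = decentralized_lasso(A_list, b_list, W, lam=0.5, alpha=0.002)
x_hat = X.mean(axis=0)
```

After enough iterations the agents' iterates reach consensus and the common estimate recovers the support of `x_star`; the linear rate at which this happens, and its dependence on the ambient dimension `d`, is exactly what the paper's unified analysis quantifies across primal, primal-dual, and gradient-tracking variants.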
Pages: 471-475 (5 pages)
Related Papers
(showing 10 of 50 records)
  • [1] Incremental Sparse Linear Regression Algorithms for Face Recognition
    Liu, Xiaolan
    Liu, Jiao
    Kong, Zhaoming
    Yang, Xiaowei
    [J]. PROCEEDINGS OF 2018 TENTH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2018, : 69 - 74
  • [2] Simple Algorithms for Sparse Linear Regression with Uncertain Covariates
    Chen, Yudong
    Caramanis, Constantine
    [J]. 2012 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2012, : 413 - 415
  • [3] Unified Bayesian theory of sparse linear regression with nuisance parameters
    Jeong, Seonghyun
    Ghosal, Subhashis
    [J]. ELECTRONIC JOURNAL OF STATISTICS, 2021, 15 (01): : 3040 - 3111
  • [4] Fundamental limits and algorithms for sparse linear regression with sublinear sparsity
    Truong, Lan V.
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [5] Algorithms for Robust Linear Regression by Exploiting the Connection to Sparse Signal Recovery
    Jin, Yuzhe
    Rao, Bhaskar D.
    [J]. 2010 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2010, : 3830 - 3833
  • [6] Reducing Complexity of Echo State Networks with Sparse Linear Regression Algorithms
    Ceperic, Vladimir
    Baric, Adrijan
    [J]. 2014 UKSIM-AMSS 16TH INTERNATIONAL CONFERENCE ON COMPUTER MODELLING AND SIMULATION (UKSIM), 2014, : 26 - 31
  • [7] Sparse PCA from Sparse Linear Regression
    Bresler, Guy
    Park, Sung Min
    Persu, Madalina
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [8] Distributed Sparse Linear Regression
    Mateos, Gonzalo
    Bazerque, Juan Andres
    Giannakis, Georgios B.
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2010, 58 (10) : 5262 - 5276
  • [9] Scaled sparse linear regression
    Sun, Tingni
    Zhang, Cun-Hui
    [J]. BIOMETRIKA, 2012, 99 (04) : 879 - 898
  • [10] Scalable Algorithms for the Sparse Ridge Regression
    Xie, Weijun
    Deng, Xinwei
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2020, 30 (04) : 3359 - 3386