A Provably Communication-Efficient Asynchronous Distributed Inference Method for Convex and Nonconvex Problems

Cited by: 1
Authors
Ren, Jineng [1 ]
Haupt, Jarvis [1 ]
Affiliations
[1] Univ Minnesota, Dept Elect & Comp Engn, Minneapolis, MN 55455 USA
Keywords
Signal processing algorithms; Convergence; Optimization; Distributed algorithms; Machine learning; Principal component analysis; Network topology; Communication-efficiency; asynchronous; distributed algorithm; convergence; nonconvex; strongly convex; OPTIMIZATION; ALGORITHM; CONVERGENCE; PARALLEL; DESCENT; SCALE;
DOI
10.1109/TSP.2020.2996374
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
This paper proposes and analyzes a communication-efficient distributed optimization framework for general nonconvex nonsmooth signal processing and machine learning problems under an asynchronous protocol. At each iteration, worker machines compute gradients of a known empirical loss function using their own local data, and a master machine solves a related minimization problem to update the current estimate. We prove that for nonconvex nonsmooth problems, the proposed algorithm converges to a stationary point with a sublinear rate over the number of communication rounds, coinciding with the best theoretical rate that can be achieved for this class of problems. Linear convergence to a global minimum is established without any statistical assumptions on the local data for problems characterized by composite loss functions whose smooth part is strongly convex. Extensive numerical experiments verify that the performance of the proposed approach indeed improves - sometimes significantly - over other state-of-the-art algorithms in terms of total communication efficiency.
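To make the master/worker pattern described in the abstract concrete, below is a minimal Python sketch of a generic asynchronous scheme of this flavor: workers return (possibly stale) gradients of their local empirical losses, and the master combines them and performs a proximal-style update of the shared estimate. This is not the paper's algorithm; the synthetic least-squares-plus-l1 problem, the bounded-delay model, the step size, and all names (local_grad, prox_l1) are illustrative assumptions.

```python
# Hedged sketch (NOT the authors' exact method) of an asynchronous master/worker
# scheme: workers compute gradients of local empirical losses; the master combines
# (possibly stale) gradients and updates the estimate via a proximal-style step,
# standing in for the "related minimization problem" mentioned in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n_workers, n_samples, dim = 4, 50, 10
lam, step, n_rounds, max_delay = 0.01, 0.1, 200, 2

# Synthetic local data: each worker i holds its own (A_i, b_i).
A = [rng.standard_normal((n_samples, dim)) for _ in range(n_workers)]
x_true = rng.standard_normal(dim)
b = [A_i @ x_true + 0.01 * rng.standard_normal(n_samples) for A_i in A]

def local_grad(i, x):
    """Gradient of worker i's smooth empirical loss 0.5/n * ||A_i x - b_i||^2."""
    return A[i].T @ (A[i] @ x - b[i]) / n_samples

def prox_l1(v, t):
    """Soft-thresholding: proximal operator of t * lam * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

x = np.zeros(dim)
stale_grads = [local_grad(i, x) for i in range(n_workers)]  # last gradient received from each worker
last_update = [0] * n_workers

for k in range(n_rounds):
    # Asynchrony: only a random subset of workers reports each round; the others
    # contribute bounded-delay stale gradients (a common asynchronous-protocol model).
    for i in range(n_workers):
        if rng.random() < 0.7 or k - last_update[i] >= max_delay:
            stale_grads[i] = local_grad(i, x)
            last_update[i] = k
    g = np.mean(stale_grads, axis=0)
    # Master update: one proximal-gradient step on the composite objective.
    x = prox_l1(x - step * g, step)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The bounded-delay bookkeeping (max_delay) is what distinguishes this toy loop from a synchronous scheme: the master never waits for all workers, it only guarantees that no gradient it uses is older than a fixed number of rounds.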
Pages: 3325-3340
Number of pages: 16