A Provably Communication-Efficient Asynchronous Distributed Inference Method for Convex and Nonconvex Problems

Cited by: 1
Authors
Ren, Jineng [1 ]
Haupt, Jarvis [1 ]
Affiliation
[1] Univ Minnesota, Dept Elect & Comp Engn, Minneapolis, MN 55455 USA
Keywords
Signal processing algorithms; Convergence; Optimization; Distributed algorithms; Machine learning; Principal component analysis; Network topology; Communication-efficiency; asynchronous; distributed algorithm; convergence; nonconvex; strongly convex; OPTIMIZATION; ALGORITHM; CONVERGENCE; PARALLEL; DESCENT; SCALE;
DOI
10.1109/TSP.2020.2996374
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
This paper proposes and analyzes a communication-efficient distributed optimization framework for general nonconvex nonsmooth signal processing and machine learning problems under an asynchronous protocol. At each iteration, worker machines compute gradients of a known empirical loss function on their own local data, and a master machine solves a related minimization problem to update the current estimate. We prove that for nonconvex nonsmooth problems the proposed algorithm converges to a stationary point at a sublinear rate in the number of communication rounds, matching the best theoretical rate achievable for this class of problems. For problems with composite loss functions whose smooth part is strongly convex, linear convergence to the global minimum is established without any statistical assumptions on the local data. Extensive numerical experiments verify that the proposed approach indeed improves, sometimes significantly, over other state-of-the-art algorithms in terms of total communication efficiency.
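The abstract describes a master-worker exchange: workers send gradients of their local empirical losses, and the master refines the estimate by solving a small related minimization. The following is a minimal, hypothetical Python sketch of that pattern under simplifying assumptions (a least-squares local loss, a bounded-staleness acceptance rule, and a quadratic surrogate at the master whose minimizer reduces to a gradient-style step); all names, tolerances, and the surrogate form are illustrative and are not the paper's exact algorithm.

```python
# Hypothetical sketch of an asynchronous master-worker loop of the kind the abstract
# describes. The surrogate solved by the master (a simple quadratic model built from
# averaged worker gradients) and the staleness rule are illustrative assumptions.
import threading
import queue
import numpy as np

rng = np.random.default_rng(0)
d, n_workers, n_local = 5, 4, 50
step, delay_tolerance, n_rounds = 0.5, 2, 200

# Synthetic local least-squares data: worker i holds (A[i], b[i]).
A = [rng.standard_normal((n_local, d)) for _ in range(n_workers)]
x_true = rng.standard_normal(d)
b = [A_i @ x_true + 0.01 * rng.standard_normal(n_local) for A_i in A]

grad_queue = queue.Queue(maxsize=4 * n_workers)  # (worker_id, iterate_version, gradient)
x_current = np.zeros(d)
version = 0
lock = threading.Lock()
stop = threading.Event()

def local_gradient(i, x):
    """Gradient of worker i's local empirical loss (least squares here)."""
    return A[i].T @ (A[i] @ x - b[i]) / n_local

def worker(i):
    # Repeatedly snapshot the current estimate, compute the local gradient, and send it.
    while not stop.is_set():
        with lock:
            x_snapshot, v = x_current.copy(), version
        grad_queue.put((i, v, local_gradient(i, x_snapshot)))

def master():
    global x_current, version
    for _ in range(n_rounds):
        grads = []
        # Proceed once gradients from at least half the workers arrive, accepting only
        # gradients evaluated at iterates that are at most `delay_tolerance` versions old.
        while len(grads) < n_workers // 2:
            i, v, g = grad_queue.get()
            if version - v <= delay_tolerance:
                grads.append(g)
        g_avg = np.mean(grads, axis=0)
        with lock:
            # Master step: minimize the quadratic surrogate
            #   g_avg^T (x - x_k) + (1 / (2 * step)) * ||x - x_k||^2,
            # whose closed-form minimizer is this gradient-style update.
            x_current = x_current - step * g_avg
            version += 1
    stop.set()

threads = [threading.Thread(target=worker, args=(i,), daemon=True) for i in range(n_workers)]
for t in threads:
    t.start()
master()
print("estimation error:", np.linalg.norm(x_current - x_true))
```

The bounded-staleness check is what makes the loop asynchronous in practice: the master never waits for all workers, but it also refuses gradients computed at iterates that are too far out of date, which is the usual way such schemes keep delayed information from destabilizing the update.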
Pages: 3325 - 3340
Number of pages: 16
Related papers
50 records in total
  • [1] PROVABLY COMMUNICATION-EFFICIENT ASYNCHRONOUS DISTRIBUTED INFERENCE FOR CONVEX AND NONCONVEX PROBLEMS
    Ren, Jineng
    Haupt, Jarvis
    2018 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2018), 2018, : 638 - 642
  • [2] Communication-Efficient Distributed Statistical Inference
    Jordan, Michael I.
    Lee, Jason D.
    Yang, Yun
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2019, 114 (526) : 668 - 681
  • [3] Differentially Private and Communication-Efficient Distributed Nonconvex Optimization Algorithms
    Xie, Antai
    Yi, Xinlei
    Wang, Xiaofan
    Cao, Ming
    Ren, Xiaoqiang
    arXiv, 2023,
  • [4] Heterogeneity-aware and communication-efficient distributed statistical inference
    Duan, Rui
    Ning, Yang
    Chen, Yong
    BIOMETRIKA, 2022, 109 (01) : 67 - 83
  • [5] A Communication-Efficient Decentralized Newton's Method With Provably Faster Convergence
    Liu, Huikang
    Zhang, Jiaojiao
    So, Anthony Man-Cho
    Ling, Qing
    IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2023, 9 : 427 - 441
  • [6] Communication-Efficient Frank-Wolfe Algorithm for Nonconvex Decentralized Distributed Learning
    Xian, Wenhan
    Huang, Feihu
    Huang, Heng
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 10405 - 10413
  • [7] Communication-Efficient Regret-Optimal Distributed Online Convex Optimization
    Liu, Jiandong
    Zhang, Lan
    He, Fengxiang
    Zhang, Chi
    Jiang, Shanyang
    Li, Xiang-Yang
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2024, 35 (11) : 2270 - 2283
  • [8] Communication-Efficient Distributed Multiple Testing for Large-Scale Inference
    Pournaderi, Mehrdad
    Xiang, Yu
    arXiv, 2022,
  • [9] Communication-efficient distributed statistical inference on zero-inflated Poisson models
    Wan, Ran
    Bai, Yang
    STATISTICAL THEORY AND RELATED FIELDS, 2024, 8 (02) : 81 - 106
  • [10] On Communication-Efficient Asynchronous MPC with Adaptive Security
    Chopard, Annick
    Hirt, Martin
    Liu-Zhang, Chen-Da
    THEORY OF CRYPTOGRAPHY, TCC 2021, PT II, 2021, 13043 : 35 - 65