Nested Distributed Gradient Methods with Adaptive Quantized Communication

Citations: 0
Authors
Berahas, Albert S. [1]
Iakovidou, Charikleia [2]
Wei, Ermin [2]
Affiliations
[1] Lehigh Univ, Dept Ind & Syst Engn, Bethlehem, PA 18015 USA
[2] Northwestern Univ, Dept Elect & Comp Engn, Evanston, IL USA
Keywords
Distributed Optimization; Network Optimization; Optimization Algorithms; Communication; Quantization; MULTIAGENT OPTIMIZATION; SUBGRADIENT METHODS; CONVERGENCE; ALGORITHMS; CONSENSUS; TIME
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
In this paper, we consider minimizing a sum of local convex objective functions in a distributed setting where communication can be costly. We propose and analyze a class of nested distributed gradient methods with adaptive quantized communication (NEAR-DGD+Q). We show the effect of performing multiple quantized communication steps on the rate of convergence and on the size of the neighborhood of convergence, and prove R-linear convergence to the exact solution with an increasing number of consensus steps and adaptive quantization. We test the performance of the method, as well as some practical variants, on quadratic functions, and show the effects of multiple quantized communication steps in terms of iterations/gradient evaluations, communication, and cost.
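The abstract describes an iteration that alternates local gradient steps with several rounds of quantized consensus, where the quantization is refined adaptively so that the method converges to the exact solution rather than to a neighborhood. The following is a minimal, hypothetical Python sketch of such an iteration on quadratic local objectives; the ring mixing matrix W, the uniform quantizer, the step size alpha, and the schedules t_k and delta_k for the number of consensus rounds and the quantization step are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

# Sketch of a NEAR-DGD+Q-style iteration on quadratic local objectives
# f_i(x) = 0.5 * x^T A_i x - b_i^T x. All problem data, the mixing matrix,
# the quantizer, and the schedules below are illustrative assumptions.

rng = np.random.default_rng(0)
n, d = 5, 3                      # number of agents, problem dimension

# Hypothetical local quadratic data: A_i diagonal, positive definite.
A = [np.diag(rng.uniform(1.0, 2.0, d)) for _ in range(n)]
b = [rng.standard_normal(d) for _ in range(n)]

# Doubly stochastic mixing matrix for an assumed ring topology.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def grad(i, x):
    """Gradient of agent i's local quadratic objective."""
    return A[i] @ x - b[i]

def quantize(X, delta):
    """Uniform deterministic quantization with step size delta."""
    return delta * np.round(X / delta)

alpha = 0.1                      # step size (assumed)
X = np.zeros((n, d))             # row i holds agent i's iterate
for k in range(1, 201):
    # Local gradient step at every agent.
    Y = X - alpha * np.vstack([grad(i, X[i]) for i in range(n)])
    # Multiple quantized consensus (communication) rounds; the number of
    # rounds t_k grows and the quantization step delta_k shrinks with k
    # (adaptive quantization) -- this particular schedule is only an example.
    t_k = 1 + k // 50
    delta_k = 0.5 ** (1 + k // 50)
    for _ in range(t_k):
        Y = W @ quantize(Y, delta_k)
    X = Y

print("disagreement:", np.linalg.norm(X - X.mean(axis=0)))
print("avg gradient norm:",
      np.linalg.norm(sum(grad(i, X[i]) for i in range(n)) / n))
```

Tightening delta_k while increasing t_k mirrors the adaptive-quantization idea in the abstract: the error injected by communication shrinks as the iteration proceeds, which is what permits convergence to the exact minimizer instead of a neighborhood of it.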
Pages: 1519-1525
Page count: 7