Communication-Efficient Distributed Minimax Optimization via Markov Compression

Cited by: 0
Authors
Yang, Linfeng [1 ]
Zhang, Zhen [1 ]
Che, Keqin [1 ]
Yang, Shaofu [1 ]
Wang, Suyang [2 ]
Affiliations
[1] Southeast Univ, Sch Comp Sci & Engn, Nanjing, Peoples R China
[2] Jiangsu Jinheng Informat Technol Co Ltd, Wujin, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Minimax optimization; Parameter-server framework; Communication compression; Distributed optimization;
DOI
10.1007/978-981-99-8079-6_42
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recently, the minimax problem has attracted significant attention due to its wide applications in modern machine learning, such as the training of GANs. With the exponential growth of data volumes and increasing problem sizes, the design of distributed algorithms to train high-performance models has become imperative. However, distributed algorithms often suffer from communication bottlenecks. To address this challenge, in this paper we propose a communication-efficient distributed compressed stochastic gradient descent ascent algorithm, abbreviated as DCSGDA, in a parameter-server setting. To reduce the communication cost, each client in DCSGDA transmits only compressed gradients of the primal and dual variables to the server at each iteration. In particular, we leverage a Markov compression mechanism that accommodates both unbiased and biased compressors and mitigates the negative effect of compression errors on convergence. Specifically, we show theoretically that DCSGDA still achieves linear convergence in the presence of compression errors, provided that each local objective function is strongly-convex-strongly-concave. Finally, numerical experiments demonstrate the desirable communication efficiency and efficacy of the proposed DCSGDA.
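To make the communication pattern described in the abstract concrete, below is a minimal Python sketch of one round of compressed stochastic gradient descent ascent under the standard distributed minimax form min_x max_y (1/m) sum_i f_i(x, y). It is an illustration only, not the paper's method: the Markov compression mechanism is not reproduced here, a plain biased top-k compressor stands in for it, and all identifiers (top_k, Client, sgda_round, the toy quadratic objective) are hypothetical.

    import numpy as np

    def top_k(v, k):
        # Biased top-k compressor: keep only the k largest-magnitude entries.
        # (Stand-in for the paper's Markov compression mechanism.)
        out = np.zeros_like(v)
        idx = np.argsort(np.abs(v))[-k:]
        out[idx] = v[idx]
        return out

    class Client:
        # Toy local objective f_i(x, y) = 0.5||x||^2 + b_i^T x + x^T y - 0.5||y||^2,
        # which is strongly convex in x and strongly concave in y.
        def __init__(self, b, noise=0.01, seed=0):
            self.b = b
            self.noise = noise
            self.rng = np.random.default_rng(seed)

        def stoch_grads(self, x, y):
            # Stochastic gradients: exact gradients plus Gaussian noise.
            gx = x + self.b + y + self.noise * self.rng.standard_normal(x.shape)
            gy = x - y + self.noise * self.rng.standard_normal(y.shape)
            return gx, gy

    def sgda_round(x, y, clients, lr, k):
        # One parameter-server round: each client uploads compressed primal
        # and dual gradients; the server averages them, then takes a descent
        # step in x (primal) and an ascent step in y (dual).
        gx, gy = np.zeros_like(x), np.zeros_like(y)
        for c in clients:
            grad_x, grad_y = c.stoch_grads(x, y)
            gx += top_k(grad_x, k)  # compressed primal-gradient upload
            gy += top_k(grad_y, k)  # compressed dual-gradient upload
        return x - lr * gx / len(clients), y + lr * gy / len(clients)

    rng = np.random.default_rng(1)
    d = 20
    clients = [Client(rng.standard_normal(d), seed=i) for i in range(5)]
    x, y = np.zeros(d), np.zeros(d)
    for t in range(200):
        x, y = sgda_round(x, y, clients, lr=0.1, k=5)

Each upload in the sketch sends only k of d coordinates, which is where the communication saving comes from; per the abstract, the paper's Markov compression mechanism is what keeps the resulting compression errors from spoiling linear convergence.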
Pages: 540-551
Page count: 12