Communication-Efficient Distributed Minimax Optimization via Markov Compression

Cited by: 0
Authors
Yang, Linfeng [1 ]
Zhang, Zhen [1 ]
Che, Keqin [1 ]
Yang, Shaofu [1 ]
Wang, Suyang [2 ]
Affiliations
[1] Southeast Univ, Sch Comp Sci & Engn, Nanjing, Peoples R China
[2] Jiangsu Jinheng Informat Technol Co Ltd, Wujin, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Minimax optimization; Parameter-server framework; Communication compression; Distributed optimization;
DOI
10.1007/978-981-99-8079-6_42
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Recently, the minimax problem has attracted considerable attention due to its wide applications in modern machine learning, such as generative adversarial networks (GANs). With the exponential growth of data volumes and increasing problem sizes, designing distributed algorithms to train high-performance models has become imperative. However, distributed algorithms often suffer from communication bottlenecks. To address this challenge, we propose a communication-efficient distributed compressed stochastic gradient descent-ascent algorithm, abbreviated as DCSGDA, in a parameter-server setting. To reduce the communication cost, each client in DCSGDA transmits compressed gradients of the primal and dual variables to the server at each iteration. In particular, we leverage a Markov compression mechanism, compatible with both unbiased and biased compressors, to mitigate the negative effect of compression errors on convergence. Specifically, we prove that DCSGDA still achieves linear convergence in the presence of compression errors, provided that the local objective functions are strongly convex-strongly concave. Finally, numerical experiments demonstrate the desirable communication efficiency and efficacy of the proposed DCSGDA.
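
For intuition, the Python sketch below illustrates the general pattern the abstract describes: each client compresses the difference between its fresh stochastic gradient and a reference state that the server tracks identically, so only compressed innovations cross the network, and the server performs a descent step in the primal variable and an ascent step in the dual variable. The top-k compressor, the toy quadratic saddle objective, and the reference-gradient bookkeeping standing in for the Markov compression mechanism are illustrative assumptions, not the authors' exact DCSGDA.

# Minimal illustrative sketch (not the authors' exact DCSGDA): clients send
# compressed gradient innovations to a server that runs a descent-ascent step.
import numpy as np

def compress_topk(v, k):
    # Biased top-k compressor: keep only the k largest-magnitude entries.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

# Toy strongly-convex-strongly-concave saddle problem per client i (assumed):
#   f_i(x, y) = 0.5*||x||^2 + x^T B_i y - 0.5*||y||^2
rng = np.random.default_rng(0)
n_clients, d, k, lr, T = 4, 20, 5, 0.05, 400
B = [0.1 * rng.standard_normal((d, d)) for _ in range(n_clients)]

x, y = rng.standard_normal(d), rng.standard_normal(d)
# Reference gradients kept in sync on client and server; only their
# compressed increments (a Markov-style state) are ever transmitted.
gx_ref = [np.zeros(d) for _ in range(n_clients)]
gy_ref = [np.zeros(d) for _ in range(n_clients)]

for t in range(T):
    gx_sum, gy_sum = np.zeros(d), np.zeros(d)
    for i in range(n_clients):
        # Local stochastic gradients (noise mimics minibatch sampling).
        gx = x + B[i] @ y + 0.01 * rng.standard_normal(d)
        gy = B[i].T @ x - y + 0.01 * rng.standard_normal(d)
        # Transmit only the compressed innovation w.r.t. the reference.
        dx = compress_topk(gx - gx_ref[i], k)
        dy = compress_topk(gy - gy_ref[i], k)
        gx_ref[i] += dx      # client updates its reference ...
        gy_ref[i] += dy
        gx_sum += gx_ref[i]  # ... and the server reconstructs the same value
        gy_sum += gy_ref[i]
    # Server step: descent in the primal x, ascent in the dual y.
    x -= lr * gx_sum / n_clients
    y += lr * gy_sum / n_clients

print("distance to the saddle point (0, 0):", np.linalg.norm(x) + np.linalg.norm(y))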
Pages: 540-551
Number of pages: 12