CoCoA: A General Framework for Communication-Efficient Distributed Optimization

Cited by: 0
Authors
Smith, Virginia [1 ]
Forte, Simone [2 ]
Ma, Chenxin [3 ]
Takac, Martin [3 ]
Jordan, Michael I. [4 ,5 ]
Jaggi, Martin [6 ]
Affiliations
[1] Stanford Univ, Dept Comp Sci, Stanford, CA 94305 USA
[2] Swiss Fed Inst Technol, Dept Comp Sci, CH-8006 Zurich, Switzerland
[3] Lehigh Univ, Ind & Syst Engn Dept, Bethlehem, PA 18015 USA
[4] Univ Calif Berkeley, Div Comp Sci, Berkeley, CA 94720 USA
[5] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
[6] Ecole Polytech Fed Lausanne, Sch Comp & Commun Sci, CH-1015 Lausanne, Switzerland
Funding
US National Science Foundation; Swiss National Science Foundation
Keywords
Convex optimization; distributed systems; large-scale machine learning; parallel and distributed algorithms;
DOI
Not available
CLC Classification Number
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
The scale of modern datasets necessitates the development of efficient distributed optimization methods for machine learning. We present a general-purpose framework for distributed computing environments, CoCoA, that has an efficient communication scheme and is applicable to a wide variety of problems in machine learning and signal processing. We extend the framework to cover general non-strongly-convex regularizers, including L1-regularized problems like lasso, sparse logistic regression, and elastic net regularization, and show how earlier work can be derived as a special case. We provide convergence guarantees for the class of convex regularized loss minimization objectives, leveraging a novel approach in handling non-strongly-convex regularizers and non-smooth loss functions. The resulting framework has markedly improved performance over state-of-the-art methods, as we illustrate with an extensive set of experiments on real distributed datasets.
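The abstract describes CoCoA's overall pattern: each machine approximately solves a local subproblem defined on its own partition of the data, and only aggregated updates are communicated once per round. As a reading aid only, below is a minimal single-process simulation of that pattern for a ridge-regression objective. The function names, the ridge objective, the choice of coordinate descent as the local solver, and the averaging parameter gamma = 1/K are assumptions made for this sketch; it is not the authors' reference implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's code): a CoCoA-style outer loop for
# ridge regression,  min_w (1/2)||A w - b||^2 + (lam/2)||w||^2.
# Coordinates of w are partitioned across K simulated workers; each worker
# runs a few coordinate-descent passes on its own block against the shared
# residual, and the driver aggregates the resulting updates.

def local_coordinate_descent(A_k, idx, w, r, lam, passes=5):
    """Approximate local solver for one worker's block (hypothetical helper).

    A_k : columns of A owned by this worker (n x |idx|)
    idx : global indices of the worker's coordinates
    w   : current global iterate (read-only during the round)
    r   : current global residual A w - b (read-only during the round)
    Returns (delta_w_block, delta_residual) for the driver to aggregate.
    """
    dw = np.zeros(len(idx))
    dr = np.zeros_like(r)
    for _ in range(passes):
        for j_local, j in enumerate(idx):
            a_j = A_k[:, j_local]
            # Exact minimization over coordinate j of the local model.
            grad = a_j @ (r + dr) + lam * (w[j] + dw[j_local])
            step = grad / (a_j @ a_j + lam)
            dw[j_local] -= step
            dr -= step * a_j
    return dw, dr

def cocoa_round(A, b, w, r, blocks, lam, gamma):
    """One communication round: K local solves, then one aggregation."""
    dr_total = np.zeros_like(r)
    for idx in blocks:  # in a real deployment these run in parallel
        dw, dr = local_coordinate_descent(A[:, idx], idx, w, r, lam)
        w[idx] += gamma * dw
        dr_total += gamma * dr
    return w, r + dr_total

rng = np.random.default_rng(0)
n, d, K, lam = 200, 40, 4, 0.1
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

w = np.zeros(d)
r = A @ w - b
blocks = np.array_split(np.arange(d), K)
for t in range(30):
    # gamma = 1/K corresponds to conservatively averaging the K block updates.
    w, r = cocoa_round(A, b, w, r, blocks, lam, gamma=1.0 / K)
obj = 0.5 * r @ r + 0.5 * lam * w @ w
print(f"objective after 30 rounds: {obj:.4f}")
```

Note that only w-block updates and residual deltas cross the (simulated) machine boundary; the per-worker data columns never do, which is the communication saving the abstract refers to.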
Pages: 49
Related Papers (50 in total; first 10 shown)
  • [1] Communication-Efficient Distributed PCA by Riemannian Optimization. Huang, Long-Kai; Pan, Sinno Jialin. International Conference on Machine Learning, Vol. 119, 2020.
  • [2] Double Quantization for Communication-Efficient Distributed Optimization. Huang, Longbo. Proceedings of the 13th EAI International Conference on Performance Evaluation Methodologies and Tools (VALUETOOLS 2020), 2020, pp. 2-2.
  • [3] Double Quantization for Communication-Efficient Distributed Optimization. Yu, Yue; Wu, Jiaxiang; Huang, Longbo. Advances in Neural Information Processing Systems 32 (NIPS 2019), 2019.
  • [4] Communication-Efficient Distributed Optimization with Quantized Preconditioners. Alimisis, Foivos; Davies, Peter; Alistarh, Dan. International Conference on Machine Learning, Vol. 139, 2021.
  • [5] Gradient Sparsification for Communication-Efficient Distributed Optimization. Wangni, Jianqiao; Wang, Jialei; Liu, Ji; Zhang, Tong. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018.
  • [6] Harvesting Curvatures for Communication-Efficient Distributed Optimization. Cardoso, Diogo; Li, Boyue; Chi, Yuejie; Xavier, Joao. 2022 56th Asilomar Conference on Signals, Systems, and Computers, 2022, pp. 749-753.
  • [7] Manifold Identification for Ultimately Communication-Efficient Distributed Optimization. Li, Yu-Sheng; Chiang, Wei-Lin; Lee, Ching-pei. International Conference on Machine Learning, Vol. 119, 2020.
  • [8] Communication-Efficient Distributed Optimization with Adaptability to System Heterogeneity. Yu, Ziyi; Freris, Nikolaos M. 2023 62nd IEEE Conference on Decision and Control (CDC), 2023, pp. 3321-3326.
  • [9] Communication-Efficient Distributed Learning for Large Batch Optimization. Liu, Rui; Mozafari, Barzan. International Conference on Machine Learning, Vol. 162, 2022.
  • [10] Adaptive Bit Allocation for Communication-Efficient Distributed Optimization. Reisizadeh, Hadi; Touri, Behrouz; Mohajer, Soheil. 2021 60th IEEE Conference on Decision and Control (CDC), 2021, pp. 1994-2001.