Harnessing Smoothness to Accelerate Distributed Optimization

Cited by: 402
Authors:
Qu, Guannan [1]
Li, Na [1]
Affiliation:
[1] Harvard Univ, Sch Engn & Appl Sci, Cambridge, MA 02138 USA
Keywords:
Distributed algorithms; multiagent systems; optimization methods; SUBGRADIENT METHODS; CONSENSUS
DOI:
10.1109/TCNS.2017.2698261
Chinese Library Classification:
TP [Automation technology; computer technology]
Discipline code:
0812
Abstract:
There has been a growing effort in studying the distributed optimization problem over a network. The objective is to optimize a global function formed by a sum of local functions, using only local computation and communication. The literature has developed consensus-based distributed (sub)gradient descent (DGD) methods and has shown that they have the same convergence rate O(log t / √t) as centralized (sub)gradient methods (CGD) when the function is convex but possibly nonsmooth. However, when the function is convex and smooth, it is unclear how to harness the smoothness within the DGD framework to obtain a convergence rate comparable to CGD's. In this paper, we propose a distributed algorithm that, despite using the same amount of communication per iteration as DGD, effectively harnesses the function's smoothness and converges to the optimum at a rate of O(1/t). If the objective function is further strongly convex, our algorithm converges at a linear rate. Both rates match the convergence rate of CGD. The key step in our algorithm is a novel gradient estimation scheme that uses history information to achieve a fast and accurate estimate of the average gradient. To motivate the necessity of history information, we also show that it is impossible for a class of distributed algorithms like DGD to achieve a linear convergence rate without using history information, even if the objective function is strongly convex and smooth.
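The gradient estimation scheme the abstract describes, which combines neighbor averaging with a correction term built from the history of local gradients, is commonly known as gradient tracking. Below is a minimal illustrative sketch of that update pattern; the function names, step size, mixing matrix, and toy problem are all assumptions for illustration, not taken from the paper:

```python
import numpy as np

def gradient_tracking(grads, W, x0, eta=0.1, iters=200):
    """Sketch of a gradient-tracking iteration (illustrative, not the paper's code).

    grads: list of per-agent gradient functions g_i(x_i)
    W:     (n, n) doubly stochastic mixing matrix of the network
    x0:    (n, d) initial iterates, one row per agent
    """
    x = np.array(x0, dtype=float)
    # y_i tracks the network-average gradient; initialized at the local gradient
    y = np.array([g(xi) for g, xi in zip(grads, x)])
    for _ in range(iters):
        x_new = W @ x - eta * y  # consensus averaging plus a descent step
        # refresh the tracker with the change in local gradients (the "history" term)
        y = (W @ y
             + np.array([g(xi) for g, xi in zip(grads, x_new)])
             - np.array([g(xi) for g, xi in zip(grads, x)]))
        x = x_new
    return x

# Toy example: three agents minimizing f_i(x) = (x - b_i)^2 / 2, so the
# global optimum of the sum is the mean of b = [1, 2, 3], i.e. 2.0.
b = [1.0, 2.0, 3.0]
grads = [lambda x, bi=bi: x - bi for bi in b]
W = np.full((3, 3), 1 / 3)  # complete-graph uniform averaging
x = gradient_tracking(grads, W, x0=np.zeros((3, 1)))
```

Because each f_i here is smooth and strongly convex, every agent's iterate settles on the global optimum 2.0, consistent with the linear rate the abstract claims for this setting.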
Pages: 1245-1260
Page count: 16
Related papers
50 records total
  • [31] Smoothness preserving layout for dynamic labels by hybrid optimization
    He, Yu; Zhao, Guo-Dong; Zhang, Song-Hai
    COMPUTATIONAL VISUAL MEDIA, 2022, 8 (01): 149-163
  • [32] Orthogonal test optimization of vehicle smoothness and road friendliness
  • [35] Computational Boundary Sampling to Accelerate IMRT Optimization
    Tiwari, P.; Xie, Y.; Chen, Y.; Apte, A.; Deasy, J.
    MEDICAL PHYSICS, 2012, 39 (06): 3848-3849
  • [36] Distributed lag estimator derived from Shiller smoothness priors - extension
    Ullah, A.; Raj, B.
    ECONOMICS LETTERS, 1979, 2 (03): 219-223
  • [37] An accelerated distributed method with inexact model of relative smoothness and strong convexity
    Zhang, Xuexue; Liu, Sanyang; Zhao, Nannan
    IET SIGNAL PROCESSING, 2023, 17 (04)
  • [38] Accelerate Convergence Rate of Distributed Consensus Algorithm with Optimized Topology
    Peng, Huanxin; Wang, Wenkai; Liu, Bin
    SENSORS, MECHATRONICS AND AUTOMATION, 2014, 511-512: 950-953
  • [39] Distributed ledgers in transfusion medicine: an opportunity for standards to accelerate innovation
    Gniadek, Thomas J.; Ball, Peter A.
    TRANSFUSION, 2018, 58 (06): 1567
  • [40] Co-designing the Topology/Algorithm to Accelerate Distributed Training
    Hou, Xiang; Xu, Rui; Ma, Sheng; Wang, Qiong; Jiang, Wei; Lu, Hongyi
    19TH IEEE INTERNATIONAL SYMPOSIUM ON PARALLEL AND DISTRIBUTED PROCESSING WITH APPLICATIONS (ISPA/BDCLOUD/SOCIALCOM/SUSTAINCOM 2021), 2021: 1010-1018