Distributed Non-Convex First-Order Optimization and Information Processing: Lower Complexity Bounds and Rate Optimal Algorithms

Cited by: 37
Authors
Sun, Haoran [1 ]
Hong, Mingyi [1 ]
Affiliations
[1] Univ Minnesota, Dept Elect & Comp Engn, Minneapolis, MN 55414 USA
Funding
U.S. National Science Foundation;
Keywords
Non-convex distributed optimization; Optimal convergence rate; Lower complexity bound; CONVERGENCE; SMOOTH;
DOI
10.1109/TSP.2019.2943230
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline codes
0808; 0809;
Abstract
We consider a class of popular distributed non-convex optimization problems, in which agents connected by a network G collectively optimize a sum of smooth (possibly non-convex) local objective functions. We address the following question: if the agents can only access the gradients of local functions, what are the fastest rates that any distributed algorithm can achieve, and how can those rates be achieved? First, we show that there exist difficult problem instances such that it takes a class of distributed first-order methods at least O(1/√ξ(G) × L̄/ε) communication rounds to achieve a certain ε-solution, where ξ(G) denotes the spectral gap of the graph Laplacian matrix and L̄ is some Lipschitz constant. Second, we propose (near) optimal methods whose rates match the developed lower rate bound (up to a polylog factor). The key in the algorithm design is to properly embed classical polynomial filtering techniques into modern first-order algorithms. To the best of our knowledge, this is the first time that lower rate bounds and optimal methods have been developed for distributed non-convex optimization problems.
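The polynomial-filtering idea mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' exact algorithm: it only shows how a degree-K Chebyshev polynomial of a gossip (mixing) matrix W damps non-consensus eigen-components at a rate governed by the square root of the spectral gap, which is the source of the 1/√ξ(G) factor. The function name `chebyshev_consensus` is illustrative, and computing the full spectrum of W is an assumption made for simplicity (in practice only bounds on the extreme eigenvalues are needed).

```python
import numpy as np

def chebyshev_consensus(W, x, K):
    """Apply a degree-K Chebyshev polynomial filter of the mixing matrix W
    to the vector x, normalized so the filter equals 1 at eigenvalue 1:
    the consensus (average) component is preserved exactly while every
    other eigen-component is damped near-optimally."""
    lams = np.sort(np.linalg.eigvalsh(W))
    lam_min, lam_2 = lams[0], lams[-2]       # spectrum excluding the consensus eigenvalue 1
    a = 2.0 / (lam_2 - lam_min)              # affine map sending [lam_min, lam_2] to [-1, 1]
    b = (lam_2 + lam_min) / (lam_2 - lam_min)
    omega = lambda v: a * (W @ v) - b * v    # omega(W) applied to a vector
    w1 = a * 1.0 - b                         # omega evaluated at the consensus eigenvalue 1
    # Three-term Chebyshev recurrence: vectors T_k(omega(W)) x and scalars T_k(w1).
    v_prev, v = x, omega(x)
    c_prev, c = 1.0, w1
    for _ in range(K - 1):
        v_prev, v = v, 2 * omega(v) - v_prev
        c_prev, c = c, 2 * w1 * c - c_prev
    return v / c                             # filter value at eigenvalue 1 is exactly 1
```

On a path graph, for example, plain gossip contracts the consensus error per round by roughly the spectral gap ξ, so K rounds give about (1 − ξ)^K, whereas the Chebyshev-filtered iterate behaves like (1 − √ξ)^K, matching the 1/√ξ(G) dependence in the lower bound.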
Pages: 5912 - 5928
Page count: 17
Related papers
50 records in total
  • [1] Distributed Non-Convex First-Order Optimization and Information Processing: Lower Complexity Bounds and Rate Optimal Algorithms
    Sun, Haoran
    Hong, Mingyi
    [J]. 2018 CONFERENCE RECORD OF 52ND ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2018, : 38 - 42
  • [2] Towards Optimal Communication Complexity in Distributed Non-Convex Optimization
    Patel, Kumar Kshitij
    Wang, Lingxiao
    Woodworth, Blake
    Bullins, Brian
    Srebro, Nati
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [3] An Accelerated First-Order Method for Non-convex Optimization on Manifolds
    Criscitiello, Christopher
    Boumal, Nicolas
    [J]. FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2023, 23 (04) : 1433 - 1509
  • [4] Leveraging Non-uniformity in First-order Non-convex Optimization
    Mei, Jincheng
    Gao, Yue
    Dai, Bo
    Szepesvari, Csaba
    Schuurmans, Dale
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [5] Lower bounds for non-convex stochastic optimization
    Arjevani, Yossi
    Carmon, Yair
    Duchi, John C.
    Foster, Dylan J.
    Srebro, Nathan
    Woodworth, Blake
    [J]. MATHEMATICAL PROGRAMMING, 2023, 199 (1-2) : 165 - 214
  • [6] Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds
    Zhou, Pan
    Yuan, Xiao-Tong
    Yan, Shuicheng
    Feng, Jiashi
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (02) : 459 - 472
  • [7] Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds
    Zhou, Pan
    Yuan, Xiao-Tong
    Feng, Jiashi
    [J]. 22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89 : 138 - 147
  • [8] Relaxation in non-convex optimal control problems described by first-order evolution equations
    Tolstonogov, AA
    [J]. SBORNIK MATHEMATICS, 1999, 190 (11-12) : 1689 - 1714