Multi-Step Gradient Methods for Networked Optimization

Cited: 66
Authors
Ghadimi, Euhanna [1 ]
Shames, Iman [2 ]
Johansson, Mikael [1 ]
Affiliations
[1] Royal Inst Technol, ACCESS Linnaeus Ctr, S-10044 Stockholm, Sweden
[2] Univ Melbourne, Dept Elect & Elect Engn, Melbourne, Vic 3010, Australia
Funding
Swedish Research Council
Keywords
Distributed optimization; accelerated gradient methods; primal and dual decomposition; fast convergence; robustness analysis; RESOURCE-ALLOCATION;
DOI
10.1109/TSP.2013.2278149
Chinese Library Classification
TM (Electrical Engineering); TN (Electronics & Communication Technology)
Discipline Codes
0808; 0809
Abstract
We develop multi-step gradient methods for network-constrained optimization of strongly convex functions with Lipschitz-continuous gradients. Given the topology of the underlying network and bounds on the Hessian of the objective function, we determine the algorithm parameters that guarantee the fastest convergence and characterize situations when significant speed-ups over the standard gradient method are obtained. Furthermore, we quantify how uncertainty in problem data at design-time affects the run-time performance of the gradient method and its multi-step counterpart, and conclude that in most cases the multi-step method outperforms gradient descent. Finally, we apply the proposed technique to three engineering problems: resource allocation under a network-wide budget constraint, distributed averaging, and Internet congestion control. In all cases, our proposed algorithms converge significantly faster than the state of the art.
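The multi-step gradient iteration the abstract refers to belongs to the heavy-ball family: each update combines a gradient step with a momentum term from the previous iterate. As a minimal sketch (a generic unconstrained quadratic example with the classical heavy-ball parameters, not the paper's networked variant or its tuning for network topologies), the speed-up over plain gradient descent can be seen as follows:

```python
import numpy as np

# Strongly convex quadratic f(x) = 0.5 * x^T A x - b^T x
# with mu = 1 (smallest eigenvalue) and L = 10 (largest eigenvalue).
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)  # exact minimizer, for measuring error

mu, L = 1.0, 10.0
# Classical heavy-ball parameters for a quadratic objective:
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

def grad(x):
    return A @ x - b

# Multi-step method: x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
x_prev = np.zeros(2)
x = np.zeros(2)
for _ in range(50):
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x

# Plain gradient descent with the standard step size 2 / (L + mu):
y = np.zeros(2)
for _ in range(50):
    y = y - (2.0 / (L + mu)) * grad(y)

err_hb = np.linalg.norm(x - x_star)  # heavy-ball error
err_gd = np.linalg.norm(y - x_star)  # gradient-descent error
```

After the same number of iterations the multi-step (heavy-ball) iterate is several orders of magnitude closer to the minimizer than plain gradient descent, reflecting the improved dependence on the condition number L/mu.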
Pages: 5417-5429 (13 pages)