Performance of noisy Nesterov's accelerated method for strongly convex optimization problems

Cited: 8
Authors
Mohammadi, Hesameddin [1 ]
Razaviyayn, Meisam [2 ]
Jovanovic, Mihailo R. [1 ]
Affiliations
[1] Univ Southern Calif, Dept Elect & Comp Engn, Los Angeles, CA 90089 USA
[2] Univ Southern Calif, Dept Ind & Syst Engn, Los Angeles, CA 90089 USA
Funding
U.S. National Science Foundation
Keywords
Accelerated first-order algorithms; control for optimization; convex optimization; integral quadratic constraints; linear matrix inequalities; Nesterov's method; noise amplification; second-order moments; semidefinite programming; GRADIENT; ALGORITHMS;
DOI
10.23919/acc.2019.8814680
Chinese Library Classification
TP [Automation and computer technology]
Discipline Code
0812
Abstract
We study the performance of noisy gradient descent and Nesterov's accelerated method for strongly convex objective functions with Lipschitz continuous gradients. The steady-state second-order moment of the error in the iterates is analyzed when the gradient is perturbed by additive white noise with zero mean and identity covariance. For any given condition number κ, we derive explicit upper bounds on noise amplification that depend only on κ and the problem size. We use quadratic objective functions to derive lower bounds and to demonstrate that the upper bounds are tight up to a constant factor. The established upper bound for Nesterov's accelerated method is larger than the upper bound for gradient descent by a factor of √κ. This gap identifies a fundamental tradeoff that comes with acceleration in the presence of stochastic uncertainties in the gradient evaluation.
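The tradeoff described in the abstract can be illustrated numerically. The sketch below is not the authors' code; it is a minimal simulation, assuming standard textbook parameter choices (step size 1/L for gradient descent, and momentum (√κ−1)/(√κ+1) for Nesterov's method), that estimates the steady-state second-order moment of the error for both methods on a strongly convex quadratic with identity-covariance gradient noise.

```python
import numpy as np

# Hypothetical illustration (not the paper's code): compare steady-state
# error variance of noisy gradient descent and noisy Nesterov acceleration
# on the quadratic f(x) = 0.5 x'Qx with condition number kappa.
rng = np.random.default_rng(0)

n = 50
kappa = 100.0                       # condition number L/m
eigs = np.linspace(1.0, kappa, n)   # spectrum of Q: m = 1, L = kappa
Q = np.diag(eigs)
L, m = eigs.max(), eigs.min()

def grad(x):
    return Q @ x

def noisy_grad(x):
    # gradient perturbed by zero-mean, identity-covariance white noise
    return grad(x) + rng.standard_normal(n)

iters, burn = 20000, 5000

# gradient descent with the standard step size 1/L
x = np.zeros(n)
gd_sq = []
for t in range(iters):
    x = x - (1.0 / L) * noisy_grad(x)
    if t >= burn:
        gd_sq.append(x @ x)           # squared error ||x - x*||^2, x* = 0

# Nesterov's accelerated method with standard parameters
alpha = 1.0 / L
beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)
x = x_prev = np.zeros(n)
nag_sq = []
for t in range(iters):
    y = x + beta * (x - x_prev)       # momentum (extrapolation) step
    x_prev, x = x, y - alpha * noisy_grad(y)
    if t >= burn:
        nag_sq.append(x @ x)

J_gd = np.mean(gd_sq)    # empirical steady-state second-order moment, GD
J_nag = np.mean(nag_sq)  # same quantity for Nesterov's method
print(f"GD: {J_gd:.3f}  Nesterov: {J_nag:.3f}  ratio: {J_nag / J_gd:.1f}")
```

Consistent with the √κ gap established in the paper, the accelerated method exhibits a noticeably larger steady-state variance than gradient descent at this condition number, even though it converges faster in the noise-free transient.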
Pages: 3426-3431 (6 pages)
Related Papers (50 records)
  • [1] On the transient growth of Nesterov's accelerated method for strongly convex optimization problems
    Samuelson, Samantha
    Mohammadi, Hesameddin
    Jovanovic, Mihailo R.
2020 59TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2020: 5911-5916
  • [2] Nesterov's Method for Convex Optimization
    Walkington, Noel J.
SIAM REVIEW, 2023, 65 (02): 539-562
  • [3] Nesterov Accelerated Shuffling Gradient Method for Convex Optimization
    Tran, Trang H.
    Scheinberg, Katya
    Nguyen, Lam M.
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [4] Performance of noisy higher-order accelerated gradient flow dynamics for strongly convex quadratic optimization problems
    Samuelson, Samantha
    Mohammadi, Hesameddin
    Jovanovic, Mihailo R.
2023 AMERICAN CONTROL CONFERENCE (ACC), 2023: 3839-3844
  • [5] Performance of noisy three-step accelerated first-order optimization algorithms for strongly convex quadratic problems
    Samuelson, Samantha
    Mohammadi, Hesameddin
    Jovanovic, Mihailo R.
2023 62ND IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2023: 1300-1305
  • [6] A Stochastic Nesterov's Smoothing Accelerated Method for General Nonsmooth Constrained Stochastic Composite Convex Optimization
    Wang, Ruyu
    Zhang, Chao
    Wang, Lichun
    Shao, Yuanhai
    JOURNAL OF SCIENTIFIC COMPUTING, 2022, 93 (02)
  • [7] Uniting Nesterov's Accelerated Gradient Descent and the Heavy Ball Method for Strongly Convex Functions with Exponential Convergence Rate
    Hustig-Schultz, Dawn M.
    Sanfelice, Ricardo G.
    2021 AMERICAN CONTROL CONFERENCE (ACC), 2021: 959-964
  • [8] Accelerated Distributed Nesterov Gradient Descent for Smooth and Strongly Convex Functions
    Qu, Guannan
    Li, Na
    2016 54TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2016: 209-216
  • [9] Nesterov's accelerated gradient method for nonlinear ill-posed problems with a locally convex residual functional
    Hubmer, Simon
    Ramlau, Ronny
    INVERSE PROBLEMS, 2018, 34 (09)