Performance of noisy Nesterov's accelerated method for strongly convex optimization problems

Cited: 8
Authors
Mohammadi, Hesameddin [1 ]
Razaviyayn, Meisam [2 ]
Jovanovic, Mihailo R. [1 ]
Affiliations
[1] Univ Southern Calif, Dept Elect & Comp Engn, Los Angeles, CA 90089 USA
[2] Univ Southern Calif, Dept Ind & Syst Engn, Los Angeles, CA 90089 USA
Funding
U.S. National Science Foundation
Keywords
Accelerated first-order algorithms; control for optimization; convex optimization; integral quadratic constraints; linear matrix inequalities; Nesterov's method; noise amplification; second-order moments; semidefinite programming; gradient; algorithms
DOI
10.23919/acc.2019.8814680
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
We study the performance of noisy gradient descent and Nesterov's accelerated method for strongly convex objective functions with Lipschitz continuous gradients. The steady-state second-order moment of the error in the iterates is analyzed when the gradient is perturbed by additive white noise with zero mean and identity covariance. For any given condition number κ, we derive explicit upper bounds on noise amplification that depend only on κ and the problem size. We use quadratic objective functions to derive lower bounds and to demonstrate that the upper bounds are tight up to a constant factor. The established upper bound for Nesterov's accelerated method is larger than the upper bound for gradient descent by a factor of √κ. This gap identifies a fundamental tradeoff that comes with acceleration in the presence of stochastic uncertainties in the gradient evaluation.
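The setup described in the abstract can be illustrated numerically. Below is a minimal simulation sketch (not the authors' code): gradient descent and Nesterov's accelerated method are run on a strongly convex quadratic whose gradient is corrupted by zero-mean white noise with identity covariance, and the steady-state second-order moment of the error is estimated by time-averaging after a burn-in period. The parameter choices α = 1/L and β = (√κ − 1)/(√κ + 1) are the conventional ones and are an assumption here, as are the iteration counts.

import numpy as np

rng = np.random.default_rng(0)

n = 20                              # problem size
m, L = 1.0, 100.0                   # strong convexity / smoothness constants
kappa = L / m                       # condition number
A = np.diag(np.linspace(m, L, n))   # f(x) = 0.5 x' A x, minimizer x* = 0

def noisy_grad(x):
    # true gradient A x plus additive white noise (zero mean, identity covariance)
    return A @ x + rng.standard_normal(n)

def steady_state_gd(alpha, iters=200_000, burn_in=50_000):
    # estimate the steady-state second-order moment E||x_k - x*||^2 for GD
    x, acc = np.zeros(n), 0.0
    for k in range(iters):
        x = x - alpha * noisy_grad(x)
        if k >= burn_in:
            acc += x @ x            # x* = 0, so the error is x itself
    return acc / (iters - burn_in)

def steady_state_nesterov(alpha, beta, iters=200_000, burn_in=50_000):
    # same estimate for Nesterov's method with momentum parameter beta
    x_prev, x, acc = np.zeros(n), np.zeros(n), 0.0
    for k in range(iters):
        y = x + beta * (x - x_prev)
        x_prev, x = x, y - alpha * noisy_grad(y)
        if k >= burn_in:
            acc += x @ x
    return acc / (iters - burn_in)

alpha = 1.0 / L
beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)
J_gd = steady_state_gd(alpha)
J_na = steady_state_nesterov(alpha, beta)
print(f"kappa = {kappa:.0f}")
print(f"E||x - x*||^2  GD: {J_gd:.3f}  Nesterov: {J_na:.3f}  ratio: {J_na / J_gd:.2f}")

On such quadratic examples, the estimated Nesterov-to-GD ratio grows with κ, consistent with the √κ gap between the upper bounds stated in the abstract.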
Pages: 3426-3431 (6 pages)