First-order methods of smooth convex optimization with inexact oracle

Cited by: 261
Authors
Devolder, Olivier [1 ]
Glineur, Francois [1 ]
Nesterov, Yurii [1 ]
Affiliations
[1] Catholic Univ Louvain, ICTEAM Inst CORE, B-1348 Louvain, Belgium
Keywords
Smooth convex optimization; First-order methods; Inexact oracle; Gradient methods; Fast gradient methods; Complexity bounds; PROXIMAL BUNDLE METHOD;
DOI
10.1007/s10107-013-0677-5
Chinese Library Classification (CLC)
TP31 [Computer Software];
Subject classification codes
081202 ; 0835 ;
Abstract
We introduce the notion of inexact first-order oracle and analyze the behavior of several first-order methods of smooth convex optimization used with such an oracle. This notion of inexact oracle naturally appears in the context of smoothing techniques, Moreau-Yosida regularization, Augmented Lagrangians and many other situations. We derive complexity estimates for primal, dual and fast gradient methods, and study in particular their dependence on the accuracy of the oracle and the desired accuracy of the objective function. We observe that the superiority of fast gradient methods over the classical ones is no longer absolute when an inexact oracle is used. We prove that, contrary to simple gradient schemes, fast gradient methods must necessarily suffer from error accumulation. Finally, we show that the notion of inexact oracle allows the application of first-order methods of smooth convex optimization to solve non-smooth or weakly smooth convex problems.
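The behavior described in the abstract can be illustrated with a minimal sketch: a classical (primal) gradient method run with a perturbed first-order oracle. This is an illustration only, not the paper's exact construction; the least-squares objective, the bounded-noise perturbation model, and all function names here are assumptions chosen for the example.

```python
import numpy as np

# Illustrative sketch: gradient descent on f(x) = 0.5 * ||A x - b||^2,
# where the oracle returns the true gradient plus a bounded error.
# This mimics (loosely) querying an inexact first-order oracle.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
L = np.linalg.norm(A.T @ A, 2)  # Lipschitz constant of the gradient

def f(x):
    r = A @ x - b
    return 0.5 * r @ r

def inexact_grad(x, err=1e-3):
    """Hypothetical inexact oracle: true gradient + perturbation of norm err."""
    g = A.T @ (A @ x - b)
    noise = rng.standard_normal(g.shape)
    return g + err * noise / np.linalg.norm(noise)

x = np.zeros(5)
for _ in range(500):
    x = x - (1.0 / L) * inexact_grad(x)  # classical gradient step 1/L

# Compare against the exact minimizer; the achieved objective gap stalls
# at a noise floor determined by the oracle error rather than reaching zero.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
gap = f(x) - f(x_star)
print(f"objective gap after 500 steps: {gap:.2e}")
```

In this toy setting the simple gradient scheme converges down to an error floor set by the oracle inaccuracy; the paper's point is that fast (accelerated) gradient methods, by contrast, necessarily accumulate such oracle errors.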
Pages: 37-75 (39 pages)
Related Papers
50 records
  • [21] Accelerated First-order Methods for Geodesically Convex Optimization on Riemannian Manifolds
    Liu, Yuanyuan
    Shang, Fanhua
    Cheng, James
    Cheng, Hong
    Jiao, Licheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [22] Stochastic first-order methods for convex and nonconvex functional constrained optimization
    Boob, Digvijay
    Deng, Qi
    Lan, Guanghui
    MATHEMATICAL PROGRAMMING, 2023, 197 (01) : 215 - 279
  • [23] From differential equation solvers to accelerated first-order methods for convex optimization
    Luo, Hao
    Chen, Long
    MATHEMATICAL PROGRAMMING, 2022, 195 (1-2) : 735 - 781
  • [24] Adaptive First-Order Methods Revisited: Convex Optimization without Lipschitz Requirements
    Antonakopoulos, Kimon
    Mertikopoulos, Panayotis
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [25] ACCELERATED FIRST-ORDER METHODS FOR CONVEX OPTIMIZATION WITH LOCALLY LIPSCHITZ CONTINUOUS GRADIENT
    Lu, Zhaosong
    Mei, Sanyou
    SIAM JOURNAL ON OPTIMIZATION, 2023, 33 (03) : 2275 - 2310
  • [27] FOM - a MATLAB toolbox of first-order methods for solving convex optimization problems
    Beck, Amir
    Guttmann-Beck, Nili
    OPTIMIZATION METHODS & SOFTWARE, 2019, 34 (01): : 172 - 193
  • [28] The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization
    Kovalev, Dmitry
    Gasnikov, Alexander
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022, 35
  • [29] Online First-Order Framework for Robust Convex Optimization
    Ho-Nguyen, Nam
    Kilinc-Karzan, Fatma
    OPERATIONS RESEARCH, 2018, 66 (06) : 1670 - 1692
  • [30] An adaptive accelerated first-order method for convex optimization
    Monteiro, Renato D. C.
    Ortiz, Camilo
    Svaiter, Benar F.
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2016, 64 : 31 - 73