General Convergence Analysis of Stochastic First-Order Methods for Composite Optimization

Cited by: 7
Authors: Necoara, Ion [1,2]
Affiliations:
[1] Univ Politehn Bucuresti, Dept Automat Control & Syst Engn, Bucharest 060042, Romania
[2] Romanian Acad, Inst Math Stat & Appl Math, Bucharest 050711, Romania
Keywords: Stochastic composite convex optimization; Stochastic bounded gradient; Quadratic functional growth; Stochastic first-order algorithms; Convergence rates
DOI: 10.1007/s10957-021-01821-2
Chinese Library Classification (CLC): C93 [Management Science]; O22 [Operations Research]
Subject classification codes: 070105; 12; 1201; 1202; 120202
Abstract
In this paper, we consider stochastic composite convex optimization problems with the objective function satisfying a stochastic bounded gradient condition, with or without a quadratic functional growth property. These models include the best-known classes of objective functions analyzed in the literature: nonsmooth Lipschitz functions and compositions of a (potentially) nonsmooth function and a smooth function, with or without strong convexity. Based on the flexibility offered by our optimization model, we consider several variants of stochastic first-order methods, such as the stochastic proximal gradient and the stochastic proximal point algorithms. Usually, the convergence theory for these methods has been derived for simple stochastic optimization models under restrictive assumptions, and the resulting rates are in general sublinear and hold only for specific decreasing stepsizes. Hence, we analyze the convergence rates of stochastic first-order methods with constant or variable stepsize under general assumptions covering a large class of objective functions. For a constant stepsize, we show that these methods can achieve a linear convergence rate up to a constant proportional to the stepsize and, under a strong stochastic bounded gradient condition, even pure linear convergence. Moreover, when a variable stepsize is chosen, we derive sublinear convergence rates for these stochastic first-order methods. Finally, the stochastic gradient mapping and the Moreau smoothing mapping introduced in the present paper lead to simple and intuitive proofs.
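For orientation, the two conditions named in the abstract are typically formalized as sketched below; the notation here is an assumption (composite objective F = f + h with optimal value F* attained on the solution set X*, stochastic (sub)gradient estimate g(x, ξ), constants L, B ≥ 0 and μ > 0), not a quotation of the paper's exact definitions.

```latex
% Stochastic bounded gradient (typical form): the second moment of the
% stochastic (sub)gradient is controlled by the functional residual plus
% a constant; B = 0 corresponds to the "strong" variant of the condition.
\[
  \mathbb{E}_{\xi}\!\left[\|g(x,\xi)\|^{2}\right]
    \;\le\; L\,\bigl(F(x)-F^{*}\bigr) + B
  \qquad \text{for all } x .
\]
% Quadratic functional growth (typical form): the objective grows at least
% quadratically with the distance to the solution set; this is weaker than
% strong convexity.
\[
  F(x)-F^{*} \;\ge\; \frac{\mu}{2}\,\operatorname{dist}(x,X^{*})^{2}
  \qquad \text{for all } x \text{ in the domain}.
\]
```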
Pages: 66 - 95
Number of pages: 30
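As a concrete illustration of one method covered by this analysis, here is a minimal sketch of a stochastic proximal gradient iteration with constant stepsize on a toy l1-regularized least-squares problem. The problem instance, function names, and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_prox_grad(A, b, lam, stepsize, n_iters, rng):
    """Stochastic proximal gradient for min_x (1/2n)||Ax - b||^2 + lam * ||x||_1.

    Each iteration samples one data point i uniformly, steps along the
    single-sample gradient (a_i^T x - b_i) a_i (an unbiased estimate of the
    smooth part's gradient), then applies the prox of the l1 term.
    """
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iters):
        i = rng.integers(n)                     # sample one data point uniformly
        g = (A[i] @ x - b[i]) * A[i]            # stochastic gradient of smooth part
        x = soft_threshold(x - stepsize * g, stepsize * lam)  # proximal step
    return x

# Toy usage: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
x_true = np.zeros(50)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_hat = stochastic_prox_grad(A, b, lam=0.1, stepsize=0.01, n_iters=5000, rng=rng)
print("support recovered:", np.nonzero(np.abs(x_hat) > 0.1)[0])
```

Consistent with the constant-stepsize regime described in the abstract, such an iteration is expected to contract linearly toward the solution set until it reaches a noise-dominated neighborhood whose radius scales with the stepsize; a decreasing stepsize removes that residual at the cost of a sublinear rate.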
Related papers (50 total)
  • [1] A General Framework for Decentralized Optimization With First-Order Methods
    Xin, Ran
    Pu, Shi
    Nedic, Angelia
    Khan, Usman A.
    [J]. PROCEEDINGS OF THE IEEE, 2020, 108 (11) : 1869 - 1889
  • [2] Fast First-Order Methods for Composite Convex Optimization with Backtracking
    Scheinberg, Katya
    Goldfarb, Donald
    Bai, Xi
    [J]. FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2014, 14 (03) : 389 - 417
  • [3] Convergence of First-Order Methods for Constrained Nonconvex Optimization with Dependent Data
    Alacaoglu, Ahmet
    Lyu, Hanbaek
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202 : 458 - 489
  • [4] Convergence Analysis of Primal Solutions in Dual First-Order Methods
    Lu, Jie
    Johansson, Mikael
    [J]. 2013 IEEE 52ND ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2013 : 6861 - 6867
  • [5] Stochastic First-Order Methods for Convex and Nonconvex Functional Constrained Optimization
    Boob, Digvijay
    Deng, Qi
    Lan, Guanghui
    [J]. MATHEMATICAL PROGRAMMING, 2023, 197 (01) : 215 - 279
  • [6] First-Order Methods for Convex Optimization
    Dvurechensky, Pavel
    Shtern, Shimrit
    Staudigl, Mathias
    [J]. EURO JOURNAL ON COMPUTATIONAL OPTIMIZATION, 2021, 9
  • [7] Convergence Analysis of Approximate Primal Solutions in Dual First-Order Methods
    Lu, Jie
    Johansson, Mikael
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2016, 26 (04) : 2430 - 2467