Inexact proximal stochastic gradient method for convex composite optimization

Cited by: 8
Authors
Wang, Xiao [1 ]
Wang, Shuxiong [2 ]
Zhang, Hongchao [3 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Math Sci, 19A Yuquan Rd, Beijing 100049, Peoples R China
[2] Chinese Acad Sci, Inst Computat Math & Sci Engn Comp, Acad Math & Syst Sci, Beijing 100190, Peoples R China
[3] Louisiana State Univ, Dept Math, 220 Lockett Hall, Baton Rouge, LA 70803 USA
Funding
US National Science Foundation; National Natural Science Foundation of China
Keywords
Convex composite optimization; Empirical risk minimization; Stochastic gradient; Inexact methods; Global convergence; Complexity bound; Approximation algorithms; Thresholding algorithm
DOI
10.1007/s10589-017-9932-7
Chinese Library Classification
C93 [Management]; O22 [Operations Research]
Discipline classification codes
070105; 12; 1201; 1202; 120202
Abstract
We study an inexact proximal stochastic gradient (IPSG) method for convex composite optimization, whose objective function is the sum of an average of a large number of smooth convex functions and a convex, but possibly nonsmooth, function. Variance reduction techniques are incorporated in the method to reduce the variance of the stochastic gradients. The main feature of the IPSG algorithm is that it allows the proximal subproblems to be solved inexactly while still guaranteeing global convergence with desirable complexity bounds. Different subproblem stopping criteria are proposed. Global convergence and component gradient complexity bounds are derived both when the objective function is strongly convex and when it is only generally convex. Preliminary numerical experiments show the overall efficiency of the IPSG algorithm.
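To make the setting concrete, the sketch below shows one plausible instantiation of the ideas in the abstract: an SVRG-style variance-reduced stochastic gradient combined with an inexactly solved proximal subproblem for min_x (1/n) Σ_i f_i(x) + h(x), where each f_i is smooth and convex and h is convex but possibly nonsmooth. This is not the authors' IPSG method; the helper names inexact_prox and ipsg_sketch are hypothetical, and the inner subgradient solver, the stopping rule inner_tol = tol0/(s+1)^2, and the step-size choices are illustrative assumptions rather than the subproblem stopping criteria analyzed in the paper.

```python
# Minimal sketch (not the authors' IPSG algorithm) of an inexact proximal
# stochastic gradient iteration with SVRG-style variance reduction for
#     min_x  F(x) = (1/n) * sum_i f_i(x) + h(x),
# where each f_i is smooth and convex and h is convex but possibly nonsmooth.
import numpy as np


def inexact_prox(grad_est, x, alpha, h_subgrad, inner_tol, max_inner=100):
    """Approximately solve the proximal subproblem
        min_u  <grad_est, u - x> + ||u - x||^2 / (2*alpha) + h(u)
    by subgradient descent, stopping once successive iterates move less than
    inner_tol (a stand-in for the paper's subproblem stopping criteria)."""
    u = x.copy()
    for t in range(1, max_inner + 1):
        step = alpha / t  # diminishing inner step size (illustrative choice)
        g = grad_est + (u - x) / alpha + h_subgrad(u)
        u_new = u - step * g
        if np.linalg.norm(u_new - u) <= inner_tol:
            return u_new
        u = u_new
    return u


def ipsg_sketch(grad_f, h_subgrad, x0, n, alpha=0.1, epochs=10, m=None, tol0=1e-2):
    """SVRG-style outer loop: a full gradient is computed once per epoch and
    used to reduce the variance of the per-sample stochastic gradients."""
    m = m or n
    x = x0.copy()
    for s in range(epochs):
        x_ref = x.copy()
        full_grad = np.mean([grad_f(i, x_ref) for i in range(n)], axis=0)
        inner_tol = tol0 / (s + 1) ** 2  # tighten subproblem accuracy over time
        for _ in range(m):
            i = np.random.randint(n)
            # variance-reduced gradient estimate
            g = grad_f(i, x) - grad_f(i, x_ref) + full_grad
            x = inexact_prox(g, x, alpha, h_subgrad, inner_tol)
    return x
```

As a usage illustration, for an l1-regularized least-squares problem one could pass grad_f = lambda i, x: a[i] * (a[i] @ x - b[i]) and h_subgrad = lambda x: lam * np.sign(x); the point of solving the subproblem only approximately is that inexact_prox needs just a few inner iterations when h admits no cheap closed-form proximal operator.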
Pages: 579-618
Number of pages: 40
Related papers (50 in total)
  • [1] Inexact proximal stochastic gradient method for convex composite optimization
    Wang, Xiao
    Wang, Shuxiong
    Zhang, Hongchao
    [J]. Computational Optimization and Applications, 2017, 68: 579-618
  • [2] Distributed and Inexact Proximal Gradient Method for Online Convex Optimization
    Bastianello, Nicola
    Dall'Anese, Emiliano
    [J]. 2021 European Control Conference (ECC), 2021: 2432-2437
  • [3] Decentralized Inexact Proximal Gradient Method With Network-Independent Stepsizes for Convex Composite Optimization
    Guo, Luyao
    Shi, Xinli
    Cao, Jinde
    Wang, Zihao
    [J]. IEEE Transactions on Signal Processing, 2023, 71: 786-801
  • [4] A Note on the (Accelerated) Proximal Gradient Method for Composite Convex Optimization
    Li, Qingjing
    Tan, Li
    Guo, Ke
    [J]. Journal of Nonlinear and Convex Analysis, 2022, 23(12): 2847-2857
  • [5] An Asynchronous Distributed Proximal Gradient Method for Composite Convex Optimization
    Aybat, N. S.
    Wang, Z.
    Iyengar, G.
    [J]. International Conference on Machine Learning, 2015, 37: 2454-2462
  • [6] Inexact Online Proximal-gradient Method for Time-varying Convex Optimization
    Ajalloeian, Amirhossein
    Simonetto, Andrea
    Dall'Anese, Emiliano
    [J]. 2020 American Control Conference (ACC), 2020: 2850-2857
  • [7] Inexact proximal ε-subgradient methods for composite convex optimization problems
    Millan, R. Diaz
    Machado, M. Penton
    [J]. Journal of Global Optimization, 2019, 75(4): 1029-1060
  • [8] Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle
    Dvurechensky, Pavel
    Gasnikov, Alexander
    [J]. Journal of Optimization Theory and Applications, 2016, 171: 121-145
  • [9] Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex
    Gasnikov, A. V.
    Lagunovskaya, A. A.
    Usmanova, I. N.
    Fedorenko, F. A.
    [J]. Automation and Remote Control, 2016, 77(11): 2018-2034