Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization

Cited by: 222
Authors:
Ghadimi, Saeed [1 ]
Lan, Guanghui [1 ]
Zhang, Hongchao [2 ]
Affiliations:
[1] Univ Florida, Dept Ind & Syst Engn, Gainesville, FL 32611 USA
[2] Louisiana State Univ, Dept Math, Baton Rouge, LA 70803 USA
Funding:
National Science Foundation (USA)
Keywords:
Constrained stochastic programming; Mini-batch of samples; Stochastic approximation; Nonconvex optimization; Stochastic programming; First-order method; Zeroth-order method; CONVEX; ALGORITHMS; GRADIENT; DESCENT;
DOI:
10.1007/s10107-014-0846-1
CLC classification:
TP31 [Computer software]
Discipline codes:
081202; 0835
Abstract:
This paper considers a class of constrained stochastic composite optimization problems whose objective function is the sum of a differentiable (possibly nonconvex) component and a non-differentiable (but convex) component. To solve these problems, we propose a randomized stochastic projected gradient (RSPG) algorithm, in which a proper mini-batch of samples is taken at each iteration, with the batch size depending on the total budget of stochastic samples allowed. The RSPG algorithm also employs a general distance function, which allows it to exploit the geometry of the feasible region. The complexity of this algorithm is established in a unified setting, which shows that it achieves nearly optimal complexity for convex stochastic programming. A post-optimization phase is also proposed to significantly reduce the variance of the solutions returned by the algorithm. In addition, based on the RSPG algorithm, a stochastic gradient-free algorithm, which uses only stochastic zeroth-order information, is also discussed. Some preliminary numerical results are provided.
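The core iteration described in the abstract can be sketched as follows. This is a minimal, illustrative rendering of the RSPG idea only: it uses the Euclidean distance (the paper allows a general prox-function), a constant stepsize and batch size (the paper derives these from the sample budget), and it omits the post-optimization phase. The function names and the toy problem are invented for illustration.

```python
import numpy as np

def rspg(stoch_grad, prox, x0, gamma, batch_size, n_iters, rng=None):
    """Simplified RSPG sketch: at each iteration, average a mini-batch of
    stochastic gradients of the smooth (possibly nonconvex) component, take
    a proximal step that handles the convex non-differentiable component,
    and finally return one iterate chosen uniformly at random."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    iterates = []
    for _ in range(n_iters):
        # mini-batch averaging reduces the variance of the gradient estimate
        g = np.mean([stoch_grad(x, rng) for _ in range(batch_size)], axis=0)
        # prox step: Euclidean special case of the general distance function
        x = prox(x - gamma * g, gamma)
        iterates.append(x.copy())
    # randomized output rule: return a uniformly chosen iterate
    return iterates[rng.integers(n_iters)]

# Toy instance (illustrative): f(x) = E[||x - xi||^2 / 2] with noisy
# targets xi, and h(x) = lam * ||x||_1 handled by soft-thresholding.
lam, target = 0.1, np.array([1.0, -2.0, 0.0])
sg = lambda x, rng: (x - target) + 0.1 * rng.standard_normal(x.shape)
soft = lambda z, g: np.sign(z) * np.maximum(np.abs(z) - g * lam, 0.0)
x_hat = rspg(sg, soft, np.zeros(3), gamma=0.5, batch_size=8, n_iters=200)
```

Because the output is a randomly selected iterate, the paper's post-optimization phase runs several independent trials and picks the best candidate to reduce the variance of the returned solution; that phase is not shown here.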
Pages: 267-305
Page count: 39
Related papers
50 items in total
  • [1] Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization
    Saeed Ghadimi
    Guanghui Lan
    Hongchao Zhang
    [J]. Mathematical Programming, 2016, 155 : 267 - 305
  • [2] Efficient Mini-batch Training for Stochastic Optimization
    Li, Mu
    Zhang, Tong
    Chen, Yuqiang
    Smola, Alexander J.
    [J]. PROCEEDINGS OF THE 20TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING (KDD'14), 2014, : 661 - 670
  • [3] Mini-batch stochastic subgradient for functional constrained optimization
    Singh, Nitesh Kumar
    Necoara, Ion
    Kungurtsev, Vyacheslav
    [J]. OPTIMIZATION, 2023,
  • [4] An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization
    Feyzmahdavian, Hamid Reza
    Aytekin, Arda
    Johansson, Mikael
    [J]. IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2016, 61 (12) : 3740 - 3754
  • [5] An Asynchronous Mini-batch Algorithm for Regularized Stochastic Optimization
    Feyzmahdavian, Hamid Reza
    Aytekin, Arda
    Johansson, Mikael
    [J]. 2015 54TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2015, : 1384 - 1389
  • [6] Properties of the stochastic approximation EM algorithm with mini-batch sampling
    Kuhn, Estelle
    Matias, Catherine
    Rebafka, Tabea
    [J]. STATISTICS AND COMPUTING, 2020, 30 (06) : 1725 - 1739
  • [7] Properties of the stochastic approximation EM algorithm with mini-batch sampling
    Estelle Kuhn
    Catherine Matias
    Tabea Rebafka
    [J]. Statistics and Computing, 2020, 30 : 1725 - 1739
  • [8] A Framework of Convergence Analysis of Mini-batch Stochastic Projected Gradient Methods
    Jian Gu
    Xian-Tao Xiao
    [J]. Journal of the Operations Research Society of China, 2023, 11 : 347 - 369
  • [9] A Framework of Convergence Analysis of Mini-batch Stochastic Projected Gradient Methods
    Gu, Jian
    Xiao, Xian-Tao
    [J]. JOURNAL OF THE OPERATIONS RESEARCH SOCIETY OF CHINA, 2023, 11 (02) : 347 - 369
  • [10] Mini-Batch Stochastic Three-Operator Splitting for Distributed Optimization
    Franci, Barbara
    Staudigl, Mathias
    [J]. IEEE CONTROL SYSTEMS LETTERS, 2022, 6 : 2882 - 2887