STOCHASTIC FIRST- AND ZEROTH-ORDER METHODS FOR NONCONVEX STOCHASTIC PROGRAMMING

Citations: 721
Authors
Ghadimi, Saeed [1 ]
Lan, Guanghui [1 ]
Affiliation
[1] Univ Florida, Dept Ind & Syst Engn, Gainesville, FL 32611 USA
Funding
U.S. National Science Foundation (NSF)
Keywords
stochastic approximation; nonconvex optimization; stochastic programming; simulation-based optimization; worst-case complexity; approximation algorithms; composite optimization; gradient
DOI
10.1137/120880811
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Discipline classification code
070104
Abstract
In this paper, we introduce a new stochastic approximation type algorithm, namely, the randomized stochastic gradient (RSG) method, for solving an important class of nonlinear (possibly nonconvex) stochastic programming problems. We establish the complexity of this method for computing an approximate stationary point of a nonlinear programming problem. We also show that this method possesses a nearly optimal rate of convergence if the problem is convex. We discuss a variant of the algorithm which consists of applying a post-optimization phase to evaluate a short list of solutions generated by several independent runs of the RSG method, and we show that such a modification allows us to significantly improve the large-deviation properties of the algorithm. These methods are then specialized for solving a class of simulation-based optimization problems in which only stochastic zeroth-order information is available.
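A minimal Python sketch of the ideas summarized in the abstract, assuming placeholder oracles stochastic_grad (an unbiased stochastic gradient) and stochastic_value (a noisy function value); the stepsize and sampling choices below are simplified relative to those analyzed in the paper.

import numpy as np

def rsg(x1, stochastic_grad, gamma, N, rng=None):
    """Randomized stochastic gradient (RSG) sketch: run N stochastic gradient
    steps with stepsizes gamma[k] and return an iterate picked at random from
    the trajectory. (The paper draws the output index R with probabilities
    proportional to 2*gamma_k - L*gamma_k^2; for a constant stepsize this is
    uniform, which is what is used here.)"""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x1, dtype=float)
    iterates = []
    for k in range(N):
        g = stochastic_grad(x)          # unbiased estimate of grad f(x)
        x = x - gamma[k] * g
        iterates.append(x.copy())
    R = rng.integers(N)                 # random output index
    return iterates[R]

def zeroth_order_grad(x, stochastic_value, nu=1e-4, rng=None):
    """Gaussian-smoothing two-point gradient estimate for the zeroth-order
    (simulation-based) setting: (F(x + nu*u) - F(x)) / nu * u with u ~ N(0, I).
    Ideally both evaluations use the same noise realization of the simulator."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(np.asarray(x).shape)
    return (stochastic_value(x + nu * u) - stochastic_value(x)) / nu * u

def two_phase_rsg(x1, stochastic_grad, gamma, N, num_runs=5, val_samples=100, rng=None):
    """Post-optimization phase sketch: run RSG several times independently and
    keep the candidate whose averaged stochastic gradient has the smallest
    norm, the mechanism behind the improved large-deviation properties."""
    rng = np.random.default_rng() if rng is None else rng
    candidates = [rsg(x1, stochastic_grad, gamma, N, rng) for _ in range(num_runs)]
    def est_grad_norm(x):
        g = np.mean([stochastic_grad(x) for _ in range(val_samples)], axis=0)
        return np.linalg.norm(g)
    return min(candidates, key=est_grad_norm)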
Pages: 2341-2368
Page count: 28