Simple and Optimal Stochastic Gradient Methods for Nonsmooth Nonconvex Optimization

Cited by: 0
Authors
Li, Zhize [1 ]
Li, Jian [2 ]
Affiliations
[1] Carnegie Mellon Univ, Dept Elect & Comp Engn, Pittsburgh, PA 15213 USA
[2] Tsinghua Univ, Inst Interdisciplinary Informat Sci, Beijing 100084, Peoples R China
Keywords
nonconvex optimization; optimal algorithm; proximal gradient descent; variance reduction; local minimum; INEQUALITIES
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
We propose and analyze several stochastic gradient algorithms for finding stationary points or local minima in nonconvex finite-sum and online optimization problems, possibly with a nonsmooth regularizer. First, we propose a simple proximal stochastic gradient algorithm based on variance reduction, called ProxSVRG+. We provide a clean and tight analysis of ProxSVRG+, which shows that it outperforms deterministic proximal gradient descent (ProxGD) for a wide range of mini-batch sizes, thereby solving an open problem posed in Reddi et al. (2016b). Moreover, ProxSVRG+ uses far fewer proximal oracle calls than ProxSVRG (Reddi et al., 2016b) and extends to the online setting by avoiding full gradient computations. Then, we propose an optimal algorithm, called SSRGD, based on SARAH (Nguyen et al., 2017), and show that SSRGD further improves the gradient complexity of ProxSVRG+ and achieves the optimal upper bound, matching the known lower bound (Fang et al., 2018; Li et al., 2021). Moreover, we show that both ProxSVRG+ and SSRGD automatically adapt to local structure of the objective function, such as the Polyak-Lojasiewicz (PL) condition for nonconvex functions in the finite-sum case: we prove that both of them automatically switch to faster global linear convergence, without the restarts performed in prior work such as ProxSVRG (Reddi et al., 2016b). Finally, we focus on the more challenging problem of finding an (epsilon, delta)-local minimum instead of just an epsilon-approximate (first-order) stationary point (which may be a bad unstable saddle point). We show that SSRGD can find an (epsilon, delta)-local minimum by simply adding some random perturbations. The resulting algorithm is almost as simple as its counterpart for finding stationary points, and achieves similar optimal rates.
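The ProxSVRG+ update described in the abstract combines an SVRG-style variance-reduced gradient estimator with a proximal step that handles the nonsmooth regularizer. Below is a minimal Python/NumPy sketch of that pattern on a toy l1-regularized least-squares instance; the helper names (prox_svrg_plus, grad_batch, soft_threshold) and all hyperparameters are illustrative assumptions, not the paper's tuned choices, and the paper's analysis covers general nonconvex f_i.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def prox_svrg_plus(grad_batch, n, x0, eta, lam, b=32, m=50, epochs=20, seed=0):
    # Sketch of a ProxSVRG+-style loop for min_x (1/n) sum_i f_i(x) + lam*||x||_1.
    # grad_batch(idx, x) returns the gradient of (1/|idx|) sum_{i in idx} f_i at x.
    # Step size, minibatch size b, and epoch length m are assumed illustrative values.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(epochs):
        snap = x.copy()                            # snapshot point
        g_snap = grad_batch(np.arange(n), snap)    # full gradient at the snapshot
        for _ in range(m):
            idx = rng.integers(0, n, size=b)       # sample a minibatch
            # SVRG-style variance-reduced estimator:
            # v = grad_b(x) - grad_b(snapshot) + full_grad(snapshot)
            v = grad_batch(idx, x) - grad_batch(idx, snap) + g_snap
            # Proximal (soft-thresholding) step for the l1 regularizer.
            x = soft_threshold(x - eta * v, eta * lam)
    return x

# Toy finite-sum instance: f_i(x) = 0.5 * (a_i^T x - y_i)^2.
rng = np.random.default_rng(1)
n, d = 200, 20
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d) * (rng.random(d) < 0.3)   # sparse ground truth
y = A @ x_true + 0.01 * rng.normal(size=n)

def grad_batch(idx, x):
    Ai = A[idx]
    return Ai.T @ (Ai @ x - y[idx]) / len(idx)

x_hat = prox_svrg_plus(grad_batch, n, np.zeros(d), eta=0.01, lam=0.05)
print("recovered nonzeros:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```

In the online setting, the full snapshot gradient above would be replaced by a large-minibatch estimate, per the abstract. SSRGD instead builds on SARAH's recursive estimator, roughly v_t = (1/b) sum_{i in I_t} (grad f_i(x_t) - grad f_i(x_{t-1})) + v_{t-1}, and its perturbed variant adds occasional random noise to escape saddle points; neither refinement is shown in this sketch.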
Pages: 61
Related Papers
50 items in total
  • [1] Simple and Optimal Stochastic Gradient Methods for Nonsmooth Nonconvex Optimization
    Li, Zhize
    Li, Jian
    [J]. arXiv, 2022
  • [2] A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization
    Li, Zhize
    Li, Jian
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [3] Gradient-Free Methods for Deterministic and Stochastic Nonsmooth Nonconvex Optimization
    Lin, Tianyi
    Zheng, Zeyu
    Jordan, Michael I.
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022
  • [4] Stochastic generalized gradient method for nonconvex nonsmooth stochastic optimization
    Yu. M. Ermol'ev
    V. I. Norkin
    [J]. Cybernetics and Systems Analysis, 1998, 34 : 196 - 215
  • [5] Stochastic generalized gradient method for nonconvex nonsmooth stochastic optimization
    Ermol'ev, YM
    Norkin, VI
    [J]. CYBERNETICS AND SYSTEMS ANALYSIS, 1998, 34 (02) : 196 - 215
  • [6] Faster Gradient-Free Proximal Stochastic Methods for Nonconvex Nonsmooth Optimization
    Huang, Feihu
    Gu, Bin
    Huo, Zhouyuan
    Chen, Songcan
    Huang, Heng
    [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 1503 - 1510
  • [7] VARIABLE METRIC PROXIMAL STOCHASTIC VARIANCE REDUCED GRADIENT METHODS FOR NONCONVEX NONSMOOTH OPTIMIZATION
    Yu, Tengteng
    Liu, Xin-wei
    Dai, Yu-hong
    Sun, J. I. E.
    [J]. JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION, 2022, 18 (04) : 2611 - 2631
  • [8] Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
    Horvath, Samuel
    Lei, Lihua
    Richtarik, Peter
    Jordan, Michael I.
    [J]. SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2022, 4 (02): : 634 - 648
  • [9] Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization
    Chen, Lesi
    Xu, Jing
    Luo, Luo
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
  • [10] Stochastic Generalized Gradient Methods for Training Nonconvex Nonsmooth Neural Networks
    Norkin, V. I.
    [J]. CYBERNETICS AND SYSTEMS ANALYSIS, 2021, 57 (05) : 714 - 729