Gradient-Free Methods for Deterministic and Stochastic Nonsmooth Nonconvex Optimization

Cited by: 0
Authors
Lin, Tianyi [1 ]
Zheng, Zeyu [1 ]
Jordan, Michael I. [1 ]
Affiliations
[1] Univ Calif Berkeley, Berkeley, CA 94720 USA
Keywords
CONVEX-OPTIMIZATION; SUBGRADIENT METHODS; SAMPLING ALGORITHM; ZEROTH-ORDER; CONVERGENCE; COMPOSITE;
DOI
None available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Nonsmooth nonconvex optimization problems arise broadly in machine learning and business decision making, yet two core challenges impede the development of efficient solution methods with finite-time convergence guarantees: the lack of a computationally tractable optimality criterion and the lack of computationally powerful oracles. The contributions of this paper are twofold. First, we establish the relationship between the celebrated Goldstein subdifferential [46] and uniform smoothing, thereby providing the basis and intuition for the design of gradient-free methods that guarantee finite-time convergence to a set of Goldstein stationary points. Second, we propose the gradient-free method (GFM) and its stochastic variant (SGFM) for solving a class of nonsmooth nonconvex optimization problems, and prove that both return a (δ, ε)-Goldstein stationary point of a Lipschitz function f at an expected convergence rate of O(d^{3/2} δ^{-1} ε^{-4}), where d is the problem dimension. Two-phase versions of GFM and SGFM are also proposed and proven to achieve improved large-deviation results. Finally, we demonstrate the effectiveness of 2-SGFM on training ReLU neural networks with the MNIST dataset.
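The abstract's link between uniform smoothing and gradient-free updates can be illustrated with a standard two-point zeroth-order gradient estimator. The sketch below is illustrative only and is not the authors' implementation: the objective `f`, the step size, and the smoothing radius `delta` are all toy choices, and the estimator g = (d / 2δ)(f(x + δu) − f(x − δu))u is the classical unbiased estimate of the gradient of the uniformly smoothed surrogate f_δ(x) = E_u[f(x + δu)].

```python
import math
import random

random.seed(0)

def f(x):
    # Toy nonsmooth objective (illustrative only): f(x) = |x_0| + |x_1|.
    return abs(x[0]) + abs(x[1])

def random_unit_vector(d):
    # Sample u uniformly from the unit sphere in R^d via normalized Gaussians.
    v = [random.gauss(0.0, 1.0) for _ in range(d)]
    n = math.sqrt(sum(vi * vi for vi in v))
    return [vi / n for vi in v]

def gfm_step(x, delta, eta):
    # Two-point estimator of the gradient of the uniformly smoothed
    # surrogate f_delta(x) = E_u[f(x + delta*u)]:
    #   g = (d / (2*delta)) * (f(x + delta*u) - f(x - delta*u)) * u
    d = len(x)
    u = random_unit_vector(d)
    xp = [xi + delta * ui for xi, ui in zip(x, u)]
    xm = [xi - delta * ui for xi, ui in zip(x, u)]
    scale = d / (2.0 * delta) * (f(xp) - f(xm))
    # Descend along the gradient estimate; no gradient of f is ever queried.
    return [xi - eta * scale * ui for xi, ui in zip(x, u)]

x = [1.0, -2.0]
for _ in range(2000):
    x = gfm_step(x, delta=0.05, eta=0.01)
print(f(x))  # f decreases toward 0 as the iterates approach the minimizer
```

Only function values of f are queried, which is the point of the zeroth-order oracle model; the smoothing radius δ plays the same role as the radius in the (δ, ε)-Goldstein stationarity notion discussed in the abstract.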
Pages: 16
Related Papers
50 records in total
  • [1] Faster Gradient-Free Proximal Stochastic Methods for Nonconvex Nonsmooth Optimization
    Huang, Feihu
    Gu, Bin
    Huo, Zhouyuan
    Chen, Songcan
    Huang, Heng
    [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 1503 - 1510
  • [2] Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization
    Chen, Lesi
    Xu, Jing
    Luo, Luo
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
  • [3] Gradient-Free Multi-Agent Nonconvex Nonsmooth Optimization
    Hajinezhad, Davood
    Zavlanos, Michael M.
    [J]. 2018 IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2018, : 4939 - 4944
  • [4] Simple and Optimal Stochastic Gradient Methods for Nonsmooth Nonconvex Optimization
    Li, Zhize
    Li, Jian
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [5] Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex
    Gasnikov, A. V.
    Lagunovskaya, A. A.
    Usmanova, I. N.
    Fedorenko, F. A.
    [J]. AUTOMATION AND REMOTE CONTROL, 2016, 77 (11) : 2018 - 2034
  • [6] Parallel sequential Monte Carlo for stochastic gradient-free nonconvex optimization
    Akyildiz, Omer Deniz
    Crisan, Dan
    Miguez, Joaquin
    [J]. STATISTICS AND COMPUTING, 2020, 30 (06) : 1645 - 1663