Combination of steepest descent and BFGS methods for nonconvex nonsmooth optimization

Cited by: 0
Author
Rohollah Yousefpour
Affiliation
[1] University of Mazandaran, Department of Mathematical Sciences
Source
Numerical Algorithms | 2016, Vol. 72
Keywords
Lipschitz functions; Wolfe conditions; Nonsmooth line search method; Nonsmooth BFGS method; 49J52; 90C26
DOI
Not available
Abstract
In this paper, a method is developed for solving nonsmooth nonconvex minimization problems. This method extends the classical BFGS framework. First, we generalize the Wolfe conditions for locally Lipschitz functions and prove that this generalization is well defined. Then, a line search algorithm is presented to find a step length satisfying the generalized Wolfe conditions. Next, the Goldstein ε-subgradient is approximated by an iterative method and a descent direction is computed using a positive definite matrix, which is updated by the BFGS method. Finally, a minimization algorithm based on the BFGS method is described. The algorithm is implemented in MATLAB, and numerical results are reported.
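The paper's implementation is in MATLAB and is not reproduced here. As a rough illustration of the general framework the abstract describes, the following is a minimal Python sketch of a BFGS iteration driven by a weak-Wolfe bisection line search (in the style of Lewis and Overton), applied to a simple nonsmooth test function. The function names, tolerances, and test problem are hypothetical; the sketch omits the paper's generalized Wolfe conditions and the iterative approximation of the Goldstein ε-subgradient, so it should be read as a sketch of the overall structure rather than the author's algorithm.

import numpy as np

def weak_wolfe_line_search(f, g, x, d, c1=1e-4, c2=0.9, max_iter=50):
    # Bisection search (Lewis-Overton style) for a step t satisfying
    #   f(x + t d) <= f(x) + c1 * t * g(x)'d   (sufficient decrease)
    #   g(x + t d)'d >= c2 * g(x)'d            (weak Wolfe curvature)
    fx = f(x)
    dg0 = g(x) @ d
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * dg0:   # step too long: shrink the bracket
            hi = t
        elif g(x + t * d) @ d < c2 * dg0:      # step too short: grow the bracket
            lo = t
        else:
            return t
        t = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
    return t

def bfgs_nonsmooth(f, g, x0, max_iter=200, tol=1e-10):
    # BFGS iteration using the line search above; H approximates the inverse Hessian.
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)
    for _ in range(max_iter):
        gx = g(x)
        d = -H @ gx                            # quasi-Newton descent direction
        t = weak_wolfe_line_search(f, g, x, d)
        s = t * d
        if np.linalg.norm(s) < tol:            # steps have collapsed; stop
            break
        y = g(x + s) - gx
        sy = s @ y
        if sy > 1e-12:                         # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        x = x + s
    return x

if __name__ == "__main__":
    # Hypothetical nonsmooth convex test: f(x) = |x_0| + 2 (x_1 - 1)^2, minimizer (0, 1).
    f = lambda x: abs(x[0]) + 2.0 * (x[1] - 1.0) ** 2
    # Subgradient oracle; at the kink x_0 = 0 any value in [-1, 1] is a valid subgradient.
    g = lambda x: np.array([np.sign(x[0]) if x[0] != 0.0 else 1.0,
                            4.0 * (x[1] - 1.0)])
    print(bfgs_nonsmooth(f, g, np.array([3.0, -2.0])))

The curvature safeguard (skipping the update when s'y is not sufficiently positive) is what keeps the plain BFGS recursion usable on nonsmooth functions; the paper replaces the gradient oracle used here with an approximation of the Goldstein ε-subgradient.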
Pages: 57 - 90
Page count: 33
Related papers
50 records
  • [21] Nonsmooth Steepest Descent Method by Proximal Subdifferentials in Hilbert Spaces
    Wei, Zhou
    He, Qing Hai
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2014, 161 (02) : 465 - 477
  • [22] STOCHASTIC BLOCK MIRROR DESCENT METHODS FOR NONSMOOTH AND STOCHASTIC OPTIMIZATION
    Dang, Cong D.
    Lan, Guanghui
    SIAM JOURNAL ON OPTIMIZATION, 2015, 25 (02) : 856 - 881
  • [23] OPTIMIZATION OF COMPLEX CHEMICAL PROCESSES - COMPARISON OF VARIATIONAL AND STEEPEST DESCENT METHODS
    KUO, MT
    RUBIN, DI
    WRIGHT, BS
    INDUSTRIAL & ENGINEERING CHEMISTRY PROCESS DESIGN AND DEVELOPMENT, 1966, 5 (04): : 404 - &
  • [24] Nonconvex and Nonsmooth Sparse Optimization via Adaptively Iterative Reweighted Methods
    Wang, Hao
    Zhang, Fan
    Shi, Yuanming
    Hu, Yaohua
    JOURNAL OF GLOBAL OPTIMIZATION, 2021, 81 (03) : 717 - 748
  • [25] Gradient-Free Methods for Deterministic and Stochastic Nonsmooth Nonconvex Optimization
    Lin, Tianyi
    Zheng, Zeyu
    Jordan, Michael I.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [26] A steepest descent method for vector optimization
    Drummond, LMG
    Svaiter, BF
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2005, 175 (02) : 395 - 414
  • [27] Proximal Stochastic Methods for Nonsmooth Nonconvex Finite-Sum Optimization
    Reddi, Sashank J.
    Sra, Suvrit
    Poczos, Barnabas
    Smola, Alexander J.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [28] Nonconvex and Nonsmooth Sparse Optimization via Adaptively Iterative Reweighted Methods
    Wang, Hao
    Zhang, Fan
    Shi, Yuanming
    Hu, Yaohua
    JOURNAL OF GLOBAL OPTIMIZATION, 2021, 81 (03) : 717 - 748
  • [29] A BFGS-SQP method for nonsmooth, nonconvex, constrained optimization and its evaluation using relative minimization profiles
    Curtis, Frank E.
    Mitchell, Tim
    Overton, Michael L.
    OPTIMIZATION METHODS & SOFTWARE, 2017, 32 (01): : 148 - 181
  • [30] Accelerated Primal-Dual Gradient Descent with Linesearch for Convex, Nonconvex, and Nonsmooth Optimization Problems
    Guminov, S. V.
    Nesterov, Yu. E.
    Dvurechensky, P. E.
    Gasnikov, A. V.
    DOKLADY MATHEMATICS, 2019, 99 (02) : 125 - 128