A descent subgradient method using Mifflin's line search for nonsmooth nonconvex optimization

Cited by: 1
Authors
Maleknia, Morteza [1 ,2 ]
Soleimani-Damaneh, Majid [1 ]
Affiliations
[1] Univ Tehran, Coll Sci, Sch Math Stat & Comp Sci, ICOL Ind & Computat Optimizat Lab, Tehran, Iran
[2] Isfahan Univ Technol, Dept Math Sci, Esfahan, Iran
Funding
U.S. National Science Foundation;
Keywords
Nonlinear optimization; nonsmooth optimization; nonconvex programming; subgradient; gradient sampling algorithm; variable-metric method; bundle methods; convergence
DOI
10.1080/02331934.2024.2322152
Chinese Library Classification
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
We propose a descent subgradient algorithm for minimizing a function $f:\mathbb{R}^n \to \mathbb{R}$, assumed to be locally Lipschitz, but not necessarily smooth or convex. To find an effective descent direction, the Goldstein $\varepsilon$-subdifferential is approximated through an iterative process. The method employs a new two-point variant of Mifflin's line search in which the subgradients are arbitrary, which makes the line search procedure easy to implement. Moreover, in comparison to bundle methods, the quadratic subproblems have a simple structure, and the proposed method requires no algorithmic modification to handle nonconvexity. We study the global convergence of the method and prove that any accumulation point of the generated sequence is Clarke stationary, assuming that the objective $f$ is weakly upper semismooth. We illustrate the efficiency and effectiveness of the proposed algorithm on a collection of academic and semi-academic test problems.
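The abstract describes the method only at a high level. As a rough illustration of the kind of step it refers to, the sketch below approximates the Goldstein $\varepsilon$-subdifferential by sampling subgradients in an $\varepsilon$-ball around the current point and takes the minimum-norm element of their convex hull as the search direction. This is a generic gradient-sampling-style sketch, not the authors' algorithm: the oracle `subgrad`, the sample size `m`, the radius `eps`, and the use of SciPy's SLSQP to solve the quadratic subproblem are all illustrative assumptions, and the paper's two-point Mifflin line search is omitted.

```python
# Minimal sketch (assumptions labelled above): approximate the Goldstein
# eps-subdifferential by sampled subgradients and return the negative of
# the minimum-norm element of their convex hull as a candidate direction.
import numpy as np
from scipy.optimize import minimize

def descent_direction(subgrad, x, eps=1e-2, m=10, rng=None):
    rng = np.random.default_rng(rng)
    n = x.size
    # Sample m points uniformly in the closed eps-ball around x.
    u = rng.normal(size=(m, n))
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    radii = eps * rng.uniform(size=(m, 1)) ** (1.0 / n)
    pts = np.vstack([x[None, :], x + radii * u])
    G = np.array([subgrad(p) for p in pts])        # (m+1, n) subgradients
    # Quadratic subproblem: min 0.5*||G^T w||^2  s.t.  w >= 0, sum w = 1.
    k = G.shape[0]
    Q = G @ G.T
    res = minimize(lambda w: 0.5 * w @ Q @ w,
                   np.full(k, 1.0 / k),
                   jac=lambda w: Q @ w,
                   bounds=[(0.0, None)] * k,
                   constraints=({'type': 'eq',
                                 'fun': lambda w: w.sum() - 1.0},),
                   method='SLSQP')
    g = G.T @ res.x   # shortest convex combination of sampled subgradients
    return -g         # small ||g|| signals approximate Clarke stationarity

# Example on the nonsmooth f(x) = |x[0]| + x[1]**2 with an arbitrary
# subgradient oracle (hypothetical, for illustration only):
sg = lambda x: np.array([np.sign(x[0]), 2.0 * x[1]])
d = descent_direction(sg, np.array([1.0, 1.0]))
```

In a full method of this type, the returned direction would then be passed to a line search (here, the paper's Mifflin-type procedure) to pick a step size, and the sampling radius `eps` would be driven to zero as iterates approach stationarity.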
Pages: 27