Weak subgradient method for solving nonsmooth nonconvex optimization problems

Cited by: 6
Authors
Yalcin, Gulcin Dinc [1 ]
Kasimbeyli, Refail [1 ]
Affiliations
[1] Eskisehir Tech Univ, Dept Ind Engn, Fac Engn, Eskisehir, Turkey
Keywords
Subgradient; weak subgradient; nonconvex optimization; nonsmooth optimization; nonlinear optimization; solution method; GRADIENT SAMPLING ALGORITHM; MEMORY BUNDLE METHOD; CONVEX-OPTIMIZATION; RADIAL EPIDERIVATIVES; PROJECTION METHODS; CONVERGENCE; APPROXIMATE; EFFICIENCY; STEP;
DOI
10.1080/02331934.2020.1745205
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Discipline Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
This paper presents a weak-subgradient-based method for solving nonconvex optimization problems. At every iteration, the method uses a weak subgradient of the objective function at the current point to generate the next one. The weak subgradient concept rests on supporting cones to the graph of the function under consideration, which replace, in a certain sense, the supporting hyperplanes underlying the subgradient notion of convex analysis. For this reason, the method developed in this paper requires no convexity assumption on either the objective function or the set of feasible solutions. The new method resembles the subgradient methods of convex analysis and can be regarded as a generalization of them. The paper investigates different stepsize parameters and provides convergence theorems for all cases. A significant practical difficulty of subgradient methods is estimating a subgradient at every iteration; this paper therefore also presents a method for estimating weak subgradients. The new method is tested on well-known test problems from the literature, and the computational results are reported and interpreted.
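For context, the weak subgradient used here is standardly defined in this literature as a pair rather than a single vector; the following restatement is reconstructed from that literature, not quoted from the paper. A pair (v, c) with c >= 0 is a weak subgradient of f at x̄ on a set S when

\[
f(x) \;\ge\; f(\bar{x}) + \langle v,\, x - \bar{x} \rangle - c\,\lVert x - \bar{x} \rVert
\quad \text{for all } x \in S .
\]

The term -c‖x - x̄‖ bends the affine minorant of convex analysis into a supporting cone, which is why no convexity of f or S is required.

The sketch below illustrates the overall iteration pattern the abstract describes: take a weak-subgradient-type direction at the current point, step along it with a diminishing stepsize, and keep the best iterate found. Everything in it is an assumption for illustration: the function names, the finite-difference estimator (the paper presents its own weak-subgradient estimation method), and the stepsize rule s_k = a/(k+1) (the paper analyses several stepsize parameters).

import numpy as np

def finite_difference_direction(f, x, h=1e-6):
    # Hypothetical stand-in for the paper's weak-subgradient estimation:
    # a central finite-difference estimate of a (sub)gradient, component-wise.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def weak_subgradient_method(f, x0, max_iter=500, a=1.0, tol=1e-12):
    # Generic subgradient-style loop: x_{k+1} = x_k - s_k * v_k / ||v_k||.
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(max_iter):
        v = finite_difference_direction(f, x)
        norm_v = np.linalg.norm(v)
        if norm_v < tol:                      # (near-)stationary point
            break
        s = a / (k + 1)                       # diminishing stepsize (an assumed rule)
        x = x - s * v / norm_v
        fx = f(x)
        if fx < best_f:                       # subgradient steps need not descend,
            best_x, best_f = x.copy(), fx     # so record the best iterate seen
    return best_x, best_f

if __name__ == "__main__":
    # Nonsmooth, nonconvex toy objective (illustrative only).
    f = lambda x: abs(x[0]) + min((x[1] - 1.0) ** 2, (x[1] + 1.0) ** 2)
    x_best, f_best = weak_subgradient_method(f, [2.0, 0.3])
    print(x_best, f_best)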
Pages: 1513-1553
Page count: 41