Generalized damped Newton algorithms in nonsmooth optimization via second-order subdifferentials

Cited by: 0
Authors
Pham Duy Khanh
Boris S. Mordukhovich
Vo Thanh Phat
Dat Ba Tran
Affiliations
[1] HCMC University of Education,Department of Mathematics
[2] Wayne State University,Department of Mathematics
Keywords
Variational analysis and nonsmooth optimization; Damped Newton methods; Global convergence; Tilt stability of minimizers; Superlinear convergence; Lasso problems
MSC: Primary 49J52, 49J53; Secondary 90C30, 90C53
DOI
Not available
Abstract
The paper proposes and develops new globally convergent algorithms of the generalized damped Newton type for solving important classes of nonsmooth optimization problems. These algorithms are based on the theory and computation of second-order subdifferentials of nonsmooth functions, employing the machinery of second-order variational analysis and generalized differentiation. First we develop a globally superlinearly convergent damped Newton-type algorithm for the class of continuously differentiable functions with Lipschitzian gradients, which are nonsmooth of second order. Then we design such a globally convergent algorithm for a structured class of nonsmooth quadratic composite problems with extended-real-valued cost functions, which typically arise in machine learning and statistics. Finally, we present the results of numerical experiments and compare the performance of our main algorithm, applied to an important class of Lasso problems, with that achieved by other first-order and second-order optimization algorithms.
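To fix ideas, the classical damped Newton scheme underlying this line of work combines a Newton direction with a backtracking line search that supplies the "damping" needed for global convergence. The sketch below is a minimal illustration on a smooth test function, not the authors' generalized method: it uses an ordinary Hessian, whereas the paper replaces it with elements of the second-order subdifferential for functions that are only C^1 with Lipschitzian gradients; all function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, tol=1e-9, sigma=1e-4, beta=0.5, max_iter=100):
    """Classical damped Newton method with Armijo backtracking.

    Illustrative sketch only: `hess` is a true Hessian here, while the
    paper's generalized algorithm substitutes second-order subdifferential
    elements for nonsmooth-of-second-order objectives.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:          # stationarity test
            break
        d = np.linalg.solve(hess(x), -g)      # Newton direction
        t = 1.0
        # Backtracking (the "damping"): shrink the step until the
        # Armijo sufficient-decrease condition holds.
        while f(x + t * d) > f(x) + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Smooth test problem: minimize f(x) = x1^4 + x2^2, minimizer at the origin.
f = lambda x: x[0]**4 + x[1]**2
grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])
x_star = damped_newton(f, grad, hess, np.array([1.0, 1.0]))
```

On this example the x2-component is solved in a single Newton step, while the degenerate quartic in x1 shows why damping and careful second-order information matter near points where the Hessian loses definiteness.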
Pages: 93-122
Number of pages: 29