A globally and superlinearly convergent algorithm for nonsmooth convex minimization

Cited by: 71
Authors
Fukushima, M [1 ]
Qi, LQ [1 ]
Affiliations
[1] University of New South Wales, School of Mathematics, Sydney, NSW 2052, Australia
Keywords
nonsmooth convex optimization; Moreau-Yosida regularization; global convergence; superlinear convergence; semismoothness;
DOI
10.1137/S1052623494278839
CLC number
O29 [Applied Mathematics]
Discipline code
070104
Abstract
It is well known that a possibly nondifferentiable convex minimization problem can be transformed into a differentiable convex minimization problem by way of the Moreau-Yosida regularization. This paper presents a globally convergent algorithm that is designed to solve the latter problem. Under additional semismoothness and regularity assumptions, the proposed algorithm is shown to have a Q-superlinear rate of convergence.
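For reference, the regularization mentioned in the abstract can be stated as follows (standard notation, not necessarily that used in the paper): for a proper lower semicontinuous convex function f and a parameter \lambda > 0,

F_\lambda(x) = \min_y \left\{ f(y) + \tfrac{1}{2\lambda} \|y - x\|^2 \right\},

which is convex, finite everywhere, and continuously differentiable with gradient \nabla F_\lambda(x) = \frac{1}{\lambda}\bigl(x - p_\lambda(x)\bigr), where p_\lambda(x) denotes the unique minimizer above (the proximal point of x). Since F_\lambda and f have the same minimizers, minimizing the smooth function F_\lambda solves the original nonsmooth problem.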
Pages: 1106-1120
Number of pages: 15