A preconditioning proximal Newton method for nondifferentiable convex optimization

Authors
Liqun Qi
Xiaojun Chen
Affiliation
[1] University of New South Wales, School of Mathematics
Source
Mathematical Programming | 1997, Vol. 76
Keywords
Nondifferentiable convex optimization; Proximal point; Superlinear convergence; Newton's method
DOI
Not available
Abstract
We propose a proximal Newton method for solving nondifferentiable convex optimization problems. The method combines the generalized Newton method with Rockafellar's proximal point algorithm. At each step, the proximal point is computed only approximately, and the regularization matrix is preconditioned to overcome the inexactness of this approximation. We show that such preconditioning is possible within a certain accuracy, and that the second-order differentiability properties of the Moreau-Yosida regularization are invariant under this preconditioning. Based on these results, superlinear convergence is established under a semismoothness condition.
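The proximal point iteration underlying the method can be illustrated with a minimal sketch, assuming a one-dimensional nonsmooth convex function f(x) = |x| whose proximal operator has the well-known closed form (soft thresholding). The function names `prox_abs` and `proximal_point` and the parameter `lam` are illustrative choices, not from the paper; the paper's method additionally applies a generalized Newton step to the Moreau-Yosida regularization and preconditions the regularization matrix, which this sketch omits.

```python
# Sketch of Rockafellar's proximal point algorithm on f(x) = |x|.
# Here the proximal point is computed exactly; the paper studies the
# inexact case and preconditions the regularization to compensate.

import math

def prox_abs(v, lam):
    """Proximal operator of f(x) = |x|: the soft-thresholding map.

    prox_{lam f}(v) = argmin_x ( |x| + (1 / (2 * lam)) * (x - v)**2 ).
    """
    return math.copysign(max(abs(v) - lam, 0.0), v)

def proximal_point(x0, lam=1.0, iters=10):
    """Proximal point iteration x_{k+1} = prox_{lam f}(x_k)."""
    x = x0
    for _ in range(iters):
        x = prox_abs(x, lam)
    return x

print(proximal_point(5.0))  # iterates 5, 4, 3, 2, 1, 0, ... -> minimizer 0.0
```

Each iterate moves a fixed amount `lam` toward the minimizer of |x| and then stays there, which is the linear (here, finite) convergence typical of the plain proximal point method; the paper's Newton acceleration is what yields superlinear convergence.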
Pages: 411-429 (18 pages)