On the projected subgradient method for nonsmooth convex optimization in a Hilbert space

Cited by: 128
Authors
Alber, YI
Iusem, AN
Solodov, MV
Affiliations
[1] Inst Matemat Pura & Aplicada, Jardim Bot, BR-22460320 Rio De Janeiro, Brazil
[2] Technion Israel Inst Technol, Dept Math, IL-32000 Haifa, Israel
Keywords
convex optimization; nonsmooth optimization; projected gradient method; steepest descent method; weak convergence; convergence rate
DOI
10.1007/BF01584842
Chinese Library Classification
TP31 [Computer Software]
Subject Classification
081202; 0835
Abstract
We consider a method for constrained convex optimization in a Hilbert space, consisting of a step in the direction opposite to an $\varepsilon_k$-subgradient of the objective at the current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes $\alpha_k$ are exogenously given, satisfying $\sum_{k=0}^{\infty} \alpha_k = \infty$ and $\sum_{k=0}^{\infty} \alpha_k^2 < \infty$, and $\varepsilon_k$ is chosen so that $\varepsilon_k \le \mu \alpha_k$ for some $\mu > 0$. We prove that the sequence generated in this way converges weakly to a minimizer if the problem has solutions, and is unbounded otherwise. Among the features of our convergence analysis, we mention that it covers the nonsmooth case, in the sense that we make no assumption of differentiability of $f$, let alone of Lipschitz continuity of its gradient. Also, we prove weak convergence of the whole sequence, rather than just boundedness of the sequence and optimality of its weak accumulation points, thus improving on all previously known convergence results. We also present convergence rate results. (C) 1998 The Mathematical Programming Society, Inc. Published by Elsevier Science B.V.
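The abstract describes the iteration concretely enough to sketch. Below is a minimal, illustrative Python implementation of a projected subgradient step of the form $x_{k+1} = P_C(x_k - \alpha_k g_k / \|g_k\|)$, using exact subgradients (i.e., $\varepsilon_k = 0$, a simplifying assumption; the paper allows inexact $\varepsilon_k$-subgradients) and a guessed normalization of the search direction. The names `projected_subgradient`, `subgrad`, and `project`, and the unit-ball example, are ours and not from the paper.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, alphas, tol=1e-12):
    """Sketch of a projected subgradient iteration: step against a
    (normalized) subgradient, then project onto the feasible set.
    Assumes exact subgradients (epsilon_k = 0); the paper's method
    tolerates epsilon_k-subgradients with epsilon_k <= mu * alpha_k."""
    x = np.asarray(x0, dtype=float)
    for alpha in alphas:
        g = subgrad(x)
        norm = np.linalg.norm(g)
        if norm <= tol:  # zero subgradient: x minimizes f globally
            return x
        x = project(x - alpha * g / norm)
    return x

# Illustrative problem: minimize f(x) = ||x - c||_1 over the Euclidean unit ball.
c = np.array([2.0, -1.5, 0.5])
subgrad = lambda x: np.sign(x - c)                   # a valid l1 subgradient
project = lambda y: y / max(1.0, np.linalg.norm(y))  # projection onto unit ball
alphas = (1.0 / (k + 1) for k in range(20000))       # sum = inf, sum of squares < inf
x_star = projected_subgradient(subgrad, project, np.zeros(3), alphas)
print(x_star)
```

The stepsize choice $\alpha_k = 1/(k+1)$ satisfies the divergence and square-summability conditions quoted in the abstract; in this finite-dimensional example weak and strong convergence coincide, whereas the paper's contribution concerns the general Hilbert-space setting.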
Pages: 23-35
Page count: 13