Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping

Cited by: 52
Author
Iiduka, Hideaki [1 ]
Affiliation
[1] Meiji Univ, Dept Comp Sci, Tama Ku, Kawasaki, Kanagawa 2148571, Japan
Funding
Japan Society for the Promotion of Science;
Keywords
Convex optimization; Fixed point set; Nonexpansive mapping; Conjugate gradient method; Three-term conjugate gradient method; Fixed point optimization algorithm; CONJUGATE-GRADIENT METHOD; GLOBAL CONVERGENCE; ALGORITHMS; SEQUENCE;
DOI
10.1007/s10107-013-0741-1
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline Classification Code
081202; 0835;
Abstract
The existing algorithms for solving the convex minimization problem over the fixed point set of a nonexpansive mapping on a Hilbert space are based on algorithmic methods, such as the steepest descent method and conjugate gradient methods, for finding a minimizer of the objective function over the whole space, and they emphasize decreasing the objective function as quickly as possible. Meanwhile, it is of practical importance to devise algorithms that converge to the fixed point set quickly, because the fixed point set expresses the constraint conditions that must be satisfied in the problem. This paper proposes an algorithm that not only minimizes the objective function quickly but also converges to the fixed point set much faster than the existing algorithms, and proves that the algorithm with diminishing step-size sequences strongly converges to the solution of the convex minimization problem. We also analyze the proposed algorithm with each of the Fletcher-Reeves, Polak-Ribière-Polyak, Hestenes-Stiefel, and Dai-Yuan formulas used in conventional conjugate gradient methods, and show that these variants may fail to converge to the solution of the convex minimization problem. We numerically compare the proposed algorithm with the existing algorithms and demonstrate its effectiveness and fast convergence.
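The abstract describes iterations that combine a nonexpansive mapping step with a conjugate-gradient-like search direction and diminishing step sizes. The Python sketch below is a minimal illustration of that general structure only, not the paper's exact algorithm: the mapping T is assumed to be the metric projection onto a closed ball (a nonexpansive mapping whose fixed point set is the ball), the objective is a convex quadratic, and the direction weight beta and step-size rule lam are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's method): minimize a convex
# quadratic f over Fix(T), where T is the metric projection onto the unit ball.
# Iteration: x_{n+1} = T(x_n + lam_n * d_n) with a conjugate-gradient-like
# direction d_n and a diminishing step size lam_n.
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([4.0, 1.0])

def grad_f(x):
    """Gradient of f(x) = 0.5 x^T A x - b^T x (convex, Lipschitz gradient)."""
    return A @ x - b

def T(x, radius=1.0):
    """Metric projection onto the closed ball: nonexpansive, Fix(T) = the ball."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else radius * x / norm

x = np.array([5.0, -5.0])                # initial point, possibly outside Fix(T)
d = -grad_f(x)                           # initial search direction
for n in range(1, 2001):
    lam = 1.0 / n                        # diminishing step-size sequence
    x_new = T(x + lam * d)               # map the trial point back toward Fix(T)
    beta = 1.0 / (n + 1) ** 2            # damped direction weight (assumption)
    d = -grad_f(x_new) + beta * d        # conjugate-gradient-like direction
    x = x_new

print("approximate minimizer over the ball:", x)
```

In this toy problem the unconstrained minimizer lies outside the unit ball, so the constraint is active and the iterates must settle on the boundary of Fix(T); the damped beta makes the scheme behave like a projected gradient method with diminishing steps, which is enough to observe the feasibility-versus-descent trade-off the abstract discusses.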
Pages: 131-165
Number of pages: 35