A convergent decomposition method for box-constrained optimization problems

Authors
Andrea Cassioli
Marco Sciandrone
Institution
[1] Università degli Studi di Firenze, Dipartimento di Sistemi e Informatica
Source
Optimization Letters | 2009 / Volume 3
Keywords
Decomposition methods; Gauss–Southwell method; Global convergence;
DOI
Not available
Abstract
In this work we consider the problem of minimizing a continuously differentiable function over a feasible set defined by box constraints. We present a decomposition method based on the solution of a sequence of subproblems. In particular, we state conditions on the rule for selecting the subproblem variables that are sufficient to ensure global convergence of the generated sequence without convexity assumptions. The conditions require the selection of suitable variables (related to the violation of the optimality conditions) to guarantee the theoretical convergence properties, while leaving the freedom to select any other group of variables in order to accelerate convergence.
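To make the idea in the abstract concrete, the following is a minimal, illustrative Python sketch of a Gauss–Southwell-type decomposition scheme for box-constrained minimization. It is not the authors' exact method: the violation measure, the block size, and the subproblem solver (a projected-gradient step with Armijo backtracking on the working set) are assumptions chosen for simplicity. The key ingredient it mirrors is that the working set always contains the variable that most violates the optimality conditions, while the remaining working variables can be chosen freely.

```python
import numpy as np

def optimality_violation(x, grad, lo, hi):
    """Per-coordinate violation of the box-constrained stationarity
    conditions, measured as |x_i - proj_[lo_i, hi_i](x_i - grad_i)|."""
    return np.abs(x - np.clip(x - grad, lo, hi))

def gauss_southwell_decomposition(f, grad_f, x0, lo, hi,
                                  block_size=2, max_iter=500, tol=1e-6):
    """Illustrative decomposition loop for min f(x) s.t. lo <= x <= hi.

    Each iteration builds a working set containing the most-violating
    variable plus (block_size - 1) freely chosen extras (here: the next
    largest violations), then approximately solves the subproblem on that
    block by a projected-gradient step with Armijo backtracking.
    """
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(max_iter):
        g = grad_f(x)
        viol = optimality_violation(x, g, lo, hi)
        if viol.max() <= tol:                  # approximate stationarity
            break
        work = np.argsort(viol)[::-1][:block_size]   # working set
        alpha, beta, sigma = 1.0, 0.5, 1e-4
        fx = f(x)
        while True:                            # Armijo backtracking
            x_trial = x.copy()
            x_trial[work] = np.clip(x[work] - alpha * g[work],
                                    lo[work], hi[work])
            d = x_trial - x
            if f(x_trial) <= fx + sigma * g.dot(d) or alpha < 1e-12:
                break
            alpha *= beta
        x = x_trial
    return x

if __name__ == "__main__":
    # Toy box-constrained least squares: min 0.5*||A x - b||^2, 0 <= x <= 1.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad_f = lambda x: A.T @ (A @ x - b)
    lo, hi = np.zeros(5), np.ones(5)
    x_star = gauss_southwell_decomposition(f, grad_f, np.full(5, 0.5), lo, hi)
    print("solution:", np.round(x_star, 4))
```

The mandatory inclusion of a most-violating variable is what drives the convergence guarantee; the remaining working-set slots are the "degree of freedom" the abstract mentions for accelerating convergence in practice.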
Pages: 397-409
Number of pages: 12