A Global Optimization Method, QBB, for Twice-Differentiable Nonconvex Optimization Problem

Authors
Yushan Zhu
Takahito Kuno
Affiliations
[1] University of Tsukuba,Institute of Information Sciences and Electronics
[2] Tsinghua University,Department of Chemical Engineering
Keywords
branch-and-bound algorithm; global optimization; interval Hessian matrix; QBB; simplicial division
DOI
Not available
Abstract
A global optimization method, QBB, for twice-differentiable NLPs (nonlinear programs) is developed. It operates within a branch-and-bound framework and requires the construction of a relaxed convex problem based on quadratic lower bounding functions for the generic nonconvex structures. Within an exhaustive simplicial division of the constrained region, a rigorous quadratic underestimating function is constructed for each generic nonconvex structure by means of a maximal eigenvalue analysis of the interval Hessian matrix. As the division progresses, each valid lower bound of the NLP is computed by convex programming on the relaxed problem, which is obtained by preserving convex or linear terms, replacing concave terms with their linear convex envelopes, and underestimating special terms and generic terms by customized tight convex lower bounding functions and the valid quadratic lower bounding functions, respectively. The standard convergence properties of the QBB algorithm for nonconvex global optimization problems are guaranteed. Preliminary computational studies are presented to evaluate the efficiency of the proposed QBB approach.
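The quadratic underestimation step described above can be illustrated with a minimal αBB-style sketch over a box domain. Note the assumptions: the paper's QBB method works over simplices and uses an exact maximal eigenvalue analysis of the interval Hessian, whereas this sketch uses a simpler Gershgorin-type bound on the minimal eigenvalue over a box; the function names and the example f(x, y) = x·y are illustrative, not taken from the paper.

```python
import numpy as np

def alpha_from_interval_hessian(HL, HU):
    """Gershgorin-style lower bound on the minimal eigenvalue of an
    interval Hessian [HL, HU]; returns alpha = max(0, -0.5 * lambda_lb),
    so that f(x) + alpha * sum((xL - x) * (xU - x)) is convex on the box."""
    n = HL.shape[0]
    lam_lb = np.inf
    for i in range(n):
        # Off-diagonal radius uses the largest magnitude in each interval entry.
        radius = sum(max(abs(HL[i, j]), abs(HU[i, j]))
                     for j in range(n) if j != i)
        lam_lb = min(lam_lb, HL[i, i] - radius)
    return max(0.0, -0.5 * lam_lb)

# Illustrative nonconvex term f(x, y) = x * y on the box [0, 1]^2.
# Its Hessian is constant, [[0, 1], [1, 0]], so HL = HU here.
HL = np.array([[0.0, 1.0], [1.0, 0.0]])
HU = HL.copy()
alpha = alpha_from_interval_hessian(HL, HU)  # 0.5 for this Hessian

xL, xU = np.zeros(2), np.ones(2)
f = lambda x: x[0] * x[1]
# Quadratic underestimator: the added term is nonpositive on the box
# and vanishes at its vertices, so the bound is tight there.
under = lambda x: f(x) + alpha * np.dot(xL - x, xU - x)

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(100, 2))
assert all(under(p) <= f(p) + 1e-12 for p in pts)
```

In a branch-and-bound loop, minimizing such convex underestimators over each subregion yields the valid lower bounds that drive the division and pruning; shrinking the region tightens the interval Hessian and hence the bound.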
Pages: 435-464
Page count: 29