A New Class of Improved Convex Underestimators for Twice Continuously Differentiable Constrained NLPs

Authors
Ioannis G. Akrotirianakis
Christodoulos A. Floudas
Affiliations
[1] Princeton University, Department of Chemical Engineering
Keywords
αBB; convex underestimators; global optimization;
Abstract
We present a new class of convex underestimators for arbitrarily nonconvex and twice continuously differentiable functions. The underestimators are derived by augmenting the original nonconvex function with a nonlinear relaxation function. The relaxation function is separable and convex, involving a sum of univariate parametric exponential functions. An efficient procedure that finds appropriate values for those parameters is developed. This procedure makes extensive use of interval arithmetic to verify whether the new underestimator is convex. For arbitrarily nonconvex functions, it is shown that these convex underestimators are tighter than those generated by the αBB method. Computational studies, complemented with geometrical interpretations, demonstrate the potential benefits of the proposed improved convex underestimators.
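To make the construction concrete, the sketch below contrasts the classical αBB underestimator, f(x) + α (xᴸ − x)(xᵁ − x), with an exponential-based relaxation of the general shape the abstract describes (a separable term built from univariate parametric exponentials that vanishes at the interval endpoints). The test function f, the interval [xᴸ, xᵁ], and the parameter values α and γ are illustrative assumptions, not values from the paper, and the exact parametric form used by the authors may differ.

```python
import math

def f(x):
    # An illustrative nonconvex test function on [0, 3] (assumption).
    return math.sin(3.0 * x) + 0.1 * x**2

xL, xU = 0.0, 3.0
alpha = 4.5   # for alpha-BB convexity, alpha must dominate -min f''/2 (here min f'' = -8.8)
gamma = 1.2   # parameter of the exponential relaxation (illustrative choice)

def alpha_bb(x):
    # Classical alpha-BB underestimator: the quadratic perturbation is
    # nonpositive on [xL, xU] and zero at both endpoints.
    return f(x) + alpha * (xL - x) * (xU - x)

def exp_underestimator(x):
    # Exponential-based relaxation: the subtracted product is nonnegative
    # on [xL, xU] (both factors are nonpositive there) and zero at the
    # endpoints, so the result underestimates f and matches it at xL, xU.
    return f(x) - (1.0 - math.exp(gamma * (x - xL))) * (1.0 - math.exp(gamma * (xU - x)))

# Numerically confirm the underestimation property on a grid.
grid = [xL + i * (xU - xL) / 200 for i in range(201)]
assert all(alpha_bb(x) <= f(x) + 1e-12 for x in grid)
assert all(exp_underestimator(x) <= f(x) + 1e-12 for x in grid)
```

Note that the underestimation property holds for any γ > 0; as in the αBB method, the parameter values must additionally be chosen large enough (the paper's interval-arithmetic procedure addresses this) to make the augmented function convex on the box.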
Pages: 367–390 (23 pages)