Probabilistic quality estimations for combinatorial optimization problems

Cited by: 1
Authors
Vakhania, Nodari [1 ,2 ]
Affiliations
[1] UAEMor, Ctr Invest Ciencias, Cuernavaca, Morelos, Mexico
[2] Georgian Tech Univ, N Muskhelishvili Inst Computat Math, Tbilisi, Georgia
Keywords
Computational algorithm; computational (time) complexity; worst-case estimation; average-case estimation;
DOI
10.1515/gmj-2017-0041
Chinese Library Classification
O1 [Mathematics]
Discipline Classification Code
0701; 070101
Abstract
The computational complexity of an algorithm is traditionally measured for the worst and the average case. The worst-case estimation guarantees a certain worst-case behavior of a given algorithm, although it might be rough, since on "most instances" the algorithm may perform significantly better. The probabilistic average-case analysis claims to derive the average performance of an algorithm, say, for an "average instance" of the problem in question. That instance may be far from the average of the problem instances arising in a given real-life application, and so the average-case analysis may likewise yield an unrealistic estimation. We suggest that, in general, a wider use of probabilistic models could give a more accurate estimation of algorithm efficiency. For instance, the quality of the solutions delivered by an approximation algorithm may also be estimated in the "average" probabilistic case. Such an approach would estimate the quality of the solutions delivered by the algorithm for the most common (for a given application) problem instances. As we illustrate, probabilistic modeling can also be used to derive an accurate time complexity measure, distinct from the traditional probabilistic average-case time complexity measure. Such an approach could, in particular, be useful when the traditional average-case estimation is still rough or is not possible at all.
Pages: 123-134
Number of pages: 12
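The abstract's idea of estimating an approximation algorithm's solution quality over the problem instances typical of a given application can be illustrated empirically. The following minimal sketch is not the paper's method: it samples 0/1 knapsack instances from a hypothetical application-specific distribution, runs a standard density-greedy heuristic, and reports the empirical distribution of the greedy-to-optimal value ratio. The instance generator `sample_instance`, its parameters, and the choice of knapsack as the example problem are illustrative assumptions.

```python
import random

def exact_knapsack(values, weights, capacity):
    """Exact 0/1 knapsack optimum via dynamic programming over capacities."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

def greedy_knapsack(values, weights, capacity):
    """Greedy packing by value density, combined with the single most valuable
    item that fits; a standard heuristic (a 1/2-approximation when every item
    fits on its own)."""
    order = sorted(range(len(values)), key=lambda i: values[i] / weights[i], reverse=True)
    total, room = 0, capacity
    for i in order:
        if weights[i] <= room:
            total += values[i]
            room -= weights[i]
    best_single = max((v for v, w in zip(values, weights) if w <= capacity), default=0)
    return max(total, best_single)

def sample_instance(n=15, max_w=20, max_v=50):
    """Hypothetical application-specific instance distribution (an assumption,
    standing in for the instances 'most common' in a concrete application)."""
    weights = [random.randint(1, max_w) for _ in range(n)]
    values = [random.randint(1, max_v) for _ in range(n)]
    return values, weights, sum(weights) // 2

def estimate_quality(trials=1000, seed=0):
    """Empirical summary of the greedy/optimal value ratio over sampled instances."""
    random.seed(seed)
    ratios = []
    for _ in range(trials):
        values, weights, capacity = sample_instance()
        opt = exact_knapsack(values, weights, capacity)
        if opt > 0:
            ratios.append(greedy_knapsack(values, weights, capacity) / opt)
    ratios.sort()
    return {
        "mean": sum(ratios) / len(ratios),
        "5th percentile": ratios[int(0.05 * len(ratios))],
        "worst observed": ratios[0],
    }

if __name__ == "__main__":
    print(estimate_quality())
```

The reported mean and low percentiles play the role of a distribution-specific quality estimate: under the assumed instance distribution they are typically far better than the worst-case 1/2 guarantee, which is the kind of gap between worst-case and "most common instance" behavior the abstract points to.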