ON THE PERFORMANCE GUARANTEE OF NEURAL NETWORKS FOR NP-HARD OPTIMIZATION PROBLEMS

Cited: 0
Authors
ZISSIMOPOULOS, V [1 ]
Affiliation
[1] UNIV PARIS 11,LRI,CNRS,URA 410,BAT 490,F-91405 ORSAY,FRANCE
Keywords
COMBINATORIAL PROBLEMS; ANALYSIS OF ALGORITHMS; COMBINATORIAL OPTIMIZATION; NEURAL NETWORKS; MAXIMUM INDEPENDENT SET; HEURISTICS; APPROXIMATION ALGORITHMS; WORST-CASE ANALYSIS;
DOI
10.1016/0020-0190(95)00051-D
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We give polynomial-size threshold neural networks and encoding formalisms that guarantee worst-case performance for two hard optimization problems. We show that a massively parallel algorithm based on such neural network models guarantees an approximation ratio asymptotically equal to Δ/2 for the maximum independent set problem, where Δ is the maximum degree of the graph, and equal to 2 for the vertex cover problem. These results on the power of polynomial-size threshold neural networks within a polynomial number of neural updates provide the first approximation results for neural network models.
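The encoding idea behind such models can be illustrated with a minimal sketch: a discrete threshold (Hopfield-style) network with one 0/1 neuron per vertex, where a neuron fires iff none of its neighbours is active. Its stable states are maximal independent sets, whose complements are vertex covers. This sketch is an assumption for illustration only, not the paper's exact construction (which achieves the stronger Δ/2 bound); the function name `mis_threshold_network` is hypothetical.

```python
# Illustrative sketch (assumed encoding, not the paper's construction):
# a discrete threshold network whose stable states are maximal
# independent sets of the input graph.

def mis_threshold_network(adj):
    """adj: dict mapping each vertex to the set of its neighbours."""
    state = {v: 0 for v in adj}
    changed = True
    while changed:  # asynchronous updates until a stable state is reached
        changed = False
        for v in adj:
            # threshold rule: neuron v fires iff no neighbour is active
            new = 1 if all(state[u] == 0 for u in adj[v]) else 0
            if new != state[v]:
                state[v] = new
                changed = True
    return {v for v, s in state.items() if s == 1}
```

With this rule a neuron that turns on can never be forced off (none of its neighbours can subsequently activate), so each neuron flips at most once and the network stabilizes in at most n update sweeps; the complement of the returned set is a vertex cover.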
Pages: 317-322
Page count: 6