Binary Higher Order Neural Networks for Realizing Boolean Functions

Times Cited: 14
Authors
Zhang, Chao [1 ]
Yang, Jie [1 ]
Wu, Wei [1 ]
Affiliations
[1] Dalian Univ Technol, Sch Math Sci, Dalian 116023, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2011, Vol. 22, No. 5
Funding
National Natural Science Foundation of China;
Keywords
Binary pi-sigma neural network; binary product-unit neural network; Boolean function; principal conjunctive normal form; principal disjunctive normal form; GRADIENT ALGORITHM; CONVERGENCE;
DOI
10.1109/TNN.2011.2114367
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In order to more efficiently realize Boolean functions by using neural networks, we propose a binary product-unit neural network (BPUNN) and a binary pi-sigma neural network (BPSNN). The network weights can be determined by one-step training. It is shown that the addition "sigma," the multiplication "pi," and two kinds of special weighting operations in BPUNN and BPSNN can implement the logical operators "∨," "∧," and "¬" on the Boolean algebra ⟨Z_2, ∨, ∧, ¬, 0, 1⟩ (Z_2 = {0, 1}), respectively. The proposed two neural networks enjoy the following advantages over the existing networks: 1) for a complete truth table of N variables with both true and false assignments, the corresponding Boolean function can be realized by choosing a BPUNN or a BPSNN accordingly, such that at most 2^(N-1) hidden nodes are needed, whereas the existing networks need O(2^N) hidden nodes, precisely 2^N or at most 2^N; 2) a new network, BPUPS, based on a collaboration of BPUNN and BPSNN, can be defined to deal with incomplete truth tables, while the existing networks can only deal with complete truth tables; and 3) the values of the weights are all simply -1 or 1, while the weights of all the existing networks are real numbers. Supporting numerical experiments are provided as well. Finally, we present the risk bounds of BPUNN, BPSNN, and BPUPS, and then analyze their probably approximately correct (PAC) learnability.
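A minimal Python sketch of the idea the abstract describes: any Boolean function given by a truth table can be realized as an OR of AND terms (one per true row, principal-DNF style) or as an AND of OR clauses (one per false row, principal-CNF style), with every "weight" simply +1 or -1 and the weights read off the truth table in one step; picking whichever form has fewer rows needs at most 2^(N-1) hidden nodes. This is the standard construction the paper builds on, not its exact BPUNN/BPSNN weighting operations; the min(1, ·) clamp and all function names here are illustrative assumptions.

```python
# Sketch (assumption, not the paper's construction): one-step realization of a
# Boolean function from its truth table with weights in {+1, -1}.
from itertools import product

def literal(x, w):
    """Map input bit x in {0,1} through weight w in {+1,-1}:
    w=+1 keeps x, w=-1 negates it (returns 1 - x)."""
    return x if w == 1 else 1 - x

def dnf_net(weights_per_term):
    """OR of ANDs: one product term per TRUE row of the truth table."""
    def f(xs):
        terms = [all(literal(x, w) for x, w in zip(xs, ws))
                 for ws in weights_per_term]
        return int(any(terms))
    return f

def cnf_net(weights_per_clause):
    """AND (product) of ORs: one clamped-sum clause per FALSE row.
    The min(1, sum) clamp is an illustrative choice for OR."""
    def f(xs):
        out = 1
        for ws in weights_per_clause:
            out *= min(1, sum(literal(x, w) for x, w in zip(xs, ws)))
        return out
    return f

def train(truth_table, n):
    """'One-step training': read the +/-1 weights straight off the table.
    TRUE row (1,0,1) -> DNF term weights (+1,-1,+1); a FALSE row gets the
    negated pattern as its CNF clause weights. Keep whichever form has
    fewer hidden nodes, hence at most 2^(N-1) of them."""
    true_rows  = [r for r in product((0, 1), repeat=n) if truth_table[r] == 1]
    false_rows = [r for r in product((0, 1), repeat=n) if truth_table[r] == 0]
    dnf_w = [tuple(1 if b else -1 for b in r) for r in true_rows]
    cnf_w = [tuple(-1 if b else 1 for b in r) for r in false_rows]
    return dnf_net(dnf_w) if len(dnf_w) <= len(cnf_w) else cnf_net(cnf_w)

# Example: 3-input parity (XOR), realized exactly with 4 = 2^(3-1) hidden nodes.
xor3 = {r: r[0] ^ r[1] ^ r[2] for r in product((0, 1), repeat=3)}
f = train(xor3, 3)
assert all(f(r) == xor3[r] for r in xor3)
```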
Pages: 701-713
Number of Pages: 13