Binary Higher Order Neural Networks for Realizing Boolean Functions

Cited by: 14
Authors
Zhang, Chao [1 ]
Yang, Jie [1 ]
Wu, Wei [1 ]
Affiliations
[1] Dalian Univ Technol, Sch Math Sci, Dalian 116023, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2011, Vol. 22, No. 5
Funding
National Natural Science Foundation of China;
Keywords
Binary pi-sigma neural network; binary product-unit neural network; Boolean function; principal conjunctive normal form; principal disjunctive normal form; GRADIENT ALGORITHM; CONVERGENCE;
DOI
10.1109/TNN.2011.2114367
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In order to realize Boolean functions more efficiently by neural networks, we propose a binary product-unit neural network (BPUNN) and a binary pi-sigma neural network (BPSNN). The network weights can be determined by one-step training. It is shown that the addition "σ," the multiplication "π," and two kinds of special weighting operations in BPUNN and BPSNN can implement the logical operators "∨," "∧," and "¬" on the Boolean algebra ⟨Z₂, ∨, ∧, ¬, 0, 1⟩ (Z₂ = {0, 1}), respectively. The two proposed neural networks enjoy the following advantages over existing networks: 1) for a complete truth table of N variables with both true and false assignments, the corresponding Boolean function can be realized by choosing a BPUNN or a BPSNN accordingly, such that at most 2^(N-1) hidden nodes are needed, whereas existing networks need O(2^N), precisely 2^N or at most 2^N, hidden nodes; 2) a new network, BPUPS, based on a collaboration of BPUNN and BPSNN, can be defined to deal with incomplete truth tables, whereas the existing networks can only deal with complete truth tables; and 3) the weight values are all simply -1 or 1, whereas the weights of all existing networks are real numbers. Supporting numerical experiments are provided as well. Finally, we present the risk bounds of BPUNN, BPSNN, and BPUPS, and then analyze their probably approximately correct (PAC) learnability.
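The abstract's construction can be illustrated with a small sketch: hidden product units ("π") compute the minterms of a principal disjunctive normal form, and because at most one minterm of a PDNF fires for any input, plain addition ("σ") at the output coincides with logical OR on Z₂. This is a minimal Python illustration, not the paper's exact architecture; in particular, the assumption that the special ±1 weighting operation passes an input through for weight +1 and complements it (x ↦ 1-x) for weight -1 is inferred from the claim that weighting implements "¬", since the abstract does not spell it out.

```python
import math

def weight_op(x, w):
    # Assumed weighting operation: w = +1 passes x through,
    # w = -1 complements it, realizing negation on Z2 = {0, 1}.
    return x if w == 1 else 1 - x

def realize_pdnf(truth_table):
    """Build a pi-sigma-style network from a complete truth table.

    truth_table: dict mapping input tuples over {0, 1} to 0/1.
    One hidden product unit per true row; all weights are +1 or -1.
    """
    minterms = [tuple(1 if b else -1 for b in row)
                for row, out in truth_table.items() if out == 1]

    def net(x):
        # sigma over pi: the minterms of a PDNF are mutually
        # exclusive, so ordinary addition equals logical OR here.
        return sum(math.prod(weight_op(xi, wi)
                             for xi, wi in zip(x, w))
                   for w in minterms)

    return net

# Example: XOR, whose PDNF is (~x1 & x2) | (x1 & ~x2).
xor_net = realize_pdnf({(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0})
```

Note that XOR needs only 2 = 2^(N-1) hidden product units for N = 2 inputs, consistent with the hidden-node bound claimed in the abstract.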
Pages: 701-713
Number of pages: 13
Related Papers
50 records (entries 41-50 shown)
  • [41] A Novel Higher Order Artificial Neural Networks
    Xu, Shuxiang
    ISCM II AND EPMESC XII, PTS 1 AND 2, 2010, 1233 : 1507 - 1511
  • [42] Systolic architecture for higher order neural networks
    Bilski, J
    Smolag, J
    NEURAL NETWORKS AND SOFT COMPUTING, 2003, : 796 - 801
  • [43] Lower bounds on the higher order nonlinearities of Boolean functions and their applications to the inverse function
    Carlet, Claude
    2008 IEEE INFORMATION THEORY WORKSHOP, 2008, : 333 - 337
  • [44] On the global avalanche characteristics between two Boolean functions and the higher order nonlinearity
    Zhou, Yu
    Xie, Min
    Xiao, Guozhen
    INFORMATION SCIENCES, 2010, 180 (02) : 256 - 265
  • [45] Submodularity, supermodularity, and higher-order monotonicities of pseudo-boolean functions
    Foldes, S
    Hammer, PL
    MATHEMATICS OF OPERATIONS RESEARCH, 2005, 30 (02) : 453 - 461
  • [46] Neural networks for determining affinity functions of binary objects
    Dmitrievich, Dmitrienko Valery
    Yurievich, Leonov Sergey
    Yurievich, Zakovorotniy Alexander
    Viktorovich, Mezentsev Nikolay
    2020 IEEE KHPI WEEK ON ADVANCED TECHNOLOGY (KHPI WEEK), 2020, : 478 - 481
  • [47] Realizing Boolean functions using Probabilistic Spin Logic (PSL)
    Agarwal, Vaibhav
    Saurabh, Sneh
    2019 32ND INTERNATIONAL CONFERENCE ON VLSI DESIGN AND 2019 18TH INTERNATIONAL CONFERENCE ON EMBEDDED SYSTEMS (VLSID), 2019, : 508 - 509
  • [48] Boolean networks with veto functions
    Ebadi, Haleh
    Klemm, Konstantin
    PHYSICAL REVIEW E, 2014, 90 (02)
  • [49] BOOLEAN ALGEBRAS OF LOGICS OF HIGHER ORDER
    AMER, MA
    HANF, WP
    NOTICES OF THE AMERICAN MATHEMATICAL SOCIETY, 1969, 16 (07): : 1059 - &
  • [50] Scale equalization higher-order neural networks
    Wang, JH
    Wu, KH
    Chang, FC
    PROCEEDINGS OF THE 2004 IEEE INTERNATIONAL CONFERENCE ON INFORMATION REUSE AND INTEGRATION (IRI-2004), 2004, : 612 - 617