Mapping Boolean functions with neural networks having binary weights and zero thresholds

Cited by: 6
Authors
Deolalikar, V [1]
Affiliation
[1] Hewlett Packard Labs, Palo Alto, CA 94304 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2001, Vol. 12, No. 3
Keywords
binary neural networks; Boolean function mapping; one-layer networks; two-layer networks;
DOI
10.1109/72.925568
CLC classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
In this paper, the ability of a binary neural network comprising only neurons with zero thresholds and binary weights to map given samples of a Boolean function is studied. A mathematical model describing a network with such restrictions is developed. It is shown that this model is quite amenable to algebraic manipulation. A key feature of the model is that it replaces the two input and output variables with a single "normalized" variable. The model is then used to provide a priori criteria, stated in terms of the new variable, that a given Boolean function must satisfy in order to be mapped by a network having one or two layers. These criteria provide necessary, and in the case of a one-layer network, sufficient conditions for samples of a Boolean function to be mapped by a binary neural network with zero thresholds. It is shown that the necessary conditions imposed by the two-layer network are, in some sense, minimal.
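The one-layer case in the abstract admits a compact computational illustration. The sketch below is not the paper's algorithm, only a brute-force check under assumed conventions: Boolean values are encoded as ±1, a zero-threshold neuron computes sign(w·x) with w ∈ {−1,+1}^n, and each sample (x, y) is folded into the single "normalized" variable z = y·x, so that mapping all samples is equivalent to w·z > 0 for every z. The function names are hypothetical.

```python
from itertools import product

def mappable_one_layer(samples):
    """Return a binary weight vector w in {-1,+1}^n such that a single
    zero-threshold neuron sign(w . x) reproduces every sample (x, y),
    or None if no such w exists.

    Each sample is normalized to z = y * x, so the condition
    sign(w . x) == y becomes w . z > 0 (one variable instead of two).
    """
    n = len(samples[0][0])
    normalized = [tuple(y * xi for xi in x) for x, y in samples]
    for w in product((-1, 1), repeat=n):  # all 2^n binary weight vectors
        if all(sum(wi * zi for wi, zi in zip(w, z)) > 0 for z in normalized):
            return w
    return None

# Majority vs. 3-bit parity on the full truth table (±1 encoding):
# majority is mappable by one zero-threshold binary neuron, parity is not.
maj = [(x, 1 if sum(x) > 0 else -1) for x in product((-1, 1), repeat=3)]
par = [(x, x[0] * x[1] * x[2]) for x in product((-1, 1), repeat=3)]
print(mappable_one_layer(maj))  # (1, 1, 1)
print(mappable_one_layer(par))  # None: parity needs more than one layer
```

The normalization step is the point of the example: once z = y·x is formed, the mapping question no longer mentions the output y at all, which is what makes the paper's one-variable criteria possible.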
Pages: 639-642 (4 pages)
Related papers
50 in total
  • [21] Approximation to Boolean functions by neural networks with applications to thinning algorithms
    Xiong, SS
    Zhou, ZY
    Zhong, LM
    Zhang, WD
    IMTC/2000: PROCEEDINGS OF THE 17TH IEEE INSTRUMENTATION AND MEASUREMENT TECHNOLOGY CONFERENCE: SMART CONNECTIVITY: INTEGRATING MEASUREMENT AND CONTROL, 2000, : 1004 - 1008
  • [22] Generalization ability of Boolean functions implemented in feedforward neural networks
    Franco, Leonardo
    NEUROCOMPUTING, 2006, 70 (1-3) : 351 - 361
  • [23] Classifying Three-input Boolean Functions by Neural Networks
    Sakamoto, Naoshi
    2019 20TH IEEE/ACIS INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING, ARTIFICIAL INTELLIGENCE, NETWORKING AND PARALLEL/DISTRIBUTED COMPUTING (SNPD), 2019, : 297 - 301
  • [24] On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights
    Yu, Dansheng
    Qian, Yunyou
    Li, Fengjun
    ANALYSIS IN THEORY AND APPLICATIONS, 2023, 39 (01): : 93 - 104
  • [25] Capacity of two-layer feedforward neural networks with binary weights
    Ji, CY
    Psaltis, D
    IEEE TRANSACTIONS ON INFORMATION THEORY, 1998, 44 (01) : 256 - 268
  • [26] Efficient Quantization for Neural Networks with Binary Weights and Low Bitwidth Activations
    Huang, Kun
    Ni, Bingbing
    Yang, Xiaokang
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 3854 - 3861
  • [27] Storage capacity of multi-layered neural networks with binary weights
    Tarkowski, W
    van Hemmen, JL
    ACTA PHYSICA POLONICA B, 1997, 28 (07): : 1707 - 1728
  • [28] Increasing Information Entropy of Both Weights and Activations for the Binary Neural Networks
    Zou, Wanbing
    Cheng, Song
    Wang, Luyuan
    Fu, Guanyu
    Shang, Delong
    Zhou, Yumei
    Zhan, Yi
    ELECTRONICS, 2021, 10 (16)
  • [29] BinaryConnect: Training Deep Neural Networks with binary weights during propagations
    Courbariaux, Matthieu
    Bengio, Yoshua
    David, Jean-Pierre
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [30] Information theoretical approach to the storage capacity of neural networks with binary weights
    Suyari, H
    Matsuba, I
    PHYSICAL REVIEW E, 1999, 60 (04): : 4576 - 4579