PARTITIONING CAPABILITIES OF 2-LAYER NEURAL NETWORKS

Cited by: 10
Authors
MAKHOUL, J [1 ]
ELJAROUDI, A [1 ]
SCHWARTZ, R [1 ]
Affiliation
[1] UNIV PITTSBURGH,DEPT ELECT ENGN,PITTSBURGH,PA 15261
DOI: 10.1109/78.136554
CLC classification: TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes: 0808; 0809
Abstract
It has been observed that feedforward neural nets with a single hidden layer are capable of forming either convex decision regions or nonconvex but connected decision regions in the input space. In this correspondence, we show that two-layer nets with a single hidden layer are capable of forming disconnected decision regions as well. In addition to giving examples of the phenomenon, we explain why and how disconnected decision regions are formed. Assuming neural nodes with threshold elements, we first derive an expression for the number of cells that are formed in the input space by the hyperplanes associated with the first (hidden) layer. This expression can be useful in deciding how many nodes to have in the first layer. Each hyperplane in the second layer then determines a decision region in the input space which consists of a number of cells that are typically connected to each other. However, by hypothesizing the existence of additional virtual cells formed by the first layer, we show how the decision regions formed by the second layer can indeed be disconnected. Far from being isolated examples, we show that the number of such disconnected regions can be very large. Using a recent theoretical result about the sufficiency of two layers to approximate arbitrary decision regions in a finite portion of the space, we give an example of how that is possible with the use of virtual cells.
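The cell count alluded to in the abstract is presumably the classical bound for an arrangement of N hyperplanes in general position in d-dimensional space, C(N, d) = Σ_{i=0}^{d} binom(N, i); a minimal sketch under that assumption (the function name is illustrative):

```python
from math import comb

def max_cells(num_hyperplanes: int, dim: int) -> int:
    """Maximum number of cells into which N hyperplanes in general
    position partition d-dimensional space:
    C(N, d) = sum_{i=0}^{d} binom(N, i)."""
    return sum(comb(num_hyperplanes, i) for i in range(dim + 1))

# A hidden layer of 3 threshold nodes on a 2-D input corresponds to
# 3 lines in the plane, which carve the input space into at most 7 cells.
print(max_cells(3, 2))  # 7
print(max_cells(4, 2))  # 11
```

Inverting this bound — picking the smallest N with C(N, d) at least the number of cells required — is one way to use such an expression when choosing the hidden-layer size.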
Pages: 1435-1440 (6 pages)