On flexible neural networks: Some system-theoretic properties and a new class

Cited by: 0
Authors: Bavafa-Toosi, Yazdan [1]; Ohmori, Hiromitsu [1]
Affiliation: [1] Keio Univ, Sch Integrated Design Engn, Yokohama, Kanagawa 2238522, Japan
Keywords: (none listed)
DOI: not available
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Discipline classification code: 0812
Abstract
Although flexible neural networks (FNNs) have been used more successfully than classical neural networks (CNNs), nothing is rigorously known about their properties. In fact, they are not even well known to the systems and control community. In this paper, theoretical evidence is given for their superiority over CNNs. Following an overview of flexible bipolar sigmoid functions (FBSFs), several fundamental properties of feedforward and recurrent FNNs are established. For the feedforward case, it is proven that, similar to CNNs, FNNs with as few as a single hidden layer (SHL) are universal approximators. It is also proven that unlike irreducible SHL CBSNNs, irreducible SHL FBSNNs are nonuniquely determined by their input-output (I-O) maps, up to a finite group of symmetries. Then, recurrent FNNs are introduced. It is observed that they can be interpreted as a generalization of the conventional state-space framework. For the recurrent case, it is substantiated that, similar to CBSNNs, FBSNNs are universal approximators. Necessary and sufficient conditions for the controllability and observability of a generic class of them are established. For a subclass of this class, it is proven that unlike CBSNNs, FBSNNs are nonuniquely determined by their I-O maps, up to a finite group of symmetries, and that every system inside this subclass is minimal. Finally, a new class of FNNs, namely, flexible bipolar radial basis neural networks (FBRBNNs), is introduced. It is proven that as in the case of classical radial basis neural networks (CRBNNs), feedforward SHL FBRBNNs are universal approximators.
Pages: 2554-2561
Number of pages: 8
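For readers unfamiliar with the flexible-sigmoid idea mentioned in the abstract, the minimal sketch below illustrates a flexible bipolar sigmoid unit and the forward pass of a single-hidden-layer (SHL) feedforward FNN. It assumes the parameterization commonly used in the flexible neural network literature, f(x, a) = (1 - e^(-2ax)) / (a(1 + e^(-2ax))) = tanh(ax)/a, where a is a trainable flexibility parameter learned along with the weights; the function names, shapes, and parameter choices are illustrative and are not taken from the paper itself.

```python
# Illustrative sketch (not from the paper): a flexible bipolar sigmoid and a
# single-hidden-layer feedforward flexible network forward pass.
# Assumed activation: f(x, a) = tanh(a*x) / a, with trainable flexibility a != 0.
# For a = 1 it reduces to the ordinary bipolar sigmoid tanh(x); as a -> 0 the
# unit becomes nearly linear.

import numpy as np


def flexible_bipolar_sigmoid(x, a):
    """Flexible bipolar sigmoid tanh(a*x)/a, applied elementwise."""
    a = np.asarray(a, dtype=float)
    return np.tanh(a * x) / a


def shl_fnn_forward(x, W1, b1, a, W2, b2):
    """Forward pass of a single-hidden-layer flexible feedforward network.

    x  : (n_in,)           input vector
    W1 : (n_hidden, n_in)  input-to-hidden weights
    b1 : (n_hidden,)       hidden biases
    a  : (n_hidden,)       flexibility parameters, one per hidden unit
    W2 : (n_out, n_hidden) hidden-to-output weights
    b2 : (n_out,)          output biases
    """
    h = flexible_bipolar_sigmoid(W1 @ x + b1, a)  # flexible hidden activations
    return W2 @ h + b2                            # linear output layer


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 3, 5, 1
    W1 = rng.standard_normal((n_hidden, n_in))
    b1 = rng.standard_normal(n_hidden)
    a = rng.uniform(0.5, 2.0, n_hidden)  # flexibility parameters (trained jointly with the weights)
    W2 = rng.standard_normal((n_out, n_hidden))
    b2 = rng.standard_normal(n_out)
    x = rng.standard_normal(n_in)
    print(shl_fnn_forward(x, W1, b1, a, W2, b2))
```

Because the unit reduces to the classical bipolar sigmoid when every flexibility parameter is fixed at a = 1, classical sigmoid networks appear as a special case of this parameterization, which is the sense in which FNNs can be read as a generalization of CNNs.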