On the computational power of recurrent neural networks for structures

Cited by: 25
Author
Sperduti, A
Affiliation
[1] Computer Science Department, University of Pisa, 56125 Pisa
Keywords
neural networks for structures; recurrent networks; labelled trees; tree automata; computational theory;
DOI
10.1016/S0893-6080(96)00105-0
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recurrent neural networks can simulate any finite state automaton, as well as any multi-stack Turing machine. When the network architecture is constrained, however, this computational power may no longer hold. For example, recurrent cascade-correlation cannot simulate all finite state automata. It is therefore important to assess the computational power of a given network architecture, since this characterizes the class of functions which, in principle, it can compute. We discuss the computational power of neural networks for structures. Elman-style networks, cascade-correlation networks, and neural trees for structures are introduced. We show that Elman-style networks can simulate any frontier-to-root tree automaton, while neither cascade-correlation networks nor neural trees can. As a special case of the latter result, we obtain that neural trees for sequences cannot simulate all finite state machines. (C) 1997 Elsevier Science Ltd.
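The frontier-to-root (bottom-up) tree automata referred to in the abstract read a labelled tree from its leaves toward the root, assigning a state to each node as a function of the node's label and the states already assigned to its children; the tree is accepted if and only if the root's state is final. The following Python sketch illustrates this formalism with a hypothetical Boolean-evaluation automaton; it is an illustration of the general definition, not code from the paper, and all names (`Tree`, `run`, `delta`) are our own.

```python
from dataclasses import dataclass, field

@dataclass
class Tree:
    """A labelled ordered tree; leaves are the frontier."""
    label: str
    children: list = field(default_factory=list)

def run(tree, delta, final_states):
    """Return True iff the frontier-to-root automaton accepts `tree`.

    `delta` maps (label, tuple_of_child_states) -> state, i.e. the
    transition function applied bottom-up from the leaves.
    """
    def state(node):
        child_states = tuple(state(c) for c in node.children)
        return delta[(node.label, child_states)]
    return state(tree) in final_states

# Hypothetical example automaton: over labels {'and', 'or', '0', '1'},
# accept exactly the trees that evaluate to true as Boolean expressions.
delta = {('0', ()): 'F', ('1', ()): 'T'}
for a in 'TF':
    for b in 'TF':
        delta[('and', (a, b))] = 'T' if a == b == 'T' else 'F'
        delta[('or', (a, b))] = 'T' if 'T' in (a, b) else 'F'

t = Tree('or', [Tree('0'), Tree('and', [Tree('1'), Tree('1')])])
run(t, delta, {'T'})  # accepted: the tree evaluates to true
```

A sequence is the special case of a tree in which every node has at most one child, which is why results about tree automata specialize to finite state machines, as in the paper's final claim.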
Pages: 395-400
Page count: 6
Related Papers
50 records in total
  • [21] On the Computational Power of Max-Min Propagation Neural Networks
    Estévez, Pablo A.
    Okabe, Yoichi
    NEURAL PROCESSING LETTERS, 2004, 19: 11-23
  • [22] Computational power of neural networks: A characterization in terms of Kolmogorov complexity
    Balcazar, JL
    Gavalda, R
    Siegelmann, HT
    IEEE TRANSACTIONS ON INFORMATION THEORY, 1997, 43 (04): 1175-1183
  • [23] Conversion of Artificial Recurrent Neural Networks to Spiking Neural Networks for Low-power Neuromorphic Hardware
    Diehl, Peter U.
    Zarrella, Guido
    Cassidy, Andrew
    Pedroni, Bruno U.
    Neftci, Emre
    2016 IEEE INTERNATIONAL CONFERENCE ON REBOOTING COMPUTING (ICRC), 2016
  • [24] Effect of Alzheimer's disease on the dynamical and computational characteristics of recurrent neural networks
    Bachmann, Claudia
    Tetzlaff, Tom
    Kunkel, Susanne
    Bamberger, Philipp
    Morrison, Abigail
    BMC NEUROSCIENCE, 14 (Suppl 1)
  • [25] Computational identification of human ubiquitination sites using convolutional and recurrent neural networks
    Wang, Xiaofeng
    Yan, Renxiang
    Wang, Yongji
    MOLECULAR OMICS, 2021, 17 (06): 948-955
  • [26] Investigating Recurrent Neural Networks for Feature-Less Computational Drug Design
    Doerr, Alexander
    Otte, Sebastian
    Zell, Andreas
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2016, PT I, 2016, 9886: 140-148
  • [27] Recurrent neural networks
    Siegelmann, HT
    COMPUTER SCIENCE TODAY, 1995, 1000: 29-45
  • [28] Space-Time Structures of Recurrent Neural Networks with Controlled Synapses
    Osipov, Vasiliy
    ADVANCES IN NEURAL NETWORKS - ISNN 2016, 2016, 9719: 177-184
  • [29] Recurrent Neural Networks with Mixed Hierarchical Structures for Natural Language Processing
    Luo, Zhaoxin
    Zhu, Michael
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [30] Training recurrent neural networks as generative neural networks for molecular structures: how does it impact drug discovery?
    D'Souza, Sofia
    Kv, Prema
    Balaji, Seetharaman
    EXPERT OPINION ON DRUG DISCOVERY, 2022, 17 (10): 1071-1079