On the computational power of recurrent neural networks for structures

Cited by: 25
Authors
Sperduti, A
Affiliation
[1] Computer Science Department, University of Pisa, 56125 Pisa
Keywords
neural networks for structures; recurrent networks; labelled trees; tree automata; computational theory;
DOI
10.1016/S0893-6080(96)00105-0
CLC classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recurrent neural networks can simulate any finite state automaton as well as any multi-stack Turing machine. When the network architecture is constrained, however, this computational power may no longer hold. For example, recurrent cascade-correlation cannot simulate every finite state automaton. It is therefore important to assess the computational power of a given network architecture, since this characterizes the class of functions which, in principle, can be computed by it. We discuss the computational power of neural networks for structures. Elman-style networks, cascade-correlation networks and neural trees for structures are introduced. We show that Elman-style networks can simulate any frontier-to-root tree automaton, while neither cascade-correlation networks nor neural trees can. As a special case of the latter result, we obtain that neural trees for sequences cannot simulate every finite state machine. (C) 1997 Elsevier Science Ltd.
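The abstract's central formal object, a frontier-to-root (bottom-up) tree automaton, processes a labelled tree from its leaves toward its root, assigning each node a state from the states of its children. The sketch below is illustrative only: the labels, states, and transition rules are hypothetical examples (a Boolean-expression evaluator), not a construction from the paper.

```python
# Illustrative frontier-to-root tree automaton (assumed example, not
# from the paper): trees are labelled over {'AND', 'OR', '0', '1'},
# states are booleans, and transitions run from leaves to the root.

def run_tree_automaton(tree):
    """Compute the root state of a labelled binary tree bottom-up.

    A tree is either a leaf label ('0' or '1') or a tuple
    (label, left_subtree, right_subtree) with label in {'AND', 'OR'}.
    The automaton 'accepts' a tree when the root state is True.
    """
    # Leaf transitions: map leaf labels directly to states.
    if tree == '0':
        return False
    if tree == '1':
        return True
    # Internal transitions: the node's state is a function of the
    # node label and the states already computed for its children.
    label, left, right = tree
    l, r = run_tree_automaton(left), run_tree_automaton(right)
    if label == 'AND':
        return l and r
    if label == 'OR':
        return l or r
    raise ValueError(f"unknown label: {label}")

# Example: OR(AND(1, 0), 1) evaluates bottom-up to True.
tree = ('OR', ('AND', '1', '0'), '1')
print(run_tree_automaton(tree))  # True
```

A recurrent network for structures simulates such an automaton when its state-transition function can realize every entry of a transition table like the one above; the paper's results concern which architectures can and cannot do this in general.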
Pages: 395 - 400
Page count: 6
Related papers
50 records total
  • [1] The Computational Power of Interactive Recurrent Neural Networks
    Cabessa, Jeremie
    Siegelmann, Hava T.
    [J]. NEURAL COMPUTATION, 2012, 24 (04) : 996 - 1019
  • [2] THE SUPER-TURING COMPUTATIONAL POWER OF PLASTIC RECURRENT NEURAL NETWORKS
    Cabessa, Jeremie
    Siegelmann, Hava T.
    [J]. INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2014, 24 (08)
  • [3] The Super-Turing Computational Power of Interactive Evolving Recurrent Neural Networks
    Cabessa, Jeremie
    Villa, Alessandro E. P.
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2013, 2013, 8131 : 58 - 65
  • [4] Computational capabilities of recurrent NARX neural networks
    Siegelmann, HT
    Horne, BG
    Giles, CL
    [J]. IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 1997, 27 (02) : 208 - 215
  • [5] ON THE COMPUTATIONAL POWER OF ELMAN-STYLE RECURRENT NETWORKS
    KREMER, SC
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 1995, 6 (04) : 1000 - 1004
  • [6] Delay and Recurrent Neural Networks: Computational Cybernetics of Systems Biology?
    Dimirovski, Georgi M.
    Wang, Rui
    Yang, Bin
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2017, : 906 - 911
  • [7] Reconstructing computational system dynamics from neural data with recurrent neural networks
    Durstewitz, Daniel
    Koppe, Georgia
    Thurm, Max Ingo
    [J]. NATURE REVIEWS NEUROSCIENCE, 2023, 24 (11) : 693 - 710
  • [8] Computational Capabilities of Recurrent Neural Networks Based on their Attractor Dynamics
    Cabessa, Jeremie
    Villa, Alessandro E. P.
    [J]. 2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [9] Using Stigmergy as a Computational Memory in the Design of Recurrent Neural Networks
    Galatolo, Federico A.
    Cimino, Mario G. C. A.
    Vaglini, Gigliola
    [J]. ICPRAM: PROCEEDINGS OF THE 8TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS, 2019, : 830 - 836