A modular architecture for transparent computation in recurrent neural networks

Times cited: 11
Authors
Carmantini, Giovanni S. [1 ]
Graben, Peter Beim [2 ]
Desroches, Mathieu [3 ]
Rodrigues, Serafim [1 ]
Affiliations
[1] Univ Plymouth, Sch Comp & Math, Plymouth, Devon, England
[2] Humboldt Univ, Bernstein Ctr Computat Neurosci Berlin, Berlin, Germany
[3] Inria Sophia Antipolis Mediterranee, Valbonne, France
Keywords
Automata Theory; Recurrent artificial neural networks; Representation theory; Nonlinear dynamical automata; Neural symbolic computation; Versatile shift; CENTRAL PATTERN GENERATORS; DYNAMICAL-SYSTEM; MODEL; POWER; REPRESENTATION; UNPREDICTABILITY; BACKPROPAGATION; UNDECIDABILITY; INFORMATION; LOCOMOTION
DOI
10.1016/j.neunet.2016.09.001
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Computation is classically studied in terms of automata, formal languages and algorithms; yet the relation between neural dynamics and symbolic representations and operations remains unclear in traditional eliminative connectionism. We therefore suggest a unique perspective on this central issue, which we refer to as transparent connectionism, by proposing accounts of how symbolic computation can be implemented in neural substrates. In this study we first introduce a new model of dynamics on a symbolic space, the versatile shift, and show that it supports the real-time simulation of a range of automata. We then show that the Gödelization of versatile shifts defines nonlinear dynamical automata, dynamical systems evolving on a vector space. Finally, we present a mapping between nonlinear dynamical automata and recurrent artificial neural networks. The mapping defines an architecture characterized by its granular modularity, in which data, symbolic operations and their control are not only distinguishable in activation space but also spatially localizable in the network itself, while symbolic representations retain a distributed encoding. The resulting networks simulate automata in real time and are programmed directly, without network training. To discuss the unique characteristics of the architecture and their consequences, we present two examples: (i) the design of a central pattern generator from a finite-state locomotive controller, and (ii) the creation of a network simulating a system of interactive automata that supports the parsing of garden-path sentences, as investigated in psycholinguistic experiments. © 2016 Elsevier Ltd. All rights reserved.
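To make the Gödelization step concrete, the following minimal Python sketch encodes a dotted symbol sequence into the unit square and performs one shift step as a piecewise-affine map, in the spirit of a nonlinear dynamical automaton. This is an illustration under assumed conventions (a binary alphabet and the names encode and shift_step are ours), not code from the paper:

    # Goedelize a dotted sequence ...s_-2 s_-1 . s_1 s_2 ... as a point
    # (x, y) in the unit square, then shift the dot one symbol to the
    # right via the affine branch selected by the cell containing y.
    N = 2  # alphabet size; symbols are the integers 0 and 1

    def encode(tail):
        """Goedel-encode a one-sided symbol sequence s_1 s_2 ... as
        sum_i s_i * N**-i, a point in [0, 1)."""
        return sum(s * N ** -(i + 1) for i, s in enumerate(tail))

    def shift_step(x, y):
        """One left shift of the right tail: the affine branch is chosen
        by the symbol s_1 = floor(N * y), i.e. the partition cell of y."""
        s1 = int(N * y)  # first symbol to the right of the dot
        return (x + s1) / N, N * y - s1

    # Example: ...0 0 . 1 0 1 1 ...  ->  ...0 0 1 . 0 1 1 ...
    # (the left tail is listed from the dot outward)
    x, y = encode([0, 0]), encode([1, 0, 1, 1])
    x, y = shift_step(x, y)
    print(x, y)  # 0.5 0.375, matching the encodings of the shifted tails

Because each branch is affine and selected by a partition of the unit square, the dynamics can be realized with simple threshold-style units, which is the kind of structure the paper's mapping onto recurrent networks exploits.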
Pages: 85-105 (21 pages)
Related papers
50 records in total
  • [21] Neural networks and child language development: A simulation using a modular neural network architecture
    Abidi, SSR
    ICNN - 1996 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS. 1-4, 1996: 840-845
  • [22] Multi-objective Evolutionary Neural Architecture Search for Recurrent Neural Networks
    Booysen, Reinhard
    Bosman, Anna Sergeevna
    NEURAL PROCESSING LETTERS, 2024, 56 (04)
  • [23] An extended architecture of recurrent neural networks that latches input information
    Ster, B
    Dobnikar, A
    ARTIFICIAL NEURAL NETS AND GENETIC ALGORITHMS, PROCEEDINGS, 2003: 33-37
  • [24] Enhancement of neural representation capacity by modular architecture in networks of cortical neurons
    Levy, Ofri
    Ziv, Noam E.
    Marom, Shimon
    EUROPEAN JOURNAL OF NEUROSCIENCE, 2012, 35 (11): 1753-1760
  • [25] Dynamical complexity and computation in recurrent neural networks beyond their fixed point
    Marquez, Bicky A.
    Larger, Laurent
    Jacquot, Maxime
    Chembo, Yanne K.
    Brunner, Daniel
    SCIENTIFIC REPORTS, 2018, 8
  • [26] Spike-based computation using classical recurrent neural networks
    De Geeter, Florent
    Ernst, Damien
    Drion, Guillaume
    NEUROMORPHIC COMPUTING AND ENGINEERING, 2024, 4 (02)
  • [27] Real-time computation at the edge of chaos in recurrent neural networks
    Bertschinger, N
    Natschläger, T
    NEURAL COMPUTATION, 2004, 16 (07): 1413-1436
  • [29] A model of natural computation based on recurrent neural networks and reciprocal images
    Greer, Douglas S.
    PROCEEDINGS OF THE SECOND IASTED INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE, 2006: 200-205
  • [30] Computation in recurrent neural networks: From counters to iterated function systems
    Kalinke, Y
    Lehmann, H
    ADVANCED TOPICS IN ARTIFICIAL INTELLIGENCE, 1998, 1502: 179-190