Using Stigmergy as a Computational Memory in the Design of Recurrent Neural Networks

Cited by: 2
Authors
Galatolo, Federico A. [1 ]
Cimino, Mario G. C. A. [1 ]
Vaglini, Gigliola [1 ]
Affiliations
[1] Univ Pisa, Dept Informat Engn, I-56122 Pisa, Italy
Keywords
Artificial Neural Networks; Recurrent Neural Network; Stigmergy; Deep Learning; Supervised Learning;
DOI
10.5220/0007581508300836
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this paper, a novel Recurrent Neural Network (RNN) architecture is designed and experimentally evaluated. The proposed RNN adopts a computational memory based on the concept of stigmergy. The basic principle of a Stigmergic Memory (SM) is that each deposit/removal of a quantity in the SM stimulates subsequent deposit/removal activities. Accordingly, successive SM activities tend to reinforce or weaken one another, producing coherent coordination between the SM activities and the temporal input stimulus. We show that, in a supervised classification problem, the SM encodes the temporal input in an emergent representational model by coordinating the deposit, removal, and classification activities. This study lays down a basic framework for the derivation of an SM-RNN: a formal ontology of the SM is discussed, and the SM-RNN architecture is detailed. To assess the computational power of the SM-RNN, comparative NNs were selected and trained on the MNIST handwritten digit recognition benchmark in its two variants: spatial (sequences of bitmap rows) and temporal (sequences of pen strokes).
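The deposit/removal mechanism described in the abstract can be illustrated with a minimal PyTorch sketch. This is a hypothetical interpretation of an SM, not the authors' implementation: a pheromone-like trail m is updated at each time step by deposit and removal (evaporation) amounts that depend on both the current input and the trail itself, so earlier activity stimulates later activity, and the final trail is read out for classification. All names here (StigmergicMemoryCell, SMRNN, memory_size, and the gating choices) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class StigmergicMemoryCell(nn.Module):
    """Hypothetical stigmergic memory cell (an assumption, not the paper's
    exact formulation): a trail m is updated by deposit and removal amounts
    computed from the current input and the trail itself, so that past
    deposit/removal activity stimulates subsequent activity."""

    def __init__(self, input_size: int, memory_size: int):
        super().__init__()
        # Gates producing deposit and removal amounts from input + trail.
        self.deposit = nn.Linear(input_size + memory_size, memory_size)
        self.removal = nn.Linear(input_size + memory_size, memory_size)

    def forward(self, x_t: torch.Tensor, m: torch.Tensor) -> torch.Tensor:
        z = torch.cat([x_t, m], dim=-1)
        d = torch.sigmoid(self.deposit(z))  # deposit amount in [0, 1]
        r = torch.sigmoid(self.removal(z))  # removal (evaporation) rate in [0, 1]
        return (1.0 - r) * m + d            # trail update: decay, then deposit


class SMRNN(nn.Module):
    """Sequence classifier built around the stigmergic memory cell."""

    def __init__(self, input_size: int, memory_size: int, num_classes: int):
        super().__init__()
        self.cell = StigmergicMemoryCell(input_size, memory_size)
        self.readout = nn.Linear(memory_size, num_classes)
        self.memory_size = memory_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, input_size); the trail starts empty.
        m = x.new_zeros(x.size(0), self.memory_size)
        for t in range(x.size(1)):
            m = self.cell(x[:, t, :], m)
        return self.readout(m)  # class logits from the final trail


# Toy usage mirroring the spatial MNIST variant: each digit fed as a
# sequence of 28 bitmap rows of 28 pixels each.
model = SMRNN(input_size=28, memory_size=64, num_classes=10)
rows = torch.randn(8, 28, 28)  # batch of 8 random "digits" as row sequences
logits = model(rows)           # shape: (8, 10)
```

The temporal (pen-stroke) variant would use the same loop with stroke coordinates as x_t; only input_size and the data pipeline change.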
Pages: 830 - 836 (7 pages)
Related papers (50 in total)
  • [31] Experimental Evaluation of Memory Capacity of Recurrent Neural Networks
    Kolesau, Aliaksei
    Sesok, Dmitrij
    Goranin, Nikolaj
    Rybokas, Mindaugas
    BALTIC JOURNAL OF MODERN COMPUTING, 2019, 7 (01): 138 - 150
  • [32] Memory Analysis for Memristors and Memristive Recurrent Neural Networks
    Bao, Gang
    Zhang, Yide
    Zeng, Zhigang
    IEEE/CAA JOURNAL OF AUTOMATICA SINICA, 2020, 7 (01): 96 - 105
  • [33] Memory in linear recurrent neural networks in continuous time
    Hermans, Michiel
    Schrauwen, Benjamin
    NEURAL NETWORKS, 2010, 23 (03) : 341 - 355
  • [34] Encoding-based memory for recurrent neural networks
    Carta, Antonio
    Sperduti, Alessandro
    Bacciu, Davide
    NEUROCOMPUTING, 2021, 456: 407 - 420
  • [36] State-Frequency Memory Recurrent Neural Networks
    Hu, Hao
    Qi, Guo-Jun
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017
  • [37] Associative memory by recurrent neural networks with delay elements
    Miyoshi, S
    Yanai, HF
    Okada, M
    ICONIP'02: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING: COMPUTATIONAL INTELLIGENCE FOR THE E-AGE, 2002: 70 - 74
  • [38] Associative memory by recurrent neural networks with delay elements
    Miyoshi, S
    Yanai, HF
    Okada, M
    NEURAL NETWORKS, 2004, 17 (01) : 55 - 63
  • [39] Neural Mechanisms of Working Memory Accuracy Revealed by Recurrent Neural Networks
    Xie, Yuanqi
    Liu, Yichen Henry
    Constantinidis, Christos
    Zhou, Xin
    FRONTIERS IN SYSTEMS NEUROSCIENCE, 2022, 16
  • [40] Design of neural networks for solving computational problems
    ElBakry, HM
    AboElsoud, MA
    Soliman, HH
    ElMikati, HA
    THIRTEENTH NATIONAL RADIO SCIENCE CONFERENCE - NRSC'96, 1996: 281 - 288