Using Stigmergy as a Computational Memory in the Design of Recurrent Neural Networks

Cited by: 2
Authors
Galatolo, Federico A. [1 ]
Cimino, Mario G. C. A. [1 ]
Vaglini, Gigliola [1 ]
Affiliations
[1] Univ Pisa, Dept Informat Engn, I-56122 Pisa, Italy
Keywords
Artificial Neural Networks; Recurrent Neural Network; Stigmergy; Deep Learning; Supervised Learning
DOI
10.5220/0007581508300836
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, a novel Recurrent Neural Network (RNN) architecture is designed and experimentally evaluated. The proposed RNN adopts a computational memory based on the concept of stigmergy. The basic principle of a Stigmergic Memory (SM) is that the activity of depositing or removing a quantity in the SM stimulates subsequent deposit/removal activities. Accordingly, successive SM activities tend to reinforce or weaken one another, producing coherent coordination between the SM activities and the temporal input stimulus. We show that, in a supervised classification problem, the SM encodes the temporal input in an emergent representational model by coordinating the deposit, removal, and classification activities. This study lays down a basic framework for the derivation of an SM-RNN. A formal ontology of the SM is discussed, and the SM-RNN architecture is detailed. To assess the computational power of the SM-RNN, comparative NNs were selected and trained to solve the MNIST handwritten digit recognition benchmark in its two variants: spatial (sequences of bitmap rows) and temporal (sequences of pen strokes).
Pages: 830-836
Page count: 7
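
To make the deposit/removal dynamics described in the abstract concrete, here is a minimal PyTorch sketch of one plausible reading of a stigmergic memory cell. The class names (StigmergicMemoryCell, SMRNNClassifier), the sigmoid deposit/removal gates, and the clamped additive trail update are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn as nn

class StigmergicMemoryCell(nn.Module):
    """Illustrative stigmergic memory cell (assumed form, not the paper's
    exact equations). A pheromone-like trail m is the memory: the input and
    the current trail jointly decide how much to deposit and how much to
    remove, so past deposits stimulate (or inhibit) future activity."""

    def __init__(self, input_size, memory_size):
        super().__init__()
        self.deposit = nn.Linear(input_size + memory_size, memory_size)
        self.removal = nn.Linear(input_size + memory_size, memory_size)

    def forward(self, x, m):
        z = torch.cat([x, m], dim=-1)
        d = torch.sigmoid(self.deposit(z))  # amount deposited on the trail
        r = torch.sigmoid(self.removal(z))  # amount removed from the trail
        # Deposits reinforce the trail, removals weaken it; clamping keeps
        # the trail in a bounded "pheromone" range.
        return torch.clamp(m + d - r, 0.0, 1.0)

class SMRNNClassifier(nn.Module):
    """Unrolls the cell over a sequence and classifies from the final trail,
    e.g. spatial MNIST fed as 28 rows of 28 pixels."""

    def __init__(self, input_size=28, memory_size=64, num_classes=10):
        super().__init__()
        self.memory_size = memory_size
        self.cell = StigmergicMemoryCell(input_size, memory_size)
        self.readout = nn.Linear(memory_size, num_classes)

    def forward(self, seq):  # seq: (batch, steps, input_size)
        m = torch.zeros(seq.size(0), self.memory_size, device=seq.device)
        for t in range(seq.size(1)):
            m = self.cell(seq[:, t], m)
        return self.readout(m)

logits = SMRNNClassifier()(torch.rand(8, 28, 28))  # (8, 10) class scores
```

Because the classifier reads only the final trail, the learned gates must coordinate deposits and removals across the whole sequence, which mirrors the coordination property the abstract describes.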