Using Stigmergy as a Computational Memory in the Design of Recurrent Neural Networks

Cited by: 1
Authors
Galatolo, Federico A. [1 ]
Cimino, Mario G. C. A. [1 ]
Vaglini, Gigliola [1 ]
Affiliations
[1] Univ Pisa, Dept Informat Engn, I-56122 Pisa, Italy
Keywords
Artificial Neural Networks; Recurrent Neural Network; Stigmergy; Deep Learning; Supervised Learning
DOI
10.5220/0007581508300836
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, a novel Recurrent Neural Network (RNN) architecture is designed and experimentally evaluated. The proposed RNN adopts a computational memory based on the concept of stigmergy. The basic principle of a Stigmergic Memory (SM) is that the activity of depositing/removing a quantity in the SM stimulates subsequent deposit/removal activities. Accordingly, successive SM activities tend to reinforce/weaken each other, generating a coherent coordination between the SM activities and the temporal input stimulus. We show that, in a supervised classification problem, the SM encodes the temporal input in an emergent representational model by coordinating the deposit, removal, and classification activities. This study lays down a basic framework for the derivation of an SM-RNN: a formal ontology of the SM is discussed, and the SM-RNN architecture is detailed. To assess the computational power of an SM-RNN, comparative NNs have been selected and trained to solve the MNIST handwritten digit recognition benchmark in its two variants: spatial (sequences of bitmap rows) and temporal (sequences of pen strokes).
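The deposit/removal feedback described above can be illustrated with a minimal sketch: an internal "mark" quantity is raised by deposits and lowered by removals, and the current mark level modulates how strongly the next deposit/removal acts, so successive activities reinforce or weaken each other. All names, the sigmoid gain, and the evaporation rule here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class StigmergicMemory:
    """Toy stigmergic memory: a stored mark with activity feedback."""

    def __init__(self, size, evaporation=0.1):
        self.mark = np.zeros(size)      # stored quantity (the "trail")
        self.evaporation = evaporation  # passive decay per time step

    def step(self, deposit_signal, removal_signal):
        # Feedback: a higher mark amplifies both deposit and removal,
        # so consecutive activities reinforce/weaken each other.
        gain = sigmoid(self.mark)
        self.mark += gain * deposit_signal
        self.mark -= gain * removal_signal
        self.mark *= (1.0 - self.evaporation)      # evaporation/decay
        self.mark = np.clip(self.mark, 0.0, None)  # marks stay non-negative
        return self.mark

sm = StigmergicMemory(size=4)
for t in range(3):
    out = sm.step(deposit_signal=np.ones(4), removal_signal=np.zeros(4))
print(out.shape)  # (4,)
```

With repeated deposits the mark grows faster each step (the gain rises with the mark), which is the reinforcement effect the abstract attributes to stigmergic coordination.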
Pages: 830-836
Page count: 7