Noise Quality and Super-Turing Computation in Recurrent Neural Networks

Cited by: 0
Authors
Redd, Emmett [1 ]
Obafemi-Ajayi, Tayo [1 ]
Affiliations
[1] Missouri State Univ, Springfield, MO 65897 USA
Keywords
Super-Turing; Recurrent neural networks; Chaos; Pseudo-random noise
DOI
10.1007/978-3-030-86380-7_38
CLC Classification
TP18 [Artificial intelligence theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Noise and stochasticity can benefit the performance of neural networks. Recent studies show that optimized-magnitude, noise-enhanced digital recurrent neural networks are consistent with super-Turing operation, regardless of whether the noise was implemented with true random or sufficiently long pseudo-random number time series. This paper extends that prior work by examining how shortened, repeating pseudo-noise sequences degrade super-Turing operation. Shortening the repeat length of the noise produced fewer chaotic time series, as measured by autocorrelation-detected repetitions in the output. Similar rates of chaos inhibition under shortened noise repeat lengths hint at an unknown, underlying commonality in noise-induced chaos across different maps, noise magnitudes, and pseudo-noise functions. Repeat lengths in the chaos-failed outputs were predominantly integer multiples of the noise repeat lengths. Noise repeat lengths even marginally shorter than the output sequences cause the noise-enhanced digital recurrent neural networks to repeat and thereby fail to remain consistent with chaos and super-Turing computation. This implies that a noise sequence used to improve neural network operation should be at least as long as any sequence the network produces.
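The abstract's repetition test can be illustrated with a small sketch: an autocorrelation scan that estimates the repeat length of a time series, in the spirit of the chaos-failure detection described above. The function name, threshold, and test signal below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def detect_repeat_length(x, min_lag=1):
    """Estimate a series' repeat length as the lag of the strongest
    autocorrelation peak. A crude analogue of the repetition test
    described in the abstract; names and thresholds are illustrative."""
    y = np.asarray(x, dtype=float)
    y = y - y.mean()
    n = len(y)
    ac = np.correlate(y, y, mode="full")[n - 1:]  # keep lags 0..n-1
    ac = ac / ac[0]                               # normalize: ac[0] == 1
    lag = min_lag + int(np.argmax(ac[min_lag:]))
    return lag, ac[lag]

# A noise-like series that repeats every 7 samples (a hypothetical
# stand-in for a short-repeat pseudo-noise sequence):
rng = np.random.default_rng(0)
series = np.tile(rng.standard_normal(7), 20)
lag, strength = detect_repeat_length(series)
print(lag, strength)
```

On an exactly periodic series the detected lag matches the repeat length, and a strong peak at an integer multiple of the noise repeat length is the signature of a chaos-failed output discussed in the abstract.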
Pages: 469-478
Page count: 10
Related Papers
50 records in total
  • [1] Noise optimizes super-Turing computation in recurrent neural networks
    Redd, Emmett
    Senger, Steven
    Obafemi-Ajayi, Tayo
    PHYSICAL REVIEW RESEARCH, 2021, 3 (01):
  • [2] Recurrent Neural Networks and Super-Turing Interactive Computation
    Cabessa, Jeremie
    Villa, Alessandro E. P.
    ARTIFICIAL NEURAL NETWORKS, 2015, : 1 - 29
  • [3] Evolving Recurrent Neural Networks are Super-Turing
    Cabessa, Jeremie
    Siegelmann, Hava T.
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 3200 - 3206
  • [4] On Super-Turing Neural Computation
    Cabessa, Jeremie
    Villa, Alessandro E. P.
    ADVANCES IN COGNITIVE NEURODYNAMICS (IV), 2015, : 307 - 312
  • [5] INTERACTIVE EVOLVING RECURRENT NEURAL NETWORKS ARE SUPER-TURING
    Cabessa, Jeremie
    ICAART: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE, VOL 1, 2012, : 328 - 333
  • [6] THE SUPER-TURING COMPUTATIONAL POWER OF PLASTIC RECURRENT NEURAL NETWORKS
    Cabessa, Jeremie
    Siegelmann, Hava T.
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2014, 24 (08)
  • [7] The Super-Turing Computational Power of Interactive Evolving Recurrent Neural Networks
    Cabessa, Jeremie
    Villa, Alessandro E. P.
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2013, 2013, 8131 : 58 - 65
  • [8] Neural and super-Turing computing
    Siegelmann, HT
    MINDS AND MACHINES, 2003, 13 (01) : 103 - 114
  • [9] Super-Turing or Non-Turing? Extending the Concept of Computation
    MacLennan, Bruce J.
    INTERNATIONAL JOURNAL OF UNCONVENTIONAL COMPUTING, 2009, 5 (3-4) : 369 - 387