Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks

Cited: 20
Authors
Gosti, Giorgio [1 ]
Folli, Viola [1 ]
Leonetti, Marco [1 ,2 ]
Ruocco, Giancarlo [1 ,3 ]
Affiliations
[1] Ist Italiano Tecnol, Ctr Life Nanosci, Viale Regina Elena 291, I-00161 Rome, Italy
[2] Univ Salento, NANOTEC, Inst Nanotechnol, CNR, Campus Ecotekne,Via Monteroni, I-73100 Lecce, Italy
[3] Sapienza Univ Rome, Dept Phys, Piazzale Aldo Moro 5, I-00185 Rome, Italy
Keywords
recurrent neural networks; Hopfield neural networks; pattern storage; SPIN-GLASS; MODELS;
DOI
10.3390/e21080726
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost never included in either artificial or biological neural network models. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows that, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound of 0.14N that holds for networks without autapses. More precisely, as the number of stored patterns increases well beyond the 0.14N threshold, with P much greater than N, the retrieval error asymptotically approaches a value below unity. This reduction of the retrieval error allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, subsequent results showed that, in the thermodynamic limit, the basin of attraction of each stored memory in this high-storage regime shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single bit error in the initial pattern would lead the system to a stationary state associated with a different memory, which limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy forms what we call an absorbing neighborhood of states surrounding each stored memory: a set of states within a given Hamming distance of a network state, which is absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set.
We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood whose size grows exponentially.
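To make the abstract's setting concrete, the following minimal NumPy sketch builds a Hebbian Hopfield network and keeps the diagonal of the weight matrix, so autapses (self-connections) are present. The network size N, pattern count P, and the synchronous sign-update rule are illustrative choices for the classical low-load regime, not the authors' high-storage construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: the loading P/N = 0.05 is well below the classical
# 0.14N bound, so exact retrieval should succeed here.
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weight matrix. Keeping the diagonal retains the autapses
# discussed in the abstract; the conventional Hopfield model zeroes it out.
W = patterns.T @ patterns / N

def update(state, W, max_steps=50):
    """Iterate synchronous sign-function dynamics until a fixed point."""
    for _ in range(max_steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break zero ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# At low load, every stored pattern should be a fixed point of the dynamics.
errors = sum(int(not np.array_equal(update(p.copy(), W), p)) for p in patterns)
print("patterns not retrieved exactly:", errors)

# A one-bit corruption lies inside the pattern's basin of attraction here,
# i.e. within an absorbing neighborhood of Hamming radius at least 1.
probe = patterns[0].copy()
probe[0] *= -1
recovered = update(probe, W)
print("1-bit error corrected:", np.array_equal(recovered, patterns[0]))
```

In the high-storage regime studied in the paper (P much greater than N), the stored patterns can remain stable under such dynamics with autapses, but, as the abstract notes, their basins of attraction shrink to single states unless the proposed absorbing-neighborhood construction is used.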
Pages: 12
Related Papers (50 records in total)
  • [1] Storage capacity of rotor Hopfield neural networks
    Kobayashi, Masaki
    NEUROCOMPUTING, 2018, 316 : 30 - 33
  • [2] Storage capacity of hyperbolic Hopfield neural networks
    Kobayashi, Masaki
    NEUROCOMPUTING, 2019, 369 : 185 - 190
  • [3] MAXIMUM STORAGE CAPACITY IN NEURAL NETWORKS
    GARDNER, E
    EUROPHYSICS LETTERS, 1987, 4 (04) : 481 - 485
  • [4] Optimal storage capacity of quantum Hopfield neural networks
    Boedeker, Lukas
    Fiorelli, Eliana
    Mueller, Markus
    PHYSICAL REVIEW RESEARCH, 2023, 5 (02)
  • [5] Critical capacity of Hopfield neural networks with optimal storage
    Guo, Donghui
    Chen, Zhenxiang
    Liu, Ruitang
    Wu, Boxi
    Zheng, Liming
    Zheng, Lilong
    Jisuanji Xuebao/Chinese Journal of Computers, 1997, 20 (01) : 77 - 81
  • [6] On the Maximum Storage Capacity of the Hopfield Model
    Folli, Viola
    Leonetti, Marco
    Ruocco, Giancarlo
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2017, 10
  • [7] Storage capacity of the Hopfield neural network
    Zheng, JC
    Chen, JY
    Shuai, JW
    Cai, SH
    Wang, RZ
    PHYSICA A, 1997, 246 (3-4) : 313 - 319
  • [8] Quantum Hopfield Neural Networks: A New Approach and Its Storage Capacity
    Meinhardt, Nicholas
    Neumann, Niels M. P.
    Phillipson, Frank
    COMPUTATIONAL SCIENCE - ICCS 2020, PT VI, 2020, 12142 : 576 - 590
  • [9] Study on the capacity of Hopfield neural networks
    School of Electronics and Information Engineering, Harbin Institute of Technology, Harbin, China
    Inf. Technol. J., 2008, 4 : 684 - 688
  • [10] Estimate of the Storage Capacity of q-Correlated Patterns in Hopfield Neural Networks
    Wedemann, Roseli S.
    Plastino, Angel R.
    Tsallis, Constantino
    Curado, Evaldo M. F.
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT IV, 2024, 15019 : 137 - 150