On the equivalence of Hopfield networks and Boltzmann Machines

Cited by: 90
Authors
Barra, Adriano [1 ]
Bernacchia, Alberto [2 ]
Santucci, Enrica [3 ]
Contucci, Pierluigi [4 ]
Affiliations
[1] Univ Roma La Sapienza, Dipartimento Fis, I-00185 Rome, Italy
[2] Yale Univ, Dept Neurobiol, New Haven, CT 06510 USA
[3] Univ Aquila, Dipartimento Matemat, I-67010 L'Aquila, Italy
[4] Alma Mater Studiorum Univ Bologna, Dipartimento Matemat, I-40126 Bologna, Italy
Keywords
Statistical mechanics; Hopfield networks; Boltzmann Machines; Neural networks; Images; Memory
DOI
10.1016/j.neunet.2012.06.003
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
A specific type of neural network, the Restricted Boltzmann Machine (RBM), is used for classification and feature detection in machine learning. RBMs are characterized by separate layers of visible and hidden units, which can efficiently learn a generative model of the observed data. We study a "hybrid" version of the RBM, in which the hidden units are analog and the visible units are binary, and we show that the thermodynamics of the visible units is equivalent to that of a Hopfield network, in which the N visible units are the neurons and the P hidden units are the learned patterns. We apply the method of stochastic stability to derive the thermodynamics of the model, considering a formal extension of this technique to the case of multiple sets of stored patterns, which may serve as a benchmark for the study of correlated sets. Our results imply that simulating the dynamics of a Hopfield network, which requires updating N neurons and storing N(N - 1)/2 synapses, can be accomplished by a hybrid Boltzmann Machine, which requires updating N + P neurons but storing only NP synapses. In addition, the well-known glass transition of the Hopfield network has a counterpart in the Boltzmann Machine: it corresponds to an optimality criterion for choosing the relative sizes of the hidden and visible layers, resolving the trade-off between flexibility and generality of the model. The low-storage phase of the Hopfield model corresponds to few hidden units and hence an overly constrained RBM, while the spin-glass phase (too many hidden units) corresponds to an unconstrained RBM prone to overfitting the observed data. (C) 2012 Elsevier Ltd. All rights reserved.
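As a concrete illustration of the equivalence described in the abstract (a minimal sketch, not code from the paper): in one standard convention, a hybrid RBM with binary visible units sigma_i = +/-1, Gaussian hidden units z_mu, and couplings xi_i^mu has conditionals z_mu | sigma ~ Normal(sum_i xi_i^mu sigma_i / sqrt(N), 1/beta) and P(sigma_i = +1 | z) = sigmoid(2 beta sum_i xi_i^mu z_mu / sqrt(N)); integrating out z leaves the Hopfield Gibbs measure P(sigma) proportional to exp((beta/2N) sum_mu (sum_i xi_i^mu sigma_i)^2). The Python sketch below runs this alternating Gibbs dynamics; the sizes N = 200, P = 10, the inverse temperature beta = 2.0, and all variable names are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and inverse temperature (assumptions, not from the paper).
N, P, beta = 200, 10, 2.0
# P random binary patterns double as the P x N coupling matrix of the RBM:
# NP stored couplings instead of the N(N-1)/2 synapses of the Hopfield net.
xi = rng.choice([-1.0, 1.0], size=(P, N))

def gibbs_sweep(sigma):
    """One alternating Gibbs sweep: analog hidden units, binary visible units."""
    # Hidden given visible: z_mu ~ Normal(sum_i xi[mu,i]*sigma[i]/sqrt(N), 1/beta).
    m = xi @ sigma / np.sqrt(N)
    z = m + rng.normal(scale=1.0 / np.sqrt(beta), size=P)
    # Visible given hidden: P(sigma_i = +1 | z) = sigmoid(2*beta*h_i).
    h = xi.T @ z / np.sqrt(N)
    p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    return np.where(rng.random(N) < p_plus, 1.0, -1.0)

# Start near pattern 0; below saturation the Mattis overlap stays close to 1,
# i.e. the visible layer behaves like a Hopfield network retrieving a memory.
sigma = xi[0].copy()
for _ in range(100):
    sigma = gibbs_sweep(sigma)
print(f"overlap with pattern 0: {sigma @ xi[0] / N:.2f}")

Note how this reflects the storage comparison in the abstract: each sweep updates the N + P units and touches only the N x P matrix xi, never the N(N - 1)/2 Hopfield couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu.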
Pages: 1-9 (9 pages)
Related papers
50 in total
  • [1] Attention in a Family of Boltzmann Machines Emerging From Modern Hopfield Networks
    Ota, Toshihiro
    Karakida, Ryo
    NEURAL COMPUTATION, 2023, 35 (08) : 1463 - 1480
  • [2] Boltzmann Machines as Generalized Hopfield Networks: A Review of Recent Results and Outlooks
    Marullo, Chiara
    Agliari, Elena
    ENTROPY, 2021, 23 (01) : 1 - 16
  • [3] Parallel retrieval of correlated patterns: From Hopfield networks to Boltzmann machines
    Agliari, Elena
    Barra, Adriano
    De Antoni, Andrea
    Galluzzi, Andrea
    NEURAL NETWORKS, 2013, 38 : 52 - 63
  • [4] Phase diagram of restricted Boltzmann machines and generalized Hopfield networks with arbitrary priors
    Barra, Adriano
    Genovese, Giuseppe
    Sollich, Peter
    Tantari, Daniele
    PHYSICAL REVIEW E, 2018, 97 (02)
  • [5] Equivalence of restricted Boltzmann machines and tensor network states
    Chen, Jing
    Cheng, Song
    Xie, Haidong
    Wang, Lei
    Xiang, Tao
    PHYSICAL REVIEW B, 2018, 97 (08)
  • [6] On the effective initialisation for restricted Boltzmann machines via duality with Hopfield model
    Leonelli, Francesca Elisa
    Agliari, Elena
    Albanese, Linda
    Barra, Adriano
    NEURAL NETWORKS, 2021, 143 : 314 - 326
  • [7] Multilayer perceptrons, Hopfield’s associative memories, and restricted Boltzmann machines
    Asakawa, Shin-ichi
    BMC NEUROSCIENCE, 2014, 15 (Suppl 1)
  • [8] Asymmetric parallel Boltzmann machines are belief networks
    Neal, R. M.
    NEURAL COMPUTATION, 1992, 4 (06) : 832 - 834
  • [9] On the Working Principle of the Hopfield Neural Networks and its Equivalence to the GADIA in Optimization
    Uykan, Zekeriya
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (09) : 3294 - 3304
  • [10] Training deep Boltzmann networks with sparse Ising machines
    Niazi, Shaila
    Chowdhury, Shuvro
    Aadit, Navid Anjum
    Mohseni, Masoud
    Qin, Yao
    Camsari, Kerem Y.
    NATURE ELECTRONICS, 2024, 7 (07) : 610 - 619