Hebbian dreaming for small datasets

Cited by: 3
Authors
Agliari, Elena [1 ]
Alemanno, Francesco [4 ]
Aquaro, Miriam [1 ]
Barra, Adriano [4 ]
Durante, Fabrizio [3 ]
Kanter, Ido [2 ]
Affiliations
[1] Sapienza Univ Roma, Dept Math, Rome, Italy
[2] Bar Ilan Univ, Dept Phys, Ramat Gan, Israel
[3] Univ Salento, Dept Econ Sci, Lecce, Italy
[4] Univ Salento, Dept Math & Phys, Lecce, Italy
Keywords
Hebbian learning; Sleeping phenomena; Statistical mechanics; Hopfield model; INFORMATION-STORAGE; MEMORY; RETRIEVAL; PATTERNS; SLEEP;
DOI
10.1016/j.neunet.2024.106174
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
The dreaming Hopfield model constitutes a generalization of the Hebbian paradigm for neural networks that is able to perform on-line learning when "awake" and also to account for off-line "sleeping" mechanisms. The latter have been shown to enhance storage in such a way that, in the long sleep-time limit, this model can reach the maximal storage capacity achievable by networks equipped with symmetric pairwise interactions. In this paper, we inspect the minimal amount of information that must be supplied to such a network to guarantee successful generalization, and we test it both on random synthetic and on standard structured datasets (e.g., MNIST, Fashion-MNIST and Olivetti). By comparing these minimal information thresholds with those required by the standard (i.e., always "awake") Hopfield model, we prove that the present network can save up to ~90% of the dataset size while preserving the same performance as its standard counterpart. This suggests that sleep may play a pivotal role in explaining the gap between the large volumes of data required to train artificial neural networks and the relatively small volumes needed by their biological counterparts. Further, we prove that the model's cost function (typically used in statistical mechanics) admits a representation in terms of a standard loss function (typically used in machine learning), which allows us to analyze its emergent computational skills both theoretically and computationally: a quantitative picture of its capabilities as a function of its control parameters is achieved, and consistency between the two approaches is highlighted.
The resulting network is an associative memory for pattern-recognition tasks that learns from examples on-line, generalizes correctly (in suitable regions of its control parameters) and optimizes its storage capacity by off-line sleeping: such a reduction of the training cost can be inspiring toward sustainable AI and in situations where data are relatively sparse.
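As a concrete illustration of the "sleeping" mechanism described above, the following is a minimal NumPy sketch of the dreaming coupling matrix, J = ((1+t)/N) ξᵀ(I + tC)⁻¹ξ with pattern correlation matrix C = ξξᵀ/N, which interpolates between the Hebbian rule (t = 0) and the pseudo-inverse rule (t → ∞). The toy sizes, seed, and noise level are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 20   # neurons, stored patterns (load alpha = P/N = 0.1)
t = 100.0        # "sleep time"; t -> infinity recovers the pseudo-inverse rule

# random binary patterns, shape (P, N)
xi = rng.choice([-1.0, 1.0], size=(P, N))

# pattern correlation matrix C_{mu,nu} = (1/N) xi_mu . xi_nu
C = xi @ xi.T / N

# dreaming coupling matrix: J = ((1+t)/N) xi^T (I + t C)^{-1} xi
K = (1.0 + t) * np.linalg.inv(np.eye(P) + t * C)
J = xi.T @ K @ xi / N
np.fill_diagonal(J, 0.0)  # no self-interactions

# noisy cue of pattern 0: flip 15% of the spins
state = xi[0].copy()
state[rng.random(N) < 0.15] *= -1

# zero-temperature asynchronous dynamics (energy descent for symmetric J)
for _ in range(10):
    for i in rng.permutation(N):
        state[i] = 1.0 if J[i] @ state >= 0 else -1.0

# Mattis overlap with the cued pattern; close to 1 means successful retrieval
overlap = state @ xi[0] / N
print(f"overlap with stored pattern: {overlap:.2f}")
```

Setting t = 0 reduces J to the plain Hebbian matrix ξᵀξ/N, so the same script can be used to compare retrieval quality of the "awake" and "slept" networks at a given load.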
Pages: 11