Hebbian dreaming for small datasets

Cited by: 3
Authors
Agliari, Elena [1 ]
Alemanno, Francesco [4 ]
Aquaro, Miriam [1 ]
Barra, Adriano [4 ]
Durante, Fabrizio [3 ]
Kanter, Ido [2 ]
Affiliations
[1] Sapienza Univ Roma, Dept Math, Rome, Italy
[2] Bar Ilan Univ, Dept Phys, Ramat Gan, Israel
[3] Univ Salento, Dept Econ Sci, Lecce, Italy
[4] Univ Salento, Dept Math & Phys, Lecce, Italy
Keywords
Hebbian learning; Sleeping phenomena; Statistical mechanics; Hopfield model; INFORMATION-STORAGE; MEMORY; RETRIEVAL; PATTERNS; SLEEP
DOI
10.1016/j.neunet.2024.106174
CLC Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
The dreaming Hopfield model generalizes the Hebbian paradigm for neural networks: it performs on-line learning when "awake" and also accounts for off-line "sleeping" mechanisms. The latter have been shown to enhance storage in such a way that, in the long sleep-time limit, this model reaches the maximal storage capacity achievable by networks equipped with symmetric pairwise interactions. In this paper, we inspect the minimal amount of information that must be supplied to such a network to guarantee successful generalization, and we test it both on random synthetic datasets and on standard structured datasets (e.g., MNIST, Fashion-MNIST and Olivetti). By comparing these minimal information thresholds with those required by the standard (i.e., always "awake") Hopfield model, we prove that the present network can save up to ~90% of the dataset size while preserving the same performance as its standard counterpart. This suggests that sleep may play a pivotal role in explaining the gap between the large volumes of data required to train artificial neural networks and the relatively small volumes needed by their biological counterparts. Further, we prove that the model's cost function (typically used in statistical mechanics) admits a representation in terms of a standard loss function (typically used in machine learning), which allows us to analyze its emergent computational skills both theoretically and computationally: a quantitative picture of its capabilities as a function of its control parameters is achieved, and consistency between the two approaches is highlighted. The resulting network is an associative memory for pattern-recognition tasks that learns from examples on-line, generalizes correctly (in suitable regions of its control parameters) and optimizes its storage capacity by off-line sleeping: such a reduction of the training cost can be inspiring toward sustainable AI and toward situations where data are relatively sparse.
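The cost-to-loss correspondence claimed above can be made concrete with a minimal worked identity for the standard (always "awake") Hopfield cost function; the per-pattern losses $L_\mu^{\pm}$ below are illustrative notation under the usual binary setup, not necessarily the paper's exact definitions. With neurons $\sigma \in \{-1,+1\}^N$, patterns $\xi^\mu \in \{-1,+1\}^N$ and Mattis overlaps $m_\mu = N^{-1}\sum_i \xi_i^\mu \sigma_i$,

\[
H_N(\sigma\,|\,\xi) \;=\; -\frac{1}{2N}\sum_{i,j=1}^{N}\sum_{\mu=1}^{P}\xi_i^{\mu}\xi_j^{\mu}\sigma_i\sigma_j \;=\; -\frac{N}{2}\sum_{\mu=1}^{P} m_{\mu}^{2},
\qquad
L_{\mu}^{\pm} \;=\; \frac{\lVert \xi^{\mu} \mp \sigma\rVert^{2}}{4N} \;=\; \frac{1 \mp m_{\mu}}{2},
\]

and since $m_\mu^2 = 1 - 4\,L_\mu^+ L_\mu^-$,

\[
H_N(\sigma\,|\,\xi) \;=\; -\frac{NP}{2} \;+\; 2N\sum_{\mu=1}^{P} L_{\mu}^{+}L_{\mu}^{-},
\]

so minimizing the statistical-mechanics cost is equivalent to minimizing machine-learning-style quadratic losses, one per pattern and per its spin-flipped twin.

As a companion, the sketch below shows how "sleeping" modifies the Hebbian couplings, assuming the sleep-regularized kernel $J(t) = (1+t)\,\xi^{T}(\mathbb{1} + tC)^{-1}\xi/N$ from the earlier literature on dreaming networks ($t = 0$ gives the Hebbian rule, $t \to \infty$ the projector rule); all function names and the toy experiment are illustrative, not the paper's code.

```python
# Minimal sketch of a dreaming (unlearning) Hopfield network, assuming the
# sleep-regularized kernel J(t) = (1+t) * Xi^T (I + t C)^(-1) Xi / N with
# C = Xi Xi^T / N. Names and the toy experiment below are illustrative.
import numpy as np

def dreaming_coupling(xi: np.ndarray, t: float) -> np.ndarray:
    """xi: (P, N) matrix of +/-1 patterns; t: sleep time (t >= 0)."""
    P, N = xi.shape
    C = xi @ xi.T / N                       # (P, P) pattern-overlap matrix
    kernel = (1.0 + t) * np.linalg.inv(np.eye(P) + t * C)
    J = xi.T @ kernel @ xi / N              # (N, N) symmetric couplings
    np.fill_diagonal(J, 0.0)                # drop self-interactions
    return J

def retrieve(J: np.ndarray, sigma: np.ndarray, sweeps: int = 20) -> np.ndarray:
    """Zero-temperature asynchronous dynamics: sigma_i <- sign(sum_j J_ij sigma_j)."""
    sigma = sigma.copy()
    rng = np.random.default_rng(0)
    for _ in range(sweeps):
        for i in rng.permutation(sigma.size):
            sigma[i] = 1 if J[i] @ sigma >= 0 else -1
    return sigma

# Toy usage: store random patterns, corrupt one, and measure the retrieval overlap.
rng = np.random.default_rng(1)
N, P = 200, 40
xi = rng.choice([-1, 1], size=(P, N))
J = dreaming_coupling(xi, t=10.0)           # moderate sleep time
probe = xi[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])   # ~20% flipped bits
m = retrieve(J, probe) @ xi[0] / N          # Mattis overlap with the stored pattern
print(f"retrieval overlap m = {m:.3f}")
```

Increasing `t` interpolates between the plain Hebbian couplings (prone to spurious states at high load) and the projector rule, which is the mechanism behind the improved storage capacity discussed in the abstract.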
Pages: 11
Related Papers
50 records in total
  • [31] A repeat sales index robust to small datasets
    Baroni, Michel
    Barthelemy, Fabrice
    Mokrane, Mahdi
    JOURNAL OF PROPERTY INVESTMENT & FINANCE, 2011, 29 (01) : 35 - +
  • [32] Swin MAE: Masked autoencoders for small datasets
    Xu, Zi'an
    Dai, Yin
    Liu, Fayu
    Chen, Weibing
    Liu, Yue
    Shi, Lifu
    Liu, Sheng
    Zhou, Yuhang
    COMPUTERS IN BIOLOGY AND MEDICINE, 2023, 161
  • [33] Deep Knowledge Tracing on Skills with Small Datasets
    Tato, Ange
    Nkambou, Roger
    INTELLIGENT TUTORING SYSTEMS, ITS 2022, 2022, 13284 : 123 - 135
  • [34] A machine learning approach for corrosion small datasets
    Sutojo, Totok
    Rustad, Supriadi
    Akrom, Muhamad
    Syukur, Abdul
    Shidik, Guruh Fajar
    Dipojono, Hermawan Kresno
    NPJ MATERIALS DEGRADATION, 2023, 7 (01)
  • [35] Image Classification With Small Datasets: Overview and Benchmark
    Brigato, Lorenzo
    Barz, Bjoern
    Iocchi, Luca
    Denzler, Joachim
    IEEE ACCESS, 2022, 10 : 49233 - 49250
  • [36] PaSTeL: Parallel Runtime and Algorithms for Small Datasets
    Videau, Brice
    Saule, Erik
    Mehaut, Jean-Francois
    CISIS: 2009 INTERNATIONAL CONFERENCE ON COMPLEX, INTELLIGENT AND SOFTWARE INTENSIVE SYSTEMS, VOLS 1 AND 2, 2009, : 651 - +
  • [37] Transcriptional and epigenetic regulation of Hebbian and non-Hebbian plasticity
    Guzman-Karlsson, Mikael C.
    Meadows, Jarrod P.
    Gavin, Cristin F.
    Hablitz, John J.
    Sweatt, J. David
    NEUROPHARMACOLOGY, 2014, 80 : 3 - 17
  • [38] APLYSIA - HEBBIAN OR NOT
    FREGNAC, Y
    TRENDS IN NEUROSCIENCES, 1986, 9 (09) : 410 - 411
  • [39] Comparing methods of analysing datasets with small clusters: case studies using four paediatric datasets
    Marston, Louise
    Peacock, Janet L.
    Yu, Keming
    Brocklehurst, Peter
    Calvert, Sandra A.
    Greenough, Anne
    Marlow, Neil
    PAEDIATRIC AND PERINATAL EPIDEMIOLOGY, 2009, 23 (04) : 380 - 392
  • [40] Hebbian and anti-Hebbian learning for independent component analysis
    Meyer-Bäse, A
    Chen, YM
    McCullough, S
    IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2001, : 920 - 925