Writing to the Hopfield Memory via Training a Recurrent Network

Cited by: 1
Authors
Bao, Han [1 ,2 ]
Zhang, Richong [1 ,2 ]
Mao, Yongyi [3 ]
Huai, Jinpeng [1 ,2 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci & Engn, SKLSDE, Beijing, Peoples R China
[2] Beihang Univ, Beijing Adv Inst Big Data & Brain Comp, Beijing, Peoples R China
[3] Univ Ottawa, Sch Elect Engn & Comp Sci, Ottawa, ON, Canada
Funding
National Natural Science Foundation of China;
Keywords
Hopfield network; Writing protocol; Recurrent network; Neural networks; Capacity;
DOI
10.1007/978-3-030-29911-8_19
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the problem of writing on a Hopfield network. We cast it as a supervised learning problem by observing a simple link between the update equations of Hopfield networks and recurrent neural networks. We compare the resulting writing protocol with existing ones and experimentally verify its effectiveness. Our method not only recovers stored patterns from noise more reliably, but also achieves a larger capacity than existing writing protocols.
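The link the abstract refers to can be seen in the update rule itself: a synchronous Hopfield update s' = sign(W s) has the same form as a recurrent layer h' = f(W h) with f = sign, so the weights W can in principle be trained like an RN​N rather than set by a fixed rule. As an illustrative sketch (not the paper's training-based protocol), the snippet below implements the classical Hebbian writing protocol, a standard baseline, and shows noise recovery via repeated updates; all function names here are our own.

```python
import numpy as np

def hebbian_write(patterns):
    """Classical Hebbian writing: W = (1/n) * sum_p x_p x_p^T, zero diagonal.

    patterns: array of shape (num_patterns, n) with entries in {-1, +1}.
    """
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def hopfield_step(W, s):
    # One synchronous update: s' = sign(W s).
    # Structurally identical to an RNN step h' = f(W h) with f = sign,
    # which is the observation that lets W be learned by gradient descent.
    return np.where(W @ s >= 0, 1.0, -1.0)

def recall(W, s, steps=10):
    # Iterate the update until a fixed point (stored memory) is reached.
    for _ in range(steps):
        s_next = hopfield_step(W, s)
        if np.array_equal(s_next, s):
            break
        s = s_next
    return s
```

For example, after writing a few random ±1 patterns of dimension 64, flipping a handful of bits of one pattern and calling `recall` typically returns the original pattern, since the stored patterns are attractors of the update dynamics.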
Pages: 241-254
Page count: 14