Simultaneous embedding of multiple attractor manifolds in a recurrent neural network using constrained gradient optimization

Cited by: 0
Authors
Agmon, Haggai [1,2]
Burak, Yoram [1 ]
Affiliations
[1] Hebrew Univ Jerusalem, Jerusalem, Israel
[2] Stanford Univ, Stanford, CA 94305 USA
Funding
European Research Council; Israel Science Foundation
Keywords
PARAMETRIC WORKING-MEMORY; PATH-INTEGRATION; DYNAMICS; MODEL; ORIENTATION; REPRESENTATION; HIPPOCAMPUS; CELLS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The storage of continuous variables in working memory is hypothesized to be sustained in the brain by the dynamics of recurrent neural networks (RNNs) whose steady states form continuous manifolds. In some cases, it is thought that the synaptic connectivity supports multiple attractor manifolds, each mapped to a different context or task. For example, in hippocampal area CA3, positions in distinct environments are represented by distinct sets of population activity patterns, each forming a continuum. It has been argued that the embedding of multiple continuous attractors in a single RNN inevitably causes detrimental interference: quenched noise in the synaptic connectivity disrupts the continuity of each attractor, replacing it by a discrete set of steady states that can be conceptualized as lying on local minima of an abstract energy landscape. Consequently, population activity patterns exhibit systematic drifts towards one of these discrete minima, thereby degrading the stored memory over time. Here we show that it is possible to dramatically attenuate these detrimental interference effects by adjusting the synaptic weights. Synaptic weight adjustments are derived from a loss function that quantifies the roughness of the energy landscape along each of the embedded attractor manifolds. By minimizing this loss function, the stability of states can be dramatically improved, without compromising the capacity.
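The mechanism described in the abstract can be illustrated with a toy model. The sketch below is not the authors' implementation: it assumes cosine-tuned ring maps embedded by Hebbian-like weights, a Hopfield-style energy E(s) = -1/2 s^T W s evaluated on idealized activity bumps, and a roughness loss defined as the variance of that energy along each embedded manifold; the weights are then adjusted by gradient descent under simple constraints (symmetric W, zero diagonal). All parameters and functional forms are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch (not the authors' code): embed several continuous ring-attractor
# "maps" in one RNN via Hebbian-like weights, measure the roughness of a
# Hopfield-style energy along each embedded manifold, and reduce that roughness
# by constrained gradient descent on the weights.
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 200, 3, 64                      # neurons, embedded maps, bump positions sampled per map

# Each map assigns every neuron a random preferred angle on a ring.
theta = rng.uniform(0.0, 2.0 * np.pi, size=(M, N))

# Hebbian-style superposition of all maps; cross-map terms act as quenched noise.
W = sum(np.cos(theta[m][:, None] - theta[m][None, :]) for m in range(M)) / N
np.fill_diagonal(W, 0.0)

def bump(m, phi, width=0.5):
    """Idealized population activity: a bump centered at angle phi in map m (assumed shape)."""
    return np.exp((np.cos(theta[m] - phi) - 1.0) / width**2)

# Sample bump states along each embedded manifold.
phis = np.linspace(0.0, 2.0 * np.pi, K, endpoint=False)
S = np.array([[bump(m, p) for p in phis] for m in range(M)])   # shape (M, K, N)

def roughness(W):
    """Loss: variance of the energy E = -1/2 s^T W s along each manifold, summed over maps."""
    E = -0.5 * np.einsum('mki,ij,mkj->mk', S, W, S)
    return float(np.sum(np.var(E, axis=1)))

print("roughness before:", roughness(W))

lr = 1e-2
for _ in range(500):
    E = -0.5 * np.einsum('mki,ij,mkj->mk', S, W, S)             # energies of all sampled bump states
    dE = E - E.mean(axis=1, keepdims=True)                      # deviation from each map's mean energy
    # Gradient of sum_m Var_k(E_mk) w.r.t. W_ij, using dE_mk/dW_ij = -1/2 * s_i * s_j.
    grad = -np.einsum('mk,mki,mkj->ij', dE, S, S) / K
    W -= lr * grad
    W = 0.5 * (W + W.T)                                         # constraint: keep W symmetric
    np.fill_diagonal(W, 0.0)                                    # constraint: no self-connections

print("roughness after: ", roughness(W))
```

In this toy setting the gradient of the variance loss is a weighted sum of outer products of the sampled bump states, so flattening the energy along the manifolds only requires small, targeted weight adjustments; the paper's actual loss, constraints, network dynamics, and capacity analysis are more involved.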
Pages: 13
Related Papers
50 results in total
  • [1] A Nonfeasible Gradient Projection Recurrent Neural Network for Equality-Constrained Optimization Problems
    Barbarosou, Maria P.
    Maratos, Nicholas G.
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2008, 19 (10): 1665 - 1677
  • [2] Non-feasible gradient projection recurrent neural network for equality constrained optimization.
    Barbarosou, M
    Maratos, NG
    2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004, : 2251 - 2256
  • [3] Learning the trajectories of periodic attractor using recurrent neural network
    School of Electronic and Information Engineering, Dalian University of Technology, Dalian 116023, China
    Kong Zhi Li Lun Yu Ying Yong (Control Theory & Applications), 2006, (4): 497 - 502
  • [4] Modified Gradient Projection Neural Network for Multiset Constrained Optimization
    Liufu, Ying
    Jin, Long
    Li, Shuai
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2023, 19 (09) : 9413 - 9423
  • [5] Stability of simultaneous recurrent neural network dynamics for static optimization
    Serpen, G
    Xu, YF
    PROCEEDING OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 2023 - 2028
  • [6] A one-layer recurrent neural network for constrained nonconvex optimization
    Li, Guocheng
    Yan, Zheng
    Wang, Jun
    NEURAL NETWORKS, 2015, 61 : 10 - 21
  • [7] A Recurrent Neural Network Approach for Constrained Distributed Fuzzy Convex Optimization
    Liu, Jingxin
    Liao, Xiaofeng
    Dong, Jin-Song
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (07) : 9743 - 9757
  • [8] Codon Optimization Using a Recurrent Neural Network
    Goulet, Dennis R.
    Yan, Yongqi
    Agrawal, Palak
    Waight, Andrew B.
    Mak, Amanda Nga-sze
    Zhu, Yi
    JOURNAL OF COMPUTATIONAL BIOLOGY, 2023, 30 (01) : 70 - 81
  • [9] A One-Layer Recurrent Neural Network for Constrained Nonsmooth Optimization
    Liu, Qingshan
    Wang, Jun
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2011, 41 (05): 1323 - 1333
  • [10] Robust Pattern Recognition Using Chaotic Dynamics in Attractor Recurrent Neural Network
    Azarpour, M.
    Seyyedsalehi, S. A.
    Taherkhani, A.
    2010 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS IJCNN 2010, 2010,