Neural inhibition for continual learning and memory

Cited by: 22
Authors
Barron, Helen C. [1 ,2 ]
Affiliations
[1] Univ Oxford, Med Res Council, Brain Network Dynam Unit, Nuffield Dept Clin Neurosci, Mansfield Rd, Oxford OX1 3TH, England
[2] Univ Oxford, John Radcliffe Hosp, Wellcome Ctr Integrat Neuroimaging, FMRIB, Oxford OX3 9DU, England
Funding
UK Medical Research Council; Wellcome Trust; UK Biotechnology and Biological Sciences Research Council
Keywords
DIRECT-CURRENT STIMULATION; PRIMARY MOTOR CORTEX; GABA CONCENTRATION; DISINHIBITORY MICROCIRCUIT; GAMMA-OSCILLATIONS; PLASTICITY; INTERNEURONS; EXCITABILITY; MODULATION; HIPPOCAMPAL;
DOI
10.1016/j.conb.2020.09.007
Chinese Library Classification
Q189 [Neuroscience]
Discipline code
071006
Abstract
Humans are able to continually learn new information and acquire skills that meet the demands of an ever-changing environment. Yet, this new learning does not necessarily occur at the expense of old memories. The specialised biological mechanisms that permit continual learning in humans and other mammals are not fully understood. Here I explore the possibility that neural inhibition plays an important role. I present recent findings from studies in humans that suggest inhibition regulates the stability of neural networks to gate cortical plasticity and memory retrieval. These studies use non-invasive methods to obtain an indirect measure of neural inhibition and corroborate comparable findings in animals. Together these studies reveal a model whereby neural inhibition protects memories from interference to permit continual learning. Neural inhibition may, therefore, play a critical role in the computations that underlie higher-order cognition and adaptive behaviour.
Pages: 85-94 (10 pages)
Related papers
50 records in total
  • [1] Park, Gyeong-Moon; Yoo, Sahng-Min; Kim, Jong-Hwan. Convolutional Neural Network With Developmental Memory for Continual Learning. IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(6): 2691-2705.
  • [2] Chen, Xi; Papadimitriou, Christos; Peng, Binghui. Memory Bounds for Continual Learning. 2022 IEEE 63rd Annual Symposium on Foundations of Computer Science (FOCS), 2022: 519-530.
  • [3] Nie, Xing; Xu, Shixiong; Liu, Xiyan; Meng, Gaofeng; Huo, Chunlei; Xiang, Shiming. Bilateral Memory Consolidation for Continual Learning. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 16026-16035.
  • [4] Xiao, Zhe; Du, Zhekai; Wang, Ruijin; Gan, Ruimeng; Li, Jingjing. Online continual learning with declarative memory. Neural Networks, 2023, 163: 146-155.
  • [5] Ermis, Beyza; Zappella, Giovanni; Wistuba, Martin; Rawal, Aditya; Archambeau, Cedric. Memory Efficient Continual Learning with Transformers. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
  • [6] Lopez-Paz, David; Ranzato, Marc'Aurelio. Gradient Episodic Memory for Continual Learning. Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017.
  • [7] Wiewel, Felix; Yang, Bin. Condensed Composite Memory Continual Learning. 2021 International Joint Conference on Neural Networks (IJCNN), 2021.
  • [8] Xu, Guixun; Guo, Wenhui; Wang, Yanjiang. Memory Enhanced Replay for Continual Learning. 2022 16th IEEE International Conference on Signal Processing (ICSP 2022), Vol. 1, 2022: 218-222.
  • [9] Awasthi, Abhijeet; Sarawagi, Sunita. Continual Learning with Neural Networks: A Review. Proceedings of the 6th ACM IKDD CoDS and 24th COMAD, 2019: 362-365.
  • [10] Bang, Jihwan; Kim, Heesu; Yoo, YoungJoon; Ha, Jung-Woo; Choi, Jonghyun. Rainbow Memory: Continual Learning with a Memory of Diverse Samples. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2021: 8214-8223.