Dropout in Neural Networks Simulates the Paradoxical Effects of Deep Brain Stimulation on Memory

Cited by: 10
Authors
Tan, Shawn Zheng Kai [1 ]
Du, Richard [2 ]
Perucho, Jose Angelo Udal [2 ]
Chopra, Shauhrat S. [3 ]
Vardhanabhuti, Varut [2 ]
Lim, Lee Wei [1 ]
Affiliations
[1] Univ Hong Kong, Sch Biomed Sci, Li Ka Shing Fac Med, Neuromodulat Lab, Hong Kong, Peoples R China
[2] Univ Hong Kong, Li Ka Shing Fac Med, Dept Diagnost Radiol, Hong Kong, Peoples R China
[3] City Univ Hong Kong, Sch Energy & Environm, Hong Kong, Peoples R China
Source
FRONTIERS IN AGING NEUROSCIENCE, 2020, Vol. 12
Keywords
neuromodulation; deep brain stimulation; memory; neural network; dropout; ANTERIOR THALAMUS; RECEPTIVE FIELDS; PLASTICITY;
DOI
10.3389/fnagi.2020.00273
Chinese Library Classification (CLC): R592 [Geriatrics]; C [Social Sciences, General]
Subject classification codes: 03; 0303; 100203
Abstract
Neuromodulation techniques such as deep brain stimulation (DBS) are a promising treatment for memory-related disorders including anxiety, addiction, and dementia. However, the outcomes of such treatments appear to be somewhat paradoxical, in that these techniques can both disrupt and enhance memory even when applied to the same brain target. In this article, we hypothesize that both the disruption and the enhancement of memory through neuromodulation can be explained by the dropout of engram nodes. We used a convolutional neural network (CNN) to classify handwritten digits and letters and applied dropout at different stages to simulate DBS effects on engrams. We showed that dropout applied during training improved the accuracy of prediction, whereas dropout applied during testing dramatically decreased the accuracy of prediction, mimicking the enhancement and disruption of memory, respectively. We further showed that transfer learning of neural networks with dropout increased the accuracy and rate of learning. Dropout during training provided a more robust "skeleton" network and, together with transfer learning, mimicked the effects of chronic DBS on memory. Overall, we showed that the dropout of engram nodes is a possible mechanism by which neuromodulation techniques such as DBS can both disrupt and enhance memory, providing a unique perspective on this paradox.
Pages: 7
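The abstract contrasts two dropout regimes: dropout during training (regularization, mimicking memory enhancement) and dropout forced on at test time (mimicking memory disruption). Below is a minimal sketch of how these two regimes can be set up in PyTorch; it is not the authors' code, and the SmallCNN architecture, dropout rate, layer sizes, and 28x28 digit-style inputs are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's model) of dropout
# applied during training versus dropout re-enabled at inference time.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, n_classes: int = 10, p_drop: float = 0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 14x14 -> 7x7
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p=p_drop),                 # randomly silence "engram" units
            nn.Linear(32 * 7 * 7, 128), nn.ReLU(),
            nn.Dropout(p=p_drop),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SmallCNN()

# Regime 1 ("enhancement"): dropout is active only in training mode.
model.train()          # nn.Dropout layers drop units during weight updates
# ... training loop on handwritten digits/letters would go here ...

# Regime 2 ("disruption"): keep dropping units at test time.
model.eval()                               # inference mode for the network
for m in model.modules():
    if isinstance(m, nn.Dropout):
        m.train()                          # re-enable dropout at inference

with torch.no_grad():
    x = torch.randn(8, 1, 28, 28)          # stand-in batch of digit images
    logits = model(x)                      # outputs degrade as units are dropped
```

Toggling only the nn.Dropout submodules back to training mode after model.eval() is one simple way to keep unit dropout active at inference while leaving the rest of the network in evaluation behavior.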