Large-Scale Simulations of Plastic Neural Networks on Neuromorphic Hardware

Cited: 14
Authors
Knight, James C. [1 ]
Tully, Philip J. [2 ,3 ,4 ]
Kaplan, Bernhard A. [5 ]
Lansner, Anders [2 ,3 ,6 ]
Furber, Steve B. [1 ]
Affiliations
[1] Univ Manchester, Sch Comp Sci, Adv Processor Technol Grp, Manchester, Lancs, England
[2] Royal Inst Technol, Dept Computat Biol, Stockholm, Sweden
[3] Karolinska Inst, Stockholm Brain Inst, Stockholm, Sweden
[4] Univ Edinburgh, Sch Informat, Inst Adapt & Neural Computat, Edinburgh, Midlothian, Scotland
[5] Zuse Inst Berlin, Dept Visualizat & Data Anal, Berlin, Germany
[6] Stockholm Univ, Dept Numer Anal & Comp Sci, S-10691 Stockholm, Sweden
Source
FRONTIERS IN NEUROANATOMY | 2016, Vol. 10
Funding
European Research Council; UK Engineering and Physical Sciences Research Council; EU Seventh Framework Programme;
Keywords
SpiNNaker; learning; plasticity; digital neuromorphic hardware; Bayesian confidence propagation neural network (BCPNN); event-driven simulation; fixed-point accuracy; TIMING-DEPENDENT PLASTICITY; VISUAL-CORTEX; MOTOR CORTEX; ATTRACTOR DYNAMICS; SENSORY PREDICTION; PYRAMIDAL NEURONS; CORTICAL ACTIVITY; FIRE MODEL; CELL-TYPE; NEOCORTEX;
DOI
10.3389/fnana.2016.00037
Chinese Library Classification
R602 [Surgical Pathology, Anatomy]; R32 [Human Morphology];
Discipline Code
100101 ;
Abstract
SpiNNaker is a digital, neuromorphic architecture designed for simulating large-scale spiking neural networks at speeds close to biological real-time. Rather than using bespoke analog or digital hardware, the basic computational unit of a SpiNNaker system is a general-purpose ARM processor, allowing it to be programmed to simulate a wide variety of neuron and synapse models. This flexibility is particularly valuable in the study of biological plasticity phenomena. A recently proposed learning rule based on the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm offers a generic framework for modeling the interaction of different plasticity mechanisms using spiking neurons. However, it can be computationally expensive to simulate large networks with BCPNN learning since it requires multiple state variables for each synapse, each of which needs to be updated every simulation time-step. We discuss the trade-offs in efficiency and accuracy involved in developing an event-based BCPNN implementation for SpiNNaker based on an analytical solution to the BCPNN equations, and detail the steps taken to fit this within the limited computational and memory resources of the SpiNNaker architecture. We demonstrate this learning rule by learning temporal sequences of neural activity within a recurrent attractor network which we simulate at scales of up to 2.0 × 10^4 neurons and 5.1 × 10^7 plastic synapses: the largest plastic neural network ever to be simulated on neuromorphic hardware. We also run a comparable simulation on a Cray XC-30 supercomputer system and find that, to match the run-time of our SpiNNaker simulation, the supercomputer uses approximately 45× more power. This suggests that cheaper, more power-efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models.
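The event-based strategy described in the abstract, evaluating synaptic state from a closed-form solution only when spikes occur rather than updating it every time-step, can be illustrated with a minimal sketch. This is a hedged illustration assuming simple first-order (exponential) trace dynamics; the class, method, and parameter names are hypothetical and are not taken from the paper's actual SpiNNaker implementation, which additionally uses fixed-point arithmetic and the full BCPNN trace cascade.

```python
import math

class EventDrivenTrace:
    """Low-pass-filtered spike trace, updated lazily at event times.

    Instead of decaying the trace on every simulation time-step,
    the closed-form solution x(t) = x(t0) * exp(-(t - t0) / tau)
    is applied only when the trace is actually read or a spike arrives.
    """

    def __init__(self, tau):
        self.tau = tau        # decay time constant (ms)
        self.value = 0.0      # trace value at time self.last_t
        self.last_t = 0.0     # time of the last update (ms)

    def evaluate(self, t):
        """Advance the trace analytically from last_t to t and return it."""
        self.value *= math.exp(-(t - self.last_t) / self.tau)
        self.last_t = t
        return self.value

    def add_spike(self, t, amount=1.0):
        """Decay up to the spike time, then increment the trace."""
        self.evaluate(t)
        self.value += amount
```

The trade-off the paper discusses follows directly from this pattern: per-step cost disappears, but each event requires an exponential evaluation, which on resource-constrained hardware is typically approximated with fixed-point lookup tables rather than `math.exp`.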
Pages: 17
Related Articles
50 records
  • [1] Large-Scale Spiking Neural Networks using Neuromorphic Hardware Compatible Models
    Krichmar, Jeffrey L.
    Coussy, Philippe
    Dutt, Nikil
    [J]. ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS, 2015, 11 (04)
  • [2] A Wafer-Scale Neuromorphic Hardware System for Large-Scale Neural Modeling
    Schemmel, Johannes
    Bruederle, Daniel
    Gruebl, Andreas
    Hock, Matthias
    Meier, Karlheinz
    Millner, Sebastian
    [J]. 2010 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, 2010, : 1947 - 1950
  • [3] Neuromorphic Array Communications Controller to Support Large-Scale Neural Networks
    Young, Aaron R.
    Dean, Mark E.
    Plank, James S.
    Rose, Garrett S.
    Schuman, Catherine D.
    [J]. 2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018, : 123 - 130
  • [4] Neuromorphic silicon neurons and large-scale neural networks: challenges and opportunities
    Poon, Chi-Sang
    Zhou, Kuan
    [J]. FRONTIERS IN NEUROSCIENCE, 2011, 5
  • [5] A Hardware/Application Overlay Model for Large-Scale Neuromorphic Simulation
    Rast, Alexander
    Shahsavari, Mahyar
    Bragg, Graeme M.
    Vousden, Mark L.
    Thomas, David
    Brown, Andrew
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [6] Visualization of Large-Scale Neural Simulations
    Hernando, Juan B.
    Duelo, Carlos
    Martin, Vicente
    [J]. BRAIN-INSPIRED COMPUTING, 2014, 8603 : 184 - 197
  • [7] Coreset: Hierarchical neuromorphic computing supporting large-scale neural networks with improved resource efficiency
    Yang, Liwei
    Zhang, Huaipeng
    Luo, Tao
    Qu, Chuping
    Aung, Myat Thu Linn
    Cui, Yingnan
    Zhou, Jun
    Wong, Ming Ming
    Pu, Junran
    Do, Anh Tuan
    Goh, Rick Siow Mong
    Wong, Weng Fai
    [J]. NEUROCOMPUTING, 2022, 474 : 128 - 140
  • [8] Scaled-up Neuromorphic Array Communications Controller (SNACC) for Large-scale Neural Networks
    Young, Aaron R.
    Foshie, Adam Z.
    Dean, Mark E.
    Plank, James S.
    Rose, Garrett S.
    Mitchell, J. Parker
    Schuman, Catherine D.
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [9] NeuProMa: A Toolchain for Mapping Large-Scale Spiking Convolutional Neural Networks onto Neuromorphic Processor
    Xiao, Chao
    Chen, Jihua
    Wang, Lei
    [J]. NETWORK AND PARALLEL COMPUTING, NPC 2022, 2022, 13615 : 129 - 142
  • [10] Neuromorphic Modeling Abstractions and Simulation of Large-Scale Cortical Networks
    Krichmar, Jeffrey L.
    Dutt, Nikil
    Nageswaran, Jayram M.
    Richert, Micah
    [J]. 2011 IEEE/ACM INTERNATIONAL CONFERENCE ON COMPUTER-AIDED DESIGN (ICCAD), 2011, : 334 - 338