CyNAPSE: A Low-power Reconfigurable Neural Inference Accelerator for Spiking Neural Networks

Cited by: 2
Authors
Saha, Saunak [1 ]
Duwe, Henry [1 ]
Zambreno, Joseph [1 ]
Affiliations
[1] Iowa State Univ, Dept Elect & Comp Engn, Ames, IA 50011 USA
Funding
National Science Foundation (US);
Keywords
Neuromorphic; Spiking neural networks; Reconfigurable; Accelerator; Memory; Caching; Leakage; Energy efficiency; PROCESSOR; MODEL; ARCHITECTURE;
DOI
10.1007/s11265-020-01546-x
CLC classification number
TP [automation technology; computer technology];
Subject classification code
0812;
Abstract
While neural network models keep scaling in depth and computational requirements, biologically accurate models are becoming more attractive for low-cost inference. Coupled with the need to bring more computation to the edge in resource-constrained embedded and IoT devices, specialized ultra-low-power accelerators for spiking neural networks are being developed. Given the large variance in the models these networks employ, such accelerators need to be flexible, user-configurable, performant, and energy-efficient. In this paper, we describe CyNAPSE, a fully digital accelerator designed to emulate the neural dynamics of diverse spiking networks. Since our implementation primarily targets energy efficiency, we take a closer look at the factors that drive its energy consumption. We observe that while the majority of its dynamic power consumption can be attributed to memory traffic, its on-chip components suffer greatly from static leakage. Because the event-driven spike-processing algorithm is inherently memory-intensive and leaves a large number of processing elements idle, it makes sense to tackle each of these problems toward a more efficient hardware implementation. Using a diverse set of network benchmarks, we conduct a detailed study of memory access patterns that ultimately informs our choice of an application-specific, network-adaptive memory management strategy to reduce the chip's dynamic power consumption. We then propose and evaluate a leakage mitigation strategy for runtime control of idle power. Using both an RTL implementation and a software simulation of CyNAPSE, we measure the relative benefits of these undertakings. Results show that our adaptive memory management policy yields up to 22% more reduction in dynamic power consumption than conventional policies, and that the runtime leakage mitigation techniques achieve between 14% and 99.92% savings in leakage energy consumption across CyNAPSE hardware modules.
Pages: 907-929
Page count: 23
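
The abstract attributes most of CyNAPSE's dynamic power to the memory traffic generated by event-driven spike processing, which motivates its network-adaptive memory management strategy. The following is a minimal, hypothetical Python sketch of that idea only: each spike event fetches a synaptic weight row through a small on-chip cache and updates neuron state, so the cache hit rate stands in as a rough proxy for off-chip traffic and hence dynamic power. All names, model details (LIF dynamics, full connectivity, an LRU cache), and parameters are illustrative assumptions and are not taken from the paper.

```python
import collections
import random

# Illustrative parameters (not from the paper).
N_NEURONS = 256
V_THRESH, V_RESET, LEAK = 1.0, 0.0, 0.95
CACHE_LINES = 32                                   # weight rows that fit on chip

weights = [[random.gauss(0.0, 0.05) for _ in range(N_NEURONS)]
           for _ in range(N_NEURONS)]              # off-chip synaptic matrix
v_mem = [0.0] * N_NEURONS                          # neuron membrane potentials
cache = collections.OrderedDict()                  # LRU cache of weight rows
hits = misses = 0

def fetch_row(pre):
    """Return the synaptic row of presynaptic neuron `pre` via the LRU cache."""
    global hits, misses
    if pre in cache:
        hits += 1
        cache.move_to_end(pre)
    else:
        misses += 1                                # off-chip access -> dynamic power
        if len(cache) >= CACHE_LINES:
            cache.popitem(last=False)
        cache[pre] = weights[pre]
    return cache[pre]

def process_spike(pre):
    """Event-driven LIF update: one spike touches every postsynaptic neuron state."""
    row = fetch_row(pre)
    out_spikes = []
    for post in range(N_NEURONS):
        v_mem[post] = v_mem[post] * LEAK + row[post]
        if v_mem[post] >= V_THRESH:
            v_mem[post] = V_RESET
            out_spikes.append(post)
    return out_spikes

# Drive the sketch with a biased spike stream (a few neurons fire most often),
# mimicking the locality a network-adaptive replacement policy could exploit.
events = [random.choice(range(16)) if random.random() < 0.7
          else random.randrange(N_NEURONS) for _ in range(5000)]
for pre in events:
    process_spike(pre)
print(f"cache hit rate: {hits / (hits + misses):.2%}")
```

In this toy model, raising the hit rate directly reduces the number of off-chip row fetches per spike, which is the mechanism by which a better-matched replacement policy lowers dynamic power in the abstract's argument.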