Deep learning incorporating biologically inspired neural dynamics and in-memory computing

Cited by: 0
Authors
Stanisław Woźniak
Angeliki Pantazi
Thomas Bohnstingl
Evangelos Eleftheriou
Affiliations
[1] IBM Research – Zurich
[2] Institute of Theoretical Computer Science, Graz University of Technology
DOI: not available
Abstract
Spiking neural networks (SNNs) incorporating biologically plausible neurons hold great promise because of their unique temporal dynamics and energy efficiency. However, SNNs have developed separately from artificial neural networks (ANNs), limiting the impact of deep learning advances for SNNs. Here, we present an alternative perspective of the spiking neuron that incorporates its neural dynamics into a recurrent ANN unit called a spiking neural unit (SNU). SNUs may operate as SNNs, using a step function activation, or as ANNs, using continuous activations. We demonstrate the advantages of SNU dynamics through simulations on multiple tasks and obtain accuracies comparable to, or better than, those of ANNs. The SNU concept enables an efficient implementation with in-memory acceleration for both training and inference. We experimentally demonstrate its efficacy for a music-prediction task in an in-memory-based SNN accelerator prototype using 52,800 phase-change memory devices. Our results open up an avenue for broad adoption of biologically inspired neural dynamics in challenging applications and acceleration with neuromorphic hardware.
Pages: 325–336
Number of pages: 11