Energy-Efficient High-Accuracy Spiking Neural Network Inference Using Time-Domain Neurons

Cited by: 7
Authors:
Song, Joonghyun [1 ]
Shin, Jiwon [1 ]
Kim, Hanseok [1 ,2 ]
Choi, Woo-Seok [1 ]
Affiliations:
[1] Seoul Natl Univ, Dept ECE, ISRC, Seoul, South Korea
[2] Samsung Elect, Hwaseong, South Korea
Keywords:
artificial neural network; spiking neural network; ANN-to-SNN conversion; integrate-and-fire neuron; time-domain signal processing;
DOI:
10.1109/AICAS54282.2022.9870009
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract:
Due to the limitations of realizing artificial neural networks on prevalent von Neumann architectures, recent studies have presented neuromorphic systems based on spiking neural networks (SNNs) to reduce power and computational cost. However, conventional analog voltage-domain integrate-and-fire (I&F) neuron circuits, based on either current mirrors or op-amps, suffer from nonlinearity or high power consumption, degrading either the inference accuracy or the energy efficiency of the SNN. To achieve excellent energy efficiency and high accuracy simultaneously, this paper presents a low-power, highly linear time-domain I&F neuron circuit. Designed and simulated in a 28 nm CMOS process, the proposed neuron achieves a more than 4.3x lower error rate on MNIST inference than conventional current-mirror-based neurons. In addition, the simulated power consumption of the proposed neuron circuit is 0.230 nW per neuron, orders of magnitude lower than that of existing voltage-domain neurons.
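The conversion approach summarized in the abstract relies on the firing rate of an integrate-and-fire neuron being nearly linear in its input, so that spike rates in the SNN can stand in for ANN activations. The sketch below is a generic idealized I&F model in Python, not the paper's time-domain circuit; the threshold, time step, and input values are illustrative assumptions chosen only to show the rate-versus-input linearity that circuit nonlinearity would degrade.

```python
# Minimal sketch: an idealized discrete-time integrate-and-fire (I&F) neuron.
# Generic textbook model for illustration only, not the time-domain circuit
# proposed in the paper; threshold, time step, and inputs are assumed values.

def if_spike_rate(input_current: float,
                  v_threshold: float = 1.0,
                  dt: float = 1e-3,
                  sim_time: float = 1.0) -> float:
    """Return the firing rate (spikes/s) of an ideal I&F neuron."""
    v_mem = 0.0                      # membrane potential
    spikes = 0
    for _ in range(int(sim_time / dt)):
        v_mem += input_current * dt  # integrate the input
        if v_mem >= v_threshold:     # fire once the threshold is crossed
            spikes += 1
            v_mem -= v_threshold     # reset by subtraction preserves linearity
    return spikes / sim_time

if __name__ == "__main__":
    # The rate tracks the input almost proportionally, which is the property
    # ANN-to-SNN conversion depends on.
    for i_in in (10.0, 20.0, 40.0, 80.0):
        print(f"input = {i_in:5.1f} -> rate = {if_spike_rate(i_in):6.1f} spikes/s")
```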
Pages: 5-8
Number of pages: 4