Fine-tuning with local learning rules helps to compress and accelerate spiking neural networks without accuracy loss

Cited by: 0
Authors
D. V. Nekhaev
V. A. Demin
Affiliations
[1] National Research Center “Kurchatov Institute”
Keywords
Spiking neural networks; FEELING local learning rules; Inhibitory connections; Weight matrix decomposition; Classification
DOI: not available
Abstract
Spiking neural networks (SNNs) are believed to be highly energy- and computationally efficient machine learning algorithms, especially when implemented on neuromorphic hardware. Recent studies have revealed that lateral (intralayer) inhibitory connectivity is necessary for effective and stable learning in SNNs. However, in large-scale SNNs, lateral inhibitory connections require a large amount of additional computation, which negatively affects both inference time and the required computing resources. In this study, we propose a fine-tuning procedure based on original local learning rules, called FEELING, applied to the weights of an interneuron sublayer introduced to organize more efficient competition between excitatory neurons. The interneuron weights are initialized by singular value decomposition of the intralayer inhibitory weight matrix that characterizes the original SNN architecture before optimization. The proposed procedure compresses and accelerates large-scale SNNs with lateral inhibition in their layers while maintaining recognition accuracy, as shown on the MNIST dataset. We demonstrate that this new optimization technique is superior to simple pruning of inhibitory connections, even when pruning is followed by fine-tuning. Moreover, this method of fine-tuned decomposition assigns excitatory and inhibitory functions to two distinct sublayers of neurons, as is naturally observed in biological neural systems. We hope that the findings of this study not only reveal new aspects of efficient computation and bio-plausible architecture for SNNs but also suggest a hypothetical reason for the evolutionary preference of inhibitory neurons over direct inhibitory connections.
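The compression idea described in the abstract (replacing dense all-to-all lateral inhibition with a small sublayer of k interneurons whose weights are initialized from a truncated SVD of the inhibitory weight matrix) can be sketched as follows. This is a minimal illustration under assumed conditions, not the paper's implementation: the matrix values, network size N, and rank k are hypothetical, and the FEELING fine-tuning step is omitted.

```python
import numpy as np

# Hypothetical dense lateral inhibition: an N x N matrix of negative
# weights among excitatory neurons, with no self-inhibition.
rng = np.random.default_rng(0)
N, k = 400, 20                          # excitatory neurons, interneurons
W = -np.abs(rng.normal(size=(N, N)))    # inhibitory weights are negative
np.fill_diagonal(W, 0.0)

# Truncated SVD: keep the k largest singular values and split them
# between the two factors, giving a two-stage inhibitory pathway
# (excitatory -> interneurons -> excitatory).
U, s, Vt = np.linalg.svd(W, full_matrices=False)
W_exc_to_int = np.diag(np.sqrt(s[:k])) @ Vt[:k]    # shape (k, N)
W_int_to_exc = U[:, :k] @ np.diag(np.sqrt(s[:k]))  # shape (N, k)

# The factored pathway approximates the original lateral inhibition
# with far fewer parameters: 2*N*k instead of N*N.
W_approx = W_int_to_exc @ W_exc_to_int
rel_err = np.linalg.norm(W - W_approx) / np.linalg.norm(W)
params_full, params_factored = N * N, 2 * N * k
```

In the paper's setting, these two factor matrices would then be fine-tuned with the FEELING local learning rules rather than used as-is, which is what recovers the accuracy lost by truncation.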
Pages: 20687 / 20700
Page count: 13
Related papers
18 records in total
  • [1] Fine-tuning with local learning rules helps to compress and accelerate spiking neural networks without accuracy loss
    Nekhaev, D. V.
    Demin, V. A.
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (23): : 20687 - 20700
  • [2] Spiking neural networks fine-tuning for brain image segmentation
    Yue, Ye
    Baltes, Marc
    Abuhajar, Nidal
    Sun, Tao
    Karanth, Avinash
    Smith, Charles D.
    Bihl, Trevor
    Liu, Jundong
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [3] HYBRID SPIKING NEURAL NETWORKS FINE-TUNING FOR HIPPOCAMPUS SEGMENTATION
    Yue, Ye
    Baltes, Marc
    Abuhajar, Nidal
    Sun, Tao
    Smith, Charles D.
    Bihl, Trevor
    Liu, Jundong
    2023 IEEE 20TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING, ISBI, 2023
  • [4] Fine-Tuning Surrogate Gradient Learning for Optimal Hardware Performance in Spiking Neural Networks
    Aliyev, Ilkin
    Adegbija, Tosiron
    2024 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION, DATE, 2024
  • [5] Fine-Tuning Deep Neural Networks in Continuous Learning Scenarios
    Kaeding, Christoph
    Rodner, Erik
    Freytag, Alexander
    Denzler, Joachim
    COMPUTER VISION - ACCV 2016 WORKSHOPS, PT III, 2017, 10118 : 588 - 605
  • [6] Rollback Ensemble With Multiple Local Minima in Fine-Tuning Deep Learning Networks
    Ro, Youngmin
    Choi, Jongwon
    Heo, Byeongho
    Choi, Jin Young
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (09) : 4648 - 4660
  • [7] Trunk Pruning: Highly Compatible Channel Pruning for Convolutional Neural Networks Without Fine-Tuning
    Kim, Nam Joon
    Kim, Hyun
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 5588 - 5599
  • [8] EFFICIENT FINE-TUNING OF NEURAL NETWORKS FOR ARTIFACT REMOVAL IN DEEP LEARNING FOR INVERSE IMAGING PROBLEMS
    Lucas, Alice
    Lopez-Tapia, Santiago
    Molina, Rafael
    Katsaggelos, Aggelos K.
    2019 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2019, : 3591 - 3595
  • [9] Deep learning with cinematic rendering: fine-tuning deep neural networks using photorealistic medical images
    Mahmood, Faisal
    Chen, Richard
    Sudarsky, Sandra
    Yu, Daphne
    Durr, Nicholas J.
    PHYSICS IN MEDICINE AND BIOLOGY, 2018, 63 (18):
  • [10] Targeted Gradient Descent: A Novel Method for Convolutional Neural Networks Fine-Tuning and Online-Learning
    Chen, Junyu
    Asma, Evren
    Chan, Chung
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2021, PT III, 2021, 12903 : 25 - 35