HYBRID SPIKING NEURAL NETWORKS FINE-TUNING FOR HIPPOCAMPUS SEGMENTATION

Times Cited: 0
Authors
Yue, Ye [1 ]
Baltes, Marc [1 ]
Abujahar, Nidal [1 ]
Sun, Tao [1 ]
Smith, Charles D. [2 ]
Bihl, Trevor [3 ]
Liu, Jundong [1 ]
Affiliations
[1] Ohio Univ, Sch Elect Engn & Comp Sci, Athens, OH 45701 USA
[2] Univ Kentucky, Dept Neurol, Lexington, KY USA
[3] Wright State Univ, Dept Biomed, Ind & Human Factors Engn, Dayton, OH 45435 USA
Keywords
Spiking neural network; image segmentation; hippocampus; brain; U-Net; ANN-SNN conversion
DOI
10.1109/ISBI53787.2023.10230610
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Over the past decade, artificial neural networks (ANNs) have made tremendous advances, in part due to the increased availability of annotated data. However, ANNs typically require significant power and memory consumption to reach their full potential. Spiking neural networks (SNNs) have recently emerged as a low-power alternative to ANNs due to the sparse nature of their computation. SNNs, however, are not as easy to train as ANNs. In this work, we propose a hybrid SNN training scheme and apply it to segment human hippocampi from magnetic resonance images. Our approach takes ANN-SNN conversion as an initialization step and relies on spike-based backpropagation to fine-tune the network. Compared with conversion-only and direct-training solutions, our method has advantages in both segmentation accuracy and training efficiency. Experiments demonstrate the effectiveness of our model in achieving these design goals.
Pages: 5
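
The abstract describes a two-stage recipe: initialize the spiking network by converting a trained ANN, then fine-tune it with spike-based backpropagation. The sketch below illustrates that idea in PyTorch on a single convolutional layer. It is a minimal, hypothetical rendering, not the authors' code: the names `SurrogateSpike` and `SpikingConv`, the soft-reset integrate-and-fire dynamics, the rectangular surrogate gradient, and all hyperparameters are assumptions for illustration, and the toy layer stands in for the paper's spiking U-Net.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; rectangular surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v, threshold=1.0):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Pass gradient only near the firing threshold (unit-width window).
        window = ((v - ctx.threshold).abs() < 0.5).float()
        return grad_output * window, None


class SpikingConv(nn.Module):
    """A conv layer with integrate-and-fire dynamics, unrolled over T time steps."""

    def __init__(self, conv):
        super().__init__()
        self.conv = conv  # weights are copied from the trained ANN layer

    def forward(self, x_seq):  # x_seq: (T, N, C, H, W)
        v, spikes = None, []
        for x in x_seq:
            i = self.conv(x)                # synaptic current
            v = i if v is None else v + i   # integrate
            s = SurrogateSpike.apply(v)     # fire
            v = v - s                       # soft reset: subtract the threshold
            spikes.append(s)
        return torch.stack(spikes)


# Step 1: ANN-SNN conversion as initialization. A full conversion would also
# rescale weights/thresholds by layer-wise maximum activations; here we only
# copy the (assumed pre-trained) ANN weights into the spiking layer.
ann_layer = nn.Conv2d(1, 8, kernel_size=3, padding=1)
snn_layer = SpikingConv(nn.Conv2d(1, 8, kernel_size=3, padding=1))
snn_layer.conv.load_state_dict(ann_layer.state_dict())

# Step 2: spike-based backpropagation to fine-tune the converted network.
T = 8                                                  # simulation time steps
opt = torch.optim.Adam(snn_layer.parameters(), lr=1e-4)
x = torch.rand(T, 2, 1, 32, 32)                        # dummy rate-coded input
target = torch.randint(0, 2, (2, 8, 32, 32)).float()   # dummy binary mask

opt.zero_grad()
rate = snn_layer(x).mean(dim=0)                        # firing rate lies in [0, 1]
loss = F.binary_cross_entropy(rate.clamp(1e-4, 1 - 1e-4), target)
loss.backward()                                        # gradients flow through the surrogate
opt.step()
print(f"fine-tuning loss: {loss.item():.4f}")
```

The motivation for the hybrid order, per the abstract, is that conversion hands the fine-tuning stage a good starting point, so spike-based backpropagation can reach higher accuracy with less training than either conversion alone or training the SNN from scratch.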
Related Papers
Showing 10 of 50 records
  • [1] Spiking neural networks fine-tuning for brain image segmentation
    Yue, Ye
    Baltes, Marc
    Abuhajar, Nidal
    Sun, Tao
    Karanth, Avinash
    Smith, Charles D.
    Bihl, Trevor
    Liu, Jundong
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [2] Fine-Tuning Surrogate Gradient Learning for Optimal Hardware Performance in Spiking Neural Networks
    Aliyev, Ilkin
    Adegbija, Tosiron
    2024 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION, DATE, 2024
  • [3] Fine-Tuning and the Stability of Recurrent Neural Networks
    MacNeil, David
    Eliasmith, Chris
    PLOS ONE, 2011, 6 (09)
  • [4] Fine-tuning and Visualization of Convolutional Neural Networks
    Yin, Xiangnan
    Chen, Weihai
    Wu, Xingming
    Yue, Haosong
    PROCEEDINGS OF THE 2017 12TH IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA), 2017, : 1310 - 1315
  • [5] An efficient pruning and fine-tuning method for deep spiking neural network
    Meng, L. W.
    Qiao, G. C.
    Zhang, X. Y.
    Bai, J.
    Zuo, Y.
    Zhou, P. J.
    Liu, Y.
    Hu, S. G.
    APPLIED INTELLIGENCE, 2023, 53 (23) : 28910 - 28923
  • [6] Fine-tuning Convolutional Neural Networks for fine art classification
    Cetinic, Eva
    Lipic, Tomislav
    Grgic, Sonja
    EXPERT SYSTEMS WITH APPLICATIONS, 2018, 114 : 107 - 118
  • [7] A study on training fine-tuning of convolutional neural networks
    Cai, Zhicheng
    Peng, Chenglei
    2021 13TH INTERNATIONAL CONFERENCE ON KNOWLEDGE AND SMART TECHNOLOGY (KST-2021), 2021, : 84 - 89
  • [8] Improved Regularization and Robustness for Fine-tuning in Neural Networks
    Li, Dongyue
    Zhang, Hongyang R.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [9] Fine-tuning with local learning rules helps to compress and accelerate spiking neural networks without accuracy loss
    D. V. Nekhaev
    V. A. Demin
    Neural Computing and Applications, 2022, 34 : 20687 - 20700