Effective Active Learning Method for Spiking Neural Networks

Cited by: 0
Authors
Xie, Xiurui [1 ]
Yu, Bei [1 ]
Liu, Guisong [2 ,3 ]
Zhan, Qiugang [1 ]
Tang, Huajin [4 ,5 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
[2] Southwestern Univ Finance & Econ, Sch Comp & Artificial Intelligence, Chengdu 611130, Peoples R China
[3] Univ Elect Sci & Technol China, Zhongshan Inst, Zhongshan 528400, Peoples R China
[4] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
[5] Zhejiang Lab, Hangzhou 311122, Peoples R China
Keywords
Biological system modeling; Neurons; Learning systems; Predictive models; Training; Task analysis; Integrated circuit modeling; Active learning method; deep learning; feature representation; spiking neural network (SNN)
DOI
10.1109/TNNLS.2023.3257333
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
A large quantity of labeled data is required to train high-performance deep spiking neural networks (SNNs), but obtaining labeled data is expensive. Active learning reduces the quantity of labeled data required by deep learning models. However, conventional active learning methods are not as effective in SNNs as they are in conventional artificial neural networks (ANNs) because of differences in feature representation and information transmission. To address this issue, we propose an effective active learning method for deep SNN models in this article. Specifically, a loss prediction module, ActiveLossNet, is proposed to extract features and select valuable samples for deep SNNs. Then, we derive the corresponding active learning algorithm for deep SNN models. Comprehensive experiments are conducted on CIFAR-10, MNIST, Fashion-MNIST, and SVHN with different SNN frameworks, including the seven-layer CIFARNet and the 20-layer ResNet-18. The comparison results demonstrate that the proposed active learning algorithm outperforms random selection and conventional ANN active learning methods. In addition, our method converges faster than conventional active learning methods.
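The abstract describes selecting "valuable samples" with a loss prediction module. ActiveLossNet itself is not detailed in this record; the snippet below is only a minimal sketch of the generic selection step used by loss-prediction active learning, under the assumption that the module outputs one predicted loss per unlabeled sample. All names here (`select_by_predicted_loss`, `predicted_losses`, `budget`) are illustrative, not the authors' API.

```python
import numpy as np

def select_by_predicted_loss(predicted_losses, budget):
    """Return indices of the `budget` unlabeled samples with the
    highest predicted loss (i.e., the most 'valuable' to label)."""
    order = np.argsort(predicted_losses)[::-1]  # sort descending by predicted loss
    return order[:budget]

# Toy example: samples 1 and 3 have the largest predicted losses.
losses = np.array([0.1, 0.9, 0.4, 0.7])
print(select_by_predicted_loss(losses, 2).tolist())  # [1, 3]
```

The selected indices would then be sent for annotation and added to the labeled pool before the next training round.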
Pages: 12373-12382
Page count: 10