Temporal Knowledge Sharing Enable Spiking Neural Network Learning from Past and Future

Cited by: 0
Authors
Dong Y. [1]
Zhao D. [2]
Zeng Y. [2]
Affiliations
[1] School of Future Technology, University of Chinese Academy of Sciences, Beijing
[2] Brain-Inspired Cognitive Intelligence Lab, Institute of Automation, Chinese Academy of Sciences, Beijing
Keywords
Biological neural networks; Computational modeling; Mathematical models; Membrane potentials; Neuromorphic engineering; Neurons; Spiking Neural Network; Temporal Information; Testing; Training
DOI
10.1109/TAI.2024.3374268
Abstract
Spiking Neural Networks (SNNs) have attracted significant attention from researchers across various domains due to their brain-inspired information processing mechanism. However, SNNs typically grapple with challenges such as extended time steps, low temporal information utilization, and the requirement for consistent time steps between testing and training. These challenges leave SNNs with high latency. Moreover, the constraint on time steps necessitates retraining the model for new deployments, reducing adaptability. To address these issues, this paper proposes a novel perspective that views the SNN as a temporal aggregation model. We introduce the Temporal Knowledge Sharing (TKS) method, which facilitates information interaction between different time points and can be perceived as a form of temporal self-distillation. To validate the efficacy of TKS in information processing, we tested it on static datasets such as CIFAR10, CIFAR100, and ImageNet-1k, and on neuromorphic datasets such as DVS-CIFAR10 and NCALTECH101. Experimental results demonstrate that our method achieves state-of-the-art performance compared to other algorithms. Furthermore, TKS addresses the temporal consistency challenge, endowing the model with superior temporal generalization: the network can train with longer time steps and maintain high performance when tested with shorter time steps. Such an approach considerably accelerates the deployment of SNNs on edge devices. Finally, we conducted ablation experiments and tested TKS on fine-grained tasks, with results showcasing TKS's enhanced capability to process information efficiently.
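The abstract describes TKS only at a high level, as "temporal self-distillation" between time points. A minimal sketch of that idea, under the assumption (not confirmed by this record) that the per-time-step outputs are aggregated into a teacher signal and each time step's output is then pulled toward it; the function name `tks_loss` and the mean aggregation are illustrative choices, not the paper's exact formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def tks_loss(logits_per_step):
    """Hypothetical temporal self-distillation loss.

    logits_per_step: array of shape (T, batch, classes), the network's
    output logits at each of T time steps. The teacher is the mean
    aggregation over all time steps; each individual time step is
    pulled toward it via KL divergence, so early and late time steps
    share knowledge with each other ("past and future").
    """
    teacher = softmax(logits_per_step.mean(axis=0))   # (batch, classes)
    student = softmax(logits_per_step)                # (T, batch, classes)
    eps = 1e-12                                       # avoid log(0)
    kl = (teacher * (np.log(teacher + eps) - np.log(student + eps))).sum(-1)
    return kl.mean()
```

Under this reading, the temporal-generalization claim is plausible: because every time step is trained to agree with the aggregated prediction, truncating to fewer time steps at test time degrades the output less than it would for a model supervised only on the final aggregate.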
Pages: 1 - 10
Number of pages: 9
Related Papers
50 records in total
  • [41] Neural Basis of Second Language Speech Learning - Past and Future: A Commentary on "The Neurocognitive Underpinnings of Second Language Processing: Knowledge Gains From the Past and Future Outlook"
    Wong, Patrick C. M.
    LANGUAGE LEARNING, 2023, 73 : 139 - 142
  • [42] A Fast Learning Algorithm of Self-Learning Spiking Neural Network
    Bodyanskiy, Yevgeniy
    Dolotov, Artem
    Pliss, Iryna
    Malyar, Mykola
    PROCEEDINGS OF THE 2016 IEEE FIRST INTERNATIONAL CONFERENCE ON DATA STREAM MINING & PROCESSING (DSMP), 2016, : 104 - 107
  • [43] Learning the Dynamics of Future Marine Microgrids Using Temporal Convolutional Neural Network
    Ge, Xiaoyu
    Hosseinipour, Ali
    Putri, Saskia
    Moazeni, Faegheh
    Khazaei, Javad
    IEEE CONFERENCE ON EVOLVING AND ADAPTIVE INTELLIGENT SYSTEMS 2024, IEEE EAIS 2024, 2024, : 94 - 100
  • [44] Directly training temporal Spiking Neural Network with sparse surrogate gradient
    Li, Yang
    Zhao, Feifei
    Zhao, Dongcheng
    Zeng, Yi
    NEURAL NETWORKS, 2024, 179
  • [45] Heterogeneous recurrent spiking neural network for spatio-temporal classification
    Chakraborty, Biswadeep
    Mukhopadhyay, Saibal
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [46] An Efficient Discrete Model for Implementing Temporal Coding Spiking Neural Network
    Charles, E. Y. Andrew
    14TH INTERNATIONAL CONFERENCE ON ADVANCES IN ICT FOR EMERGING REGIONS (ICTER) 2014, 2014, : 74 - 77
  • [47] Efficient Convolutional Processing of Spiking Neural Network With Weight-Sharing Filters
    Song, Seunghwan
    Jeon, Bosung
    Kim, Munhyeon
    Kim, Jae-Joon
    IEEE ELECTRON DEVICE LETTERS, 2023, 44 (06) : 1007 - 1010
  • [48] A Spiking Neural Network Model for Associative Memory Using Temporal Codes
    Hu, Jun
    Tang, Huajin
    Tan, Kay Chen
    Gee, Sen Bong
    PROCEEDINGS OF THE 18TH ASIA PACIFIC SYMPOSIUM ON INTELLIGENT AND EVOLUTIONARY SYSTEMS, VOL 1, 2015, : 561 - 572
  • [49] A SPIKING NEURAL NETWORK WITH LOCAL LEARNING RULES DERIVED FROM NONNEGATIVE SIMILARITY MATCHING
    Pehlevan, Cengiz
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 7958 - 7962
  • [50] Knowledge Sharing for Population Based Neural Network Training
    Oehmcke, Stefan
    Kramer, Oliver
    KI 2018: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, 11117 : 258 - 269