VTSNN: a virtual temporal spiking neural network

Cited by: 4
Authors
Qiu, Xue-Rui [1]
Wang, Zhao-Rui [1]
Luan, Zheng [1]
Zhu, Rui-Jie [2]
Wu, Xiao [3]
Zhang, Ma-Lu [4]
Deng, Liang-Jian [3]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Optoelect Sci & Engn, Chengdu, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Publ Affairs & Adm, Chengdu, Peoples R China
[3] Univ Elect Sci & Technol China, Sch Math Sci, Chengdu, Peoples R China
[4] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu, Peoples R China
Keywords
spiking neural networks; undistorted weighted-encoding; decoding; neuromorphic circuits; Independent-Temporal Backpropagation; biologically-inspired artificial intelligence; backpropagation
DOI: 10.3389/fnins.2023.1091097
Chinese Library Classification: Q189 [Neuroscience]
Discipline code: 071006
Abstract
Spiking neural networks (SNNs) have recently demonstrated outstanding performance in a variety of high-level tasks, such as image classification. However, advancements in low-level tasks, such as image reconstruction, remain rare. This may be due to the lack of promising image encoding techniques and corresponding neuromorphic devices designed specifically for SNN-based low-level vision problems. This paper begins by proposing a simple yet effective undistorted weighted-encoding-decoding technique, which primarily consists of an Undistorted Weighted-Encoding (UWE) and an Undistorted Weighted-Decoding (UWD). The former converts a gray image into spike sequences for effective SNN learning, while the latter converts spike sequences back into images. We then design a new SNN training strategy, known as Independent-Temporal Backpropagation (ITBP), to avoid complex loss propagation across the spatial and temporal dimensions; experiments show that ITBP is superior to Spatio-Temporal Backpropagation (STBP). Finally, a so-called Virtual Temporal SNN (VTSNN) is formulated by incorporating the above approaches into a U-net architecture, fully exploiting its potent multiscale representation capability. Experimental results on several commonly used datasets, such as MNIST, F-MNIST, and CIFAR10, demonstrate that the proposed method delivers highly competitive noise-removal performance that is superior to existing work. Compared to an ANN with the same architecture, VTSNN is more likely to achieve superior results while consuming roughly 1/274 of the energy. Specifically, with the given encoding-decoding strategy, a simple neuromorphic circuit could easily be constructed to realize this low-carbon strategy.
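For intuition, the sketch below shows one plausible realization of such an undistorted (lossless) weighted encode/decode pair: each 8-bit pixel is split into T = 8 binary spike frames weighted by powers of two, so a weighted sum of the spikes reconstructs the pixel exactly. This is an illustrative assumption drawn from the abstract, not necessarily the paper's exact UWE/UWD formulation; the function names and the choice of T are hypothetical.

# Hedged sketch of a weighted, lossless spike encoder/decoder (NumPy).
# Assumption: one virtual time step per bit plane of an 8-bit gray image.
import numpy as np

T = 8  # number of virtual time steps (hypothetical choice)

def weighted_encode(img_uint8):
    """Encode a gray image (H, W), dtype uint8, into binary spikes of shape (T, H, W)."""
    bits = [(img_uint8 >> (T - 1 - t)) & 1 for t in range(T)]  # most significant bit first
    return np.stack(bits).astype(np.float32)

def weighted_decode(spikes):
    """Decode (T, H, W) binary spikes back to a uint8 image via weighted summation."""
    weights = 2 ** np.arange(T - 1, -1, -1, dtype=np.float32)  # weights 128, 64, ..., 1
    return np.tensordot(weights, spikes, axes=1).astype(np.uint8)

img = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)
assert np.array_equal(weighted_decode(weighted_encode(img)), img)  # round trip is distortion-free

Because every spike carries a fixed weight, decoding reduces to a weighted summation, which is consistent with the abstract's remark that a simple neuromorphic circuit could implement the scheme.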
Pages: 14
Related papers (50 in total; selection shown below)
  • Kozemiako, V. P.; Kolesnytskyj, O. K.; Lischenko, T. S.; Wojcik, W.; Sulemenov, A. Optoelectronic Spiking Neural Network. Optical Fibers and Their Applications 2012, 2013, 8698.
  • Zhang, Shao-Qun; Zhang, Zhao-Yu; Zhou, Zhi-Hua. Bifurcation Spiking Neural Network. Journal of Machine Learning Research, 2021, 22: 1-21.
  • Li, Y.; Harris, J. G. A spiking recurrent neural network. VLSI 2004: IEEE Computer Society Annual Symposium on VLSI, Proceedings, 2004: 321-322.
  • Federici, D. A regenerating spiking neural network. Neural Networks, 2005, 18(5-6): 746-754.
  • Montuschi, Paolo. Spiking Neural Network Architecture. Computer, 2015, 48(10): 6.
  • Radhakrishnan, Shiva Subbulakshmi; Sebastian, Amritanand; Oberoi, Aaryan; Das, Sarbashis; Das, Saptarshi. A biomimetic neural encoder for spiking neural network. Nature Communications, 2021, 12(1).
  • Wang, Zongxia; Yu, Naigong; Liao, Yishen. Activeness: A Novel Neural Coding Scheme Integrating the Spike Rate and Temporal Information in the Spiking Neural Network. Electronics, 2023, 12(19).
  • Liu, Fangxin; Zhao, Wenbo; Wang, Zongwu; Chen, Yongbiao; Yang, Tao; He, Zhezhi; Yang, Xiaokang; Jiang, Li. SATO: Spiking Neural Network Acceleration via Temporal-Oriented Dataflow and Architecture. Proceedings of the 59th ACM/IEEE Design Automation Conference (DAC 2022), 2022: 1105-1110.