On the Intrinsic Structures of Spiking Neural Networks

Cited: 0
Authors
Zhang, Shao-Qun [1 ,2 ]
Chen, Jia-Yi [1 ,2 ]
Wu, Jin-Hui [1 ,3 ]
Zhang, Gao [4 ]
Xiong, Huan [5 ]
Gu, Bin [5 ]
Zhou, Zhi-Hua [1 ,3 ]
Affiliations
[1] Nanjing Univ, Natl Key Lab Novel Software Technol, Nanjing, Peoples R China
[2] Nanjing Univ, Sch Intelligent Sci & Technol, Nanjing, Peoples R China
[3] Nanjing Univ, Sch Artificial Intelligence, Nanjing, Peoples R China
[4] Jiangsu Second Normal Univ, Sch Math Sci, Nanjing, Peoples R China
[5] Mohamed bin Zayed Univ Artificial Intelligence, Dept Machine Learning, Abu Dhabi, U Arab Emirates
Funding
National Science Foundation (US);
Keywords
Spiking Neural Network; Intrinsic Structures; Integration Operation; Self-connection Architecture; Firing-Reset Mechanism; Stochastic Excitation; Rademacher Complexity; COMPUTATIONAL POWER; NEURONS; NOISE;
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Recent years have seen a surge of interest in spiking neural networks (SNNs). The performance of SNNs hinges not only on searching for apposite architectures and connection weights, as in conventional artificial neural networks, but also on the meticulous configuration of their intrinsic structures. However, there has been a dearth of comprehensive studies examining the impact of intrinsic structures, so developers often find it challenging to apply a standardized configuration of SNNs across diverse datasets or tasks. This work delves deep into the intrinsic structures of SNNs. Initially, we draw two key conclusions: (1) the membrane time hyper-parameter is intimately linked to the eigenvalues of the integration operation, dictating the functional topology of spiking dynamics; (2) various hyper-parameters of the firing-reset mechanism govern the overall firing capacity of an SNN, mitigating the injection ratio or sampling density of input data. These findings elucidate why the efficacy of SNNs hinges heavily on the configuration of intrinsic structures, and they lead to the recommendation that enhancing the adaptability of these structures can improve the overall performance and applicability of SNNs. Inspired by this recognition, we propose two feasible approaches to enhance SNN learning: developing self-connection architectures and stochastic spiking neurons to augment the adaptability of the integration operation and the firing-reset mechanism, respectively. We theoretically prove that (1) both methods promote the expressive property of universal approximation, (2) the incorporation of self-connection architectures fosters ample solutions and structural stability for SNNs approximating adaptive dynamical systems, and (3) stochastic spiking neurons maintain generalization bounds with an exponential reduction in Rademacher complexity. Empirical experiments conducted on various real-world datasets affirm the effectiveness of our proposed methods.
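As a rough illustration of the intrinsic structures discussed in the abstract, the sketch below simulates a single leaky integrate-and-fire neuron in which a membrane time constant (`tau_m`) shapes the integration operation, a threshold and reset potential (`v_th`, `v_reset`) form the firing-reset mechanism, and optional `self_weight` and `stochastic` parameters mimic the proposed self-connection and stochastic-excitation extensions. All names and the specific update rule are illustrative assumptions, not the paper's actual formulation.

```python
import math
import random

def simulate_lif(inputs, tau_m=10.0, v_th=1.0, v_reset=0.0,
                 self_weight=0.0, stochastic=False, seed=0):
    """Toy leaky integrate-and-fire neuron driven by a list of input currents.

    tau_m       : membrane time constant (the 'membrane time hyper-parameter')
    v_th        : firing threshold (part of the firing-reset mechanism)
    v_reset     : potential the membrane is reset to after a spike
    self_weight : weight on the neuron's own previous spike (self-connection)
    stochastic  : if True, spike with a sigmoid probability of (v - v_th)
    """
    rng = random.Random(seed)
    decay = math.exp(-1.0 / tau_m)  # per-step leak derived from tau_m
    v, last_spike, spikes = 0.0, 0.0, []
    for x in inputs:
        # Integration: leaky accumulation of input plus self-connection feedback.
        v = decay * v + x + self_weight * last_spike
        if stochastic:
            # Stochastic excitation: a soft threshold instead of a hard one.
            fired = rng.random() < 1.0 / (1.0 + math.exp(-(v - v_th)))
        else:
            fired = v >= v_th
        if fired:
            v = v_reset  # firing-reset mechanism
        last_spike = 1.0 if fired else 0.0
        spikes.append(last_spike)
    return spikes
```

With the defaults, a constant input of 0.6 needs two steps of integration to cross the threshold, so `simulate_lif([0.6] * 5)` returns the alternating spike train `[0.0, 1.0, 0.0, 1.0, 0.0]`; raising `tau_m` slows the leak, while `self_weight` and `stochastic` change the firing pattern, which is exactly the kind of sensitivity to intrinsic structure the paper analyzes.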
Pages: 74