Enhancing Representation of Spiking Neural Networks via Similarity-Sensitive Contrastive Learning

Cited by: 0
|
Authors
Zhang, Yuhan [1 ]
Liu, Xiaode [1 ]
Chen, Yuanpei [1 ]
Peng, Weihang [1 ]
Guo, Yufei [1 ]
Huang, Xuhui [1 ]
Ma, Zhe [1 ]
Affiliations
[1] CASIC, Intelligent Sci & Technol Acad, Beijing, Peoples R China
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) have recently attracted intensive attention as a promising energy-efficient alternative to conventional artificial neural networks (ANNs). Because SNNs transmit information in the form of binary spikes rather than continuous activations, the multiplication of activations and weights can be replaced by addition, saving energy. However, the binary spike representation sacrifices the expressive power of SNNs and leads to accuracy degradation compared with ANNs. Since improving feature representation is beneficial to training an accurate SNN model, this paper focuses on enhancing the feature representation of the SNN. To this end, we establish a similarity-sensitive contrastive learning framework in which the SNN captures significantly more information from its ANN counterpart to improve its representation via Mutual Information (MI) maximization with layer-wise sensitivity to similarity. Specifically, it enriches the SNN's feature representation by pulling together the positive pairs of SNN and ANN feature representations of each layer obtained from the same input samples, while pushing apart the negative pairs obtained from different samples. Experimental results show that our method consistently outperforms current state-of-the-art algorithms on both popular non-spiking static and neuromorphic datasets.
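The abstract describes pulling SNN and ANN features of the same input sample together and pushing features of different samples apart at each layer. As a rough illustration only, the minimal PyTorch sketch below implements a generic InfoNCE-style layer-wise contrastive objective between SNN and ANN features; the function names, the temperature value, and the per-layer weights standing in for the paper's layer-wise similarity sensitivity are hypothetical and are not taken from the paper itself.

    import torch
    import torch.nn.functional as F

    def layerwise_contrastive_loss(snn_feat, ann_feat, temperature=0.1):
        # snn_feat, ann_feat: (batch, dim) features from the same input batch;
        # row i of each tensor comes from the same sample (a positive pair).
        snn = F.normalize(snn_feat.flatten(1), dim=1)
        ann = F.normalize(ann_feat.flatten(1), dim=1)
        logits = snn @ ann.t() / temperature  # (batch, batch) cosine-similarity logits
        targets = torch.arange(snn.size(0), device=snn.device)
        # Diagonal entries are positive pairs; off-diagonal entries are negatives.
        return F.cross_entropy(logits, targets)

    def distillation_loss(snn_layer_feats, ann_layer_feats, layer_weights):
        # Weighted sum over layers; layer_weights is a hypothetical stand-in for
        # the paper's layer-wise similarity sensitivity.
        return sum(w * layerwise_contrastive_loss(s, a)
                   for w, s, a in zip(layer_weights, snn_layer_feats, ann_layer_feats))

    # Example with random features for two layers and a batch of 8 samples:
    snn_feats = [torch.randn(8, 128), torch.randn(8, 256)]
    ann_feats = [torch.randn(8, 128), torch.randn(8, 256)]
    loss = distillation_loss(snn_feats, ann_feats, layer_weights=[0.5, 1.0])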
Pages: 16926 - 16934
Number of pages: 9
Related Papers
50 records in total
  • [1] Self-Supervised Contrastive Learning In Spiking Neural Networks
    Bahariasl, Yeganeh
    Kheradpisheh, Saeed Reza
    [J]. PROCEEDINGS OF THE 13TH IRANIAN/3RD INTERNATIONAL MACHINE VISION AND IMAGE PROCESSING CONFERENCE, MVIP, 2024, : 181 - 185
  • [2] Molecular representation contrastive learning via transformer embedding to graph neural networks
    Liu, Yunwu
    Zhang, Ruisheng
    Li, Tongfeng
    Jiang, Jing
    Ma, Jun
    Yuan, Yongna
    Wang, Ping
    [J]. APPLIED SOFT COMPUTING, 2024, 164
  • [3] Spiking neural networks for deep learning and knowledge representation: Editorial
    Kasabov, Nikola K.
    [J]. NEURAL NETWORKS, 2019, 119 : 341 - 342
  • [4] Bayesian continual learning via spiking neural networks
    Skatchkovsky, Nicolas
    Jang, Hyeryung
    Simeone, Osvaldo
    [J]. FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2022, 16
  • [5] Enhanced representation learning with temporal coding in sparsely spiking neural networks
    Fois, Adrien
    Girau, Bernard
    [J]. FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2023, 17
  • [6] BrainQN: Enhancing the Robustness of Deep Reinforcement Learning with Spiking Neural Networks
    Feng, Shuo
    Cao, Jian
    Ou, Zehong
    Chen, Guang
    Zhong, Yi
    Wang, Zilin
    Yan, Juntong
    Chen, Jue
    Wang, Bingsen
    Zou, Chenglong
    Feng, Zebang
    Wang, Yuan
    [J]. ADVANCED INTELLIGENT SYSTEMS, 2024, 6 (09)
  • [7] Molecular contrastive learning of representations via graph neural networks
    Yuyang Wang
    Jianren Wang
    Zhonglin Cao
    Amir Barati Farimani
    [J]. Nature Machine Intelligence, 2022, 4 : 279 - 287
  • [8] Molecular contrastive learning of representations via graph neural networks
    Wang, Yuyang
    Wang, Jianren
    Cao, Zhonglin
    Farimani, Amir Barati
    [J]. NATURE MACHINE INTELLIGENCE, 2022, 4 (03) : 279 - 287
  • [9] Contrastive representation learning on dynamic networks
    Jiao, Pengfei
    Chen, Hongjiang
    Tang, Huijun
    Bao, Qing
    Zhang, Long
    Zhao, Zhidong
    Wu, Huaming
    [J]. NEURAL NETWORKS, 2024, 174
  • [10] Neural Graph Similarity Computation with Contrastive Learning
    Hu, Shengze
    Zeng, Weixin
    Zhang, Pengfei
    Tang, Jiuyang
    [J]. APPLIED SCIENCES-BASEL, 2022, 12 (15):