Spiking ViT: spiking neural networks with transformer-attention for steel surface defect classification

Cited by: 2
Authors
Gong, Liang [1 ]
Dong, Hang [1 ]
Zhang, Xinyu [1 ]
Cheng, Xin [2 ]
Ye, Fan [3 ]
Guo, Liangchao [1 ]
Ge, Zhenghui [1 ]
Affiliations
[1] Yangzhou Univ, Sch Mech Engn, Yangzhou, Jiangsu, Peoples R China
[2] Yangzhou Univ, Sch Informat Engn, Sch Artificial Intelligence, Yangzhou, Jiangsu, Peoples R China
[3] Tongling Nonferrous Met Grp Co Ltd, Tongling, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
surface defect classification; vision transformer; spiking neural network; deep learning; data modeling;
DOI
10.1117/1.JEI.33.3.033001
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Throughout the steel production process, a variety of surface defects inevitably occur. These defects impair the quality of steel products and reduce manufacturing efficiency, so it is crucial to study and categorize the multiple defects that appear on the surface of steel strips. The vision transformer (ViT) is a neural network model based on a self-attention mechanism that is widely used across many disciplines. The conventional ViT ignores the specifics of biological neural signaling and instead uses activation functions to approximate real neurons. One of the fundamental building blocks of a spiking neural network is the leaky integrate-and-fire (LIF) neuron, whose biodynamic characteristics are akin to those of a real neuron. LIF neurons operate in an event-driven manner, so higher performance can be achieved with less power. The goal of this work is to integrate ViT and LIF neurons to build and train an end-to-end hybrid network architecture, the spiking vision transformer (S-ViT), for the classification of steel surface defects. The framework builds on the ViT architecture by replacing the activation functions used in ViT with LIF neurons, constructing a global spike-feature-fusion module, the spiking transformer encoder, together with a spiking-MLP classification head that implements the classification functionality, and using these as the basic building blocks of S-ViT. Based on the experimental results, our method demonstrates outstanding classification performance across all metrics. The overall test accuracies of S-ViT are 99.41%, 99.65%, 99.54%, and 99.77% on NEU-CLS, and 95.70%, 95.93%, 96.94%, and 97.19% on XSDD. S-ViT achieves superior classification performance compared to convolutional neural networks and recently reported methods, and it also improves on the original ViT model. Furthermore, robustness tests show that S-ViT maintains reliable accuracy when recognizing images corrupted by Gaussian noise.
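The LIF dynamics the abstract refers to can be sketched in a few lines. This is a minimal, illustrative model of a discrete-time leaky integrate-and-fire neuron used as a drop-in replacement for a conventional activation function; the parameter names (`tau`, `v_threshold`, `v_reset`) and the hard-reset rule are assumptions for illustration, not the authors' actual S-ViT implementation.

```python
def lif_step(v, x, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """One discrete time step of a LIF neuron.

    v: membrane potential carried over from the previous step
    x: input current (e.g. the pre-activation of a linear layer)
    Returns (spike, new_potential); spike is 1.0 or 0.0, so the
    neuron only produces output on threshold-crossing events.
    """
    # Leaky integration: the potential decays toward rest while
    # accumulating the input, with time constant tau.
    v = v + (x - v) / tau
    if v >= v_threshold:      # fire once the threshold is crossed
        return 1.0, v_reset   # emit a spike and hard-reset
    return 0.0, v             # otherwise stay silent, keep the charge


def lif_sequence(inputs, **kwargs):
    """Run a single LIF neuron over a sequence of input currents."""
    v, spikes = 0.0, []
    for x in inputs:
        s, v = lif_step(v, x, **kwargs)
        spikes.append(s)
    return spikes
```

A strong input spikes immediately and resets the neuron, while a weak sub-threshold input only charges the membrane and never fires, which is the event-driven sparsity that lets spiking networks trade activity for power.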
Pages: 20