Internal defects inspection of arc magnets using multi-head attention-based CNN

Cited by: 8
Authors
Li, Qiang [1 ]
Huang, Qinyuan [1 ,2 ]
Yang, Tian [1 ]
Zhou, Ying [1 ]
Yang, Kun [1 ]
Song, Hong [2 ]
Affiliations
[1] Sichuan Univ Sci & Engn, Sch Automat & Informat Engn, Zigong 643000, Peoples R China
[2] Artificial Intelligence Key Lab Sichuan Prov, Zigong 643000, Peoples R China
Keywords
Convolutional neural network; Multi-head attention; Defect detection; Arc magnets; Classification; CONVOLUTIONAL NEURAL-NETWORK; BEARING FAULT-DIAGNOSIS; LITHIUM-ION BATTERIES; OF-CHARGE ESTIMATION; CLASSIFICATION; TILE
DOI
10.1016/j.measurement.2022.111808
Chinese Library Classification (CLC) number
T [Industrial Technology];
Discipline classification code
08;
Abstract
Arc magnets are key components of various motor machinery, and detecting their internal defects is extremely important for maintaining system performance and ensuring operational safety. In this paper, an end-to-end improved convolutional neural network (CNN) model based on multi-head attention is presented, in which features that play a more important role in defect detection are efficiently highlighted. In addition, owing to the strong parallelism of multi-head attention, the training process is greatly accelerated. Meanwhile, to meet the model's data requirements, a data augmentation method is designed accordingly. The performance of the constructed framework is then verified in different test scenarios. Experimental results demonstrate that the presented approach achieves superior inspection performance with relatively few model parameters compared to other existing methods, even under small-sample conditions, intense noise, and the coexistence of noise and insufficient data.
Pages: 13
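As a rough illustration of the kind of architecture the abstract describes (a CNN feature extractor followed by multi-head attention for defect classification), the following is a minimal PyTorch sketch. The layer sizes, number of heads, 1-D input format, and class count are assumptions made for clarity, not the authors' published configuration.

```python
# Minimal illustrative sketch (not the paper's actual model): a CNN front end
# followed by multi-head self-attention and a linear classifier.
import torch
import torch.nn as nn

class AttentionCNN(nn.Module):
    def __init__(self, in_channels: int = 1, num_classes: int = 2,
                 embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        # Convolutional front end: extracts local features from the input signal.
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, stride=2, padding=3),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.Conv1d(32, embed_dim, kernel_size=5, stride=2, padding=2),
            nn.BatchNorm1d(embed_dim),
            nn.ReLU(),
        )
        # Multi-head self-attention over the sequence of convolutional features;
        # the heads attend in parallel, which is the parallelism the abstract mentions.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, length), e.g. a defect-related 1-D measurement signal
        # (the actual input modality of the paper is an assumption here).
        feats = self.features(x)          # (batch, embed_dim, length')
        tokens = feats.transpose(1, 2)    # (batch, length', embed_dim)
        attended, _ = self.attn(tokens, tokens, tokens)
        pooled = attended.mean(dim=1)     # average over the sequence dimension
        return self.classifier(pooled)

if __name__ == "__main__":
    model = AttentionCNN()
    dummy = torch.randn(8, 1, 1024)       # hypothetical batch of signals
    print(model(dummy).shape)             # torch.Size([8, 2])
```

In this sketch, attention weights re-scale the convolutional feature tokens so that more informative ones dominate the pooled representation, which is the general mechanism the abstract attributes to multi-head attention.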
Related papers
50 records in total
  • [41] MAGRU-IDS: A Multi-Head Attention-Based Gated Recurrent Unit for Intrusion Detection in IIoT Networks
    Ullah, Safi
    Boulila, Wadii
    Koubaa, Anis
    Ahmad, Jawad
    IEEE ACCESS, 2023, 11 : 114590 - 114601
  • [42] Multi-Head Structural Attention-Based Vision Transformer with Sequential Views for 3D Object Recognition
    Bao, Jianjun
    Luo, Ke
    Kou, Qiqi
    He, Liang
    Zhao, Guo
    APPLIED SCIENCES-BASEL, 2025, 15 (06)
  • [43] A multi-head attention-based transformer model for traffic flow forecasting with a comparative analysis to recurrent neural networks
    Reza, Selim
    Ferreira, Marta Campos
    Machado, J. J. M.
    Tavares, Joao Manuel R. S.
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 202
  • [44] Multi-Head Attention based Probabilistic Vehicle Trajectory Prediction
    Kim, Hayoung
    Kim, Dongchan
    Kim, Gihoon
    Cho, Jeongmin
    Huh, Kunsoo
    2020 IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2020, : 1720 - 1725
  • [45] Personalized federated learning based on multi-head attention algorithm
    Jiang, Shanshan
    Lu, Meixia
    Hu, Kai
    Wu, Jiasheng
    Li, Yaogen
    Weng, Liguo
    Xia, Min
    Lin, Haifeng
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (11) : 3783 - 3798
  • [46] Feature Fusion and Multi-head Attention Based Hindi Captioner
    Meghwal, Virendra Kumar
    Mittal, Namita
    Singh, Girdhari
    COMPUTER VISION AND IMAGE PROCESSING, CVIP 2023, PT I, 2024, 2009 : 479 - 487
  • [47] A fiber recognition framework based on multi-head attention mechanism
    Xu, Luoli
    Li, Fenying
    Chang, Shan
    TEXTILE RESEARCH JOURNAL, 2024, 94 (23-24) : 2629 - 2640
  • [48] Predicting disease genes based on multi-head attention fusion
    Zhang, Linlin
    Lu, Dianrong
    Bi, Xuehua
    Zhao, Kai
    Yu, Guanglei
    Quan, Na
    BMC BIOINFORMATICS, 2023, 24 (01)