Sunflower seeds classification based on self-attention Focusing algorithm

Cited by: 4
Authors
Jin, Xiaowei [1 ]
Zhao, Yuhong [1 ]
Bian, Haodong [2 ]
Li, Jianjun [1 ]
Xu, Chuanshuai [1 ]
Affiliations
[1] Inner Mongolia Univ Sci & Technol, Sch Informat Engn, Baotou 014010, Peoples R China
[2] Northeastern Univ, Sch Comp Sci & Engn, Shenyang 110819, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Classification; Image recognition; Self-attention Focusing algorithm; Model lightness; Vision Transformer;
DOI
10.1007/s11694-022-01612-x
Chinese Library Classification (CLC)
TS2 [Food Industry]
Discipline Code
0832
Abstract
To achieve quality classification of sunflower seeds from their appearance in production settings, this paper proposes a multi-object sunflower seed classification method based on a self-attention mechanism. By using Multi-Head Self-attention to focus on the classification objects while extracting context-dependent information from seed images, the method classifies sunflower seed object images with an accuracy of 94.71%. Visualization experiments, however, showed that the model's attention failed to concentrate on the class-activated regions of the image, leaving the model strongly affected by background noise and burdened by high computational cost and a large model size. To address these problems, this paper proposes a self-attention Focusing algorithm that evaluates and reconstructs the connections in the attention weight matrix, prunes the less influential connections, and updates the attention scores, so that attention gradually converges on the class-activated regions over successive iterations and background noise is effectively suppressed. Experiments show that applying the Focusing algorithm raises accuracy to 96.6% while reducing the number of model parameters by 34.3%: accuracy improves, the model becomes lighter, and the focused attention regions make it more robust to background noise. These results demonstrate the effectiveness of the self-attention Focusing algorithm in improving accuracy, focusing attention, and compressing model size, and its suitability as a key technique for sunflower seed classification in practical production scenarios.
Pages: 143-154
Number of pages: 12
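
The abstract describes the Focusing algorithm only at a high level: evaluating the connections in the attention weight matrix, pruning the weaker ones, and updating the remaining attention scores so that attention converges on the class-activated regions. The NumPy sketch below illustrates that general idea for a single attention head; the function name focused_self_attention, the keep_ratio parameter, and the top-k pruning criterion are illustrative assumptions, not the authors' actual implementation or selection rule.

import numpy as np

def focused_self_attention(Q, K, V, keep_ratio=0.7):
    """Single-head self-attention with connection pruning (illustrative sketch).

    Q, K, V: (num_tokens, dim) arrays for one attention head.
    keep_ratio: fraction of attention connections retained per query token
                (hypothetical parameter; the paper's exact criterion may differ).
    """
    d = Q.shape[-1]
    # Standard scaled dot-product attention weights (softmax over keys).
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    # Evaluate the connections: keep only the strongest keep_ratio of weights
    # per query and zero out the weaker (presumed background) connections.
    k = max(1, int(keep_ratio * weights.shape[-1]))
    threshold = np.sort(weights, axis=-1)[:, -k][:, None]
    pruned = np.where(weights >= threshold, weights, 0.0)

    # Update the attention scores: renormalize the retained connections so
    # each query's weights again sum to 1 before aggregating the values.
    pruned /= pruned.sum(axis=-1, keepdims=True)
    return pruned @ V

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tokens = rng.normal(size=(16, 32))   # e.g. 16 patch tokens of dimension 32
    out = focused_self_attention(tokens, tokens, tokens, keep_ratio=0.5)
    print(out.shape)                     # (16, 32)

Repeating such a prune-and-renormalize step over successive iterations is what would gradually concentrate the attention mass on a shrinking set of tokens, consistent with the focusing behaviour the abstract reports; how the pruning translates into the reported 34.3% parameter reduction is not detailed in this record.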