Sunflower seeds classification based on self-attention Focusing algorithm

Cited by: 4
Authors
Jin, Xiaowei [1 ]
Zhao, Yuhong [1 ]
Bian, Haodong [2 ]
Li, Jianjun [1 ]
Xu, Chuanshuai [1 ]
Affiliations
[1] Inner Mongolia Univ Sci & Technol, Sch Informat Engn, Baotou 014010, Peoples R China
[2] Northeastern Univ, Sch Comp Sci & Engn, Shenyang 110819, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Classification; Image recognition; Self-attention Focusing algorithm; Model lightness; Vision Transformer;
DOI
10.1007/s11694-022-01612-x
CLC classification
TS2 [Food industry]
Discipline code
0832
Abstract
To achieve appearance-based quality classification of sunflower seeds in production scenarios, this paper proposes a multi-object sunflower seed classification method built on a self-attention mechanism. By using Multi-Head Self-Attention to focus on the classification objects while extracting context-dependent information from sunflower seed images, the method classifies seed images with an accuracy of 94.71%. However, visualization experiments showed that the model's attention did not concentrate on the class-activated regions of the image, leaving the model strongly affected by background noise and burdened by high computational cost and a large model size. To address these problems, a self-attention Focusing algorithm is proposed: the connections in the attention weight matrix are evaluated and reconstructed, the less influential connections are pruned, and the attention scores are updated, so that attention gradually concentrates on the class-activated regions over successive iterations and background noise is effectively suppressed. Experiments show that applying the Focusing algorithm raises accuracy to 96.6% while reducing the number of model parameters by 34.3%. Accuracy is thus improved alongside model lightweighting, and the focused attention region makes the model more robust to background noise. These results demonstrate the Focusing algorithm's effectiveness in improving accuracy, focusing attention, and compressing model size, and suggest it can serve as a practical key technique for sunflower seed classification in production settings.
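The abstract describes the Focusing algorithm only at a high level (evaluate the connections in the attention weight matrix, prune the weak ones, and update the attention scores over several iterations). The sketch below is a minimal, illustrative reconstruction of that prune-and-renormalize idea under those stated assumptions; the function name focused_attention and the keep_ratio / iterations hyperparameters are hypothetical and are not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax; every row keeps at least one finite logit here."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def focused_attention(q, k, v, keep_ratio=0.7, iterations=3):
    """Single-head self-attention with an iterative 'focusing' step (illustrative only).

    Each iteration evaluates the current attention weights, prunes the weakest
    connections in every row, and then recomputes (updates) the attention scores
    over the surviving connections, so probability mass gradually concentrates
    on the strongest connections.
    """
    d = q.shape[-1]
    n = q.shape[0]
    scores = q @ k.T / np.sqrt(d)               # raw attention logits, shape (n, n)
    pruned = np.zeros_like(scores, dtype=bool)  # connections removed so far

    attn = softmax(scores, axis=-1)
    for it in range(iterations):
        # Keep progressively fewer connections per row at each iteration.
        k_keep = max(1, int(n * keep_ratio ** (it + 1)))
        row_thresh = np.sort(attn, axis=-1)[:, -k_keep][:, None]
        pruned |= attn < row_thresh             # prune the less influential connections
        # Update the attention scores over the remaining connections.
        attn = softmax(np.where(pruned, -np.inf, scores), axis=-1)
    return attn @ v                             # focused context vectors

# Toy usage: 6 "tokens" (e.g. image patches) with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 8))
out = focused_attention(x, x, x)
print(out.shape)  # (6, 8)
```

In a full Vision Transformer this step would presumably be applied per head inside each encoder block; the single-head NumPy version above only illustrates the iterative prune-and-renormalize mechanism, not the authors' exact implementation.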
Pages: 143-154
Page count: 12
Related papers
50 records in total
  • [31] Lightweight Self-Attention Residual Network for Hyperspectral Classification
    Xia, Jinbiao
    Cui, Ying
    Li, Wenshan
    Wang, Liguo
    Wang, Chao
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [32] Quantum self-attention neural networks for text classification
    Li, Guangxi
    Zhao, Xuanqiang
    Wang, Xin
    SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (04)
  • [33] Question classification task based on deep learning models with self-attention mechanism
    Mondal S.
    Barman M.
    Nag A.
    Multimedia Tools and Applications, 2025, 84 (10) : 7777 - 7806
  • [34] Sentiment Classification Method Based on Multi-channel Features and Self-attention
    Li, Wei-Jiang
    Qi, Fang
    Yu, Zheng-Tao
    Ruan Jian Xue Bao/Journal of Software, 2021, 32 (09): : 2783 - 2800
  • [35] Self-Attention Based Video Summarization
    Li Y.
    Wang J.
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2020, 32 (04): : 652 - 659
  • [36] Malware Classification on Imbalanced Data through Self-Attention
    Ding, Yu
    Wang, ShuPeng
    Xing, Jian
    Zhang, XiaoYu
    Qi, ZiSen
    Fu, Ge
    Qiang, Qian
    Sun, HaoLiang
    Zhang, JianYu
    2020 IEEE 19TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2020), 2020, : 154 - 161
  • [37] Multiple Positional Self-Attention Network for Text Classification
    Dai, Biyun
    Li, Jinlong
    Xu, Ruoyi
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 7610 - 7617
  • [38] Quantum self-attention neural networks for text classification
    Guangxi LI
    Xuanqiang ZHAO
    Xin WANG
Science China (Information Sciences), 2024, 67 (04) : 301 - 313
  • [39] Point Cloud Classification Segmentation Model Based on Self-Attention and Edge Convolution
    Shen, Lu
    Yang, Jiazhi
    Zhou, Guoqing
    Huo, Jiaxin
    Chen, Mengqiang
    Yu, Guangwang
    Zhang, Yuyang
    Computer Engineering and Applications, 2023, 59 (19) : 106 - 113
  • [40] Conditional self-attention generative adversarial network with differential evolution algorithm for imbalanced data classification
    Niu, Jiawei
    Liu, Zhunga
    Pan, Quan
    Yang, Yanbo
Li, Yang
    CHINESE JOURNAL OF AERONAUTICS, 2023, 36 (03) : 303 - 315