GA-YOLO: A Lightweight YOLO Model for Dense and Occluded Grape Target Detection

Cited: 9
Authors
Chen, Jiqing [1 ,2 ]
Ma, Aoqiang [1 ]
Huang, Lixiang [1 ]
Su, Yousheng [1 ]
Li, Wenqu [1 ]
Zhang, Hongdu [1 ]
Wang, Zhikui [1 ]
Affiliations
[1] Guangxi Univ, Coll Mechatron Engn, Nanning 530007, Peoples R China
[2] Guangxi Mfg Syst & Adv Mfg Technol Key Lab, Nanning 530007, Peoples R China
Keywords
picking robot; computer vision; grape detection; GA-YOLO; dense and occluded target; lightweight model; AUTOMATIC DETECTION; BUNCH DETECTION; FASTER; FRUITS
DOI
10.3390/horticulturae9040443
Chinese Library Classification: S6 [Horticulture]
Discipline Classification Code: 0902
Abstract
Picking robots have become an important direction in smart agriculture, and detecting the position of fruit is the key to robotic picking. However, existing detection models suffer from missed detections and slow inference on dense and occluded grape targets, and their parameter counts are too large for deployment on mobile terminals. This paper proposes a lightweight GA-YOLO model. First, a new backbone network, SE-CSPGhostnet, is designed, which greatly reduces the number of model parameters. Second, an adaptive spatial feature fusion mechanism is used to address the difficulty of detecting dense and occluded grapes. Finally, a new loss function is constructed to improve detection efficiency. In 2022, a detection experiment was carried out on image data collected in the Bagui rural area of the Guangxi Zhuang Autonomous Region. The results demonstrate that GA-YOLO achieves an mAP of 96.87%, a detection speed of 55.867 FPS, and a parameter count of 11.003 M. Compared with the baseline model, GA-YOLO improves mAP by 3.69%, increases detection speed by 20.245 FPS, and reduces parameters by 82.79%. The GA-YOLO model thus not only improves the detection accuracy of dense and occluded targets but also shrinks the model and accelerates detection.
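To make the backbone idea concrete: the name SE-CSPGhostnet suggests combining Ghost convolutions (from GhostNet, which replace half of an ordinary convolution's output channels with cheap depthwise "ghost" features) with squeeze-and-excitation (SE) channel attention inside a CSP-style backbone. The PyTorch sketch below is a hypothetical reconstruction of just those two building blocks under that assumption; the class names GhostConv and SE, the layer sizes, and the activation choices are illustrative and are not taken from the paper, whose exact configuration the abstract does not give.

```python
# Minimal sketch of a Ghost convolution followed by an SE block --
# the kind of parameter-saving unit a backbone like SE-CSPGhostnet
# would presumably stack. Hypothetical reconstruction, not the
# paper's actual code.
import torch
import torch.nn as nn

class GhostConv(nn.Module):
    """Half the output channels come from a dense conv, the other half
    from a cheap depthwise conv applied to the first half -- this is
    where the parameter reduction comes from."""
    def __init__(self, c_in, c_out, k=1):
        super().__init__()
        c_half = c_out // 2
        self.primary = nn.Sequential(
            nn.Conv2d(c_in, c_half, k, padding=k // 2, bias=False),
            nn.BatchNorm2d(c_half), nn.SiLU())
        self.cheap = nn.Sequential(  # depthwise: groups == channels
            nn.Conv2d(c_half, c_half, 3, padding=1, groups=c_half, bias=False),
            nn.BatchNorm2d(c_half), nn.SiLU())

    def forward(self, x):
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)

class SE(nn.Module):
    """Channel attention: global average pool, squeeze to c/r channels,
    expand back, and rescale the input channel-wise."""
    def __init__(self, c, r=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(c, c // r, 1), nn.ReLU(),
            nn.Conv2d(c // r, c, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.fc(x)

x = torch.randn(1, 64, 80, 80)            # a feature map at 80x80
block = nn.Sequential(GhostConv(64, 128), SE(128))
print(block(x).shape)                      # torch.Size([1, 128, 80, 80])
```

Stacking units like this throughout the backbone is presumably what yields the reported 82.79% parameter reduction, though the abstract does not specify the stage layout.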
Pages: 22
Related Papers (showing 10 of 50)
  • [1] Liu, Chuanyao; Wei, Shuangfeng; Zhong, Shaobo; Yu, Fan. YOLO-PowerLite: A Lightweight YOLO Model for Transmission Line Abnormal Target Detection. IEEE Access, 2024, 12: 105004-105015.
  • [2] Xia, Zhongyi; Zhou, Houkui; Yu, Huimin; Hu, Haoji; Zhang, Guangqun; Hu, Junguo; He, Tao. YOLO-MTG: A Lightweight YOLO Model for Multi-Target Garbage Detection. Signal, Image and Video Processing, 2024, 18 (6-7): 5121-5136.
  • [3] Liu, Yunxiang; Luo, Peng. YOLO-TS: A Lightweight YOLO Model for Traffic Sign Detection. IEEE Access, 2024, 12: 169013-169023.
  • [4] Tian, Yunong; Wang, Shihui; Li, En; Yang, Guodong; Liang, Zize; Tan, Min. MD-YOLO: Multi-Scale Dense YOLO for Small Target Pest Detection. Computers and Electronics in Agriculture, 2023, 213.
  • [5] Song, Qi; Yao, Bodan; Xue, Yunlong; Ji, Shude. MS-YOLO: A Lightweight and High-Precision YOLO Model for Drowning Detection. Sensors, 2024, 24 (21).
  • [6] Li, Yongjun; Li, Shasha; Du, Haohao; Chen, Lijia; Zhang, Dongming; Li, Yao. YOLO-ACN: Focusing on Small Target and Occluded Object Detection. IEEE Access, 2020, 8: 227288-227303.
  • [7] Li, Shuo; Tao, Tao; Zhang, Yun; Li, Mingyang; Qu, Huiyan. YOLO v7-CS: A YOLO v7-Based Model for Lightweight Bayberry Target Detection Count. Agronomy-Basel, 2023, 13 (12).
  • [8] Qu, Yuanhao; Zhang, Fengshou. MGL-YOLO: A Lightweight Barcode Target Detection Algorithm. Sensors, 2024, 24 (23).
  • [9] Wang, Yin; Shen, Lingxin; Li, Maohuan; Sun, Qianlai; Li, Xiaosong. PV-YOLO: Lightweight YOLO for Photovoltaic Panel Fault Detection. IEEE Access, 2023, 11: 10966-10976.
  • [10] Zhao, Jifei; Du, Chenfan; Li, Yi; Mudhsh, Mohammed; Guo, Dawei; Fan, Yuqian; Wu, Xiaoying; Wang, Xinfa; Almodfer, Rolla. YOLO-Granada: A Lightweight Attentioned YOLO for Pomegranates Fruit Detection. Scientific Reports, 2024, 14 (1).