Research on Contactless Detection of Sow Backfat Thickness Based on Segmented Images with Feature Visualization

Cited by: 1
Authors
Cao, Tingjin [1 ,2 ]
Li, Xuan [1 ,2 ,3 ,4 ]
Liu, Xiaolei [5 ]
Liang, Hao [1 ,2 ]
Wang, Haiyan [2 ,6 ]
Xu, Dihong [1 ,2 ]
Affiliations
[1] Huazhong Agr Univ, Coll Engn & Technol, Wuhan 430070, Peoples R China
[2] Huazhong Agr Univ, Shenzhen Inst Nutr & Hlth, Shenzhen 518000, Peoples R China
[3] Chinese Acad Agr Sci, Agr Genom Inst Shenzhen, Shenzhen 518000, Peoples R China
[4] Guangdong Lab Lingnan Modern Agr, Shenzhen Branch, Shenzhen 518000, Peoples R China
[5] Hubei Hongshan Lab, Wuhan 430070, Peoples R China
[6] Minist Agr & Rural Affairs, Key Lab Smart Farming Agr Anim, Wuhan 430070, Peoples R China
Source
APPLIED SCIENCES-BASEL, 2024, Vol. 14, No. 02
Keywords
backfat thickness; non-contact detection; feature visualization; residual network; TRACKING; PIGS
DOI
10.3390/app14020752
Chinese Library Classification
O6 [Chemistry]
Subject Classification Code
0703
Abstract
Existing methods for detecting sow backfat thickness are stressful to the animal, costly, and unsuitable for real-time monitoring. To address this, this paper proposes a non-contact backfat detection method that combines a residual network with image segmentation guided by neural-network feature visualization. The core idea is to remove irrelevant image information in order to improve the accuracy of the backfat thickness detection model. Feature visualization shows that irrelevant regions of the image can appear as brightly in the feature maps as the relevant regions, so an image segmentation algorithm is used to isolate the relevant image regions, and model performance before and after segmentation is compared to verify the feasibility of the approach. To assess generalization, the data were randomly divided into five groups; on the test sets, the coefficients of determination (R²) of all five groups exceeded 0.89 (mean 0.91) and the mean absolute errors (MAE) were below 0.66 mm (mean 0.54 mm), indicating high detection accuracy and strong robustness. To explain this accuracy and increase the credibility of the model in practice, feature visualization was used to statistically analyze the relevant and irrelevant features that the residual network extracts from sow back images; the informative regions were found to be the hip edge, the area near the body height point, the area near the backfat thickness measurement point (P2), and the lateral contour edge. The first three are consistent with previous research on sow backfat, which explains the model's high accuracy, while the lateral contour edge was additionally found to be effective for predicting backfat thickness. To quantify the influence of irrelevant features, UNet was used to segment out the image regions corresponding to irrelevant features and obtain sow contour images, from which a second backfat thickness detection model was built. This model achieved R² above 0.91 (mean 0.94) and MAE below 0.65 mm (mean 0.44 mm). Compared with the model before segmentation, the average R² increased by 3.3% and the average MAE decreased by 18.5%, indicating that irrelevant features reduce detection accuracy. The method can serve as a reference for farmers to dynamically monitor sow backfat and manage their farms precisely.
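The abstract describes a three-part pipeline: a residual-network regressor that predicts backfat thickness from a sow back image, neural-network feature visualization to reveal which image regions the regressor relies on, and evaluation with R² and MAE on held-out splits. The sketch below illustrates how such a pipeline can be set up in PyTorch. It is a minimal illustration under stated assumptions, not the authors' released code: the ResNet-18 backbone, the Grad-CAM-style heatmap as the feature-visualization technique, the single-output regression head, and all function and class names (BackfatRegressor, gradcam_heatmap, evaluate) are choices made for this example.

```python
# Minimal sketch (assumed implementation, not the authors' code):
# ResNet regression of backfat thickness, Grad-CAM-style feature
# visualization, and R2/MAE evaluation as described in the abstract.
import torch
import torch.nn as nn
from torchvision import models


class BackfatRegressor(nn.Module):
    """ResNet-18 backbone with a single-output regression head (assumed architecture)."""

    def __init__(self):
        super().__init__()
        self.backbone = models.resnet18(weights=None)
        # Replace the classification head with one regression output (thickness in mm).
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):
        return self.backbone(x).squeeze(1)


def gradcam_heatmap(model, image):
    """Grad-CAM-style map over the last conv block: which back regions drive the prediction.

    `model` should typically be in eval() mode; `image` is a (3, H, W) tensor.
    """
    feats, grads = [], []
    layer = model.backbone.layer4
    h_fwd = layer.register_forward_hook(lambda m, i, o: feats.append(o))
    h_bwd = layer.register_full_backward_hook(lambda m, gi, go: grads.append(go[0]))

    pred = model(image.unsqueeze(0))   # (1,) predicted backfat thickness
    model.zero_grad()
    pred.sum().backward()              # gradient of the regression output
    h_fwd.remove()
    h_bwd.remove()

    weights = grads[0].mean(dim=(2, 3), keepdim=True)              # channel importance
    cam = torch.relu((weights * feats[0]).sum(dim=1)).squeeze(0)   # (h, w) activation map
    return (cam / (cam.max() + 1e-8)).detach()                     # bright = influential region


@torch.no_grad()
def evaluate(model, loader):
    """Report R2 and MAE (mm), the two metrics used in the abstract."""
    preds, targets = [], []
    model.eval()
    for images, thickness in loader:
        preds.append(model(images))
        targets.append(thickness)
    y_hat, y = torch.cat(preds), torch.cat(targets)
    mae = (y_hat - y).abs().mean().item()
    ss_res = ((y - y_hat) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = (1 - ss_res / ss_tot).item()
    return r2, mae
```

In the paper's workflow, the bright-but-irrelevant regions identified by such a heatmap are removed with a UNet-based segmentation of the sow contour before the regressor is retrained; any standard UNet implementation could play that role in this sketch.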
Pages: 14
Related Papers
50 records in total
• [1] Jian, Yue; Pu, Shihua; Zhu, Jiaming; Zhang, Jianlong; Xing, Wenwen. Estimation of Sow Backfat Thickness Based on Machine Vision. ANIMALS, 2024, 14 (23).
• [2] Yu, Mengyuan; Zheng, Hongya; Xu, Dihong; Shuai, Yonghui; Tian, Shanfeng; Cao, Tingjin; Zhou, Mingyan; Zhu, Yuhua; Zhao, Shuhong; Li, Xuan. Non-contact detection method of pregnant sows backfat thickness based on two-dimensional images. ANIMAL GENETICS, 2022, 53 (06): 769-781.
• [3] Zhang, Bochuan; Tang, Bo; Zhang, Cong; Hu, Ruiguang; Yun, Hongquan; Xiao, Liping. Research on vehicle detection based on background feature analysis in SAR images. AOPC 2017: SPACE OPTICS AND EARTH IMAGING AND SPACE NAVIGATION, 2017, 10463.
• [4] Lebedev, M. A.; Rubis, A. Yu.; Gorbatsevich, V. S.; Vizilter, Yu. V.; Vygolov, O. V. Change detection in segmented images based on reference filters. COMPUTER OPTICS, 2016, 40 (01): 80-86.
• [5] Geng, Xiaomeng; Shi, Lei; Yang, Jie; Li, Pingxiang; Zhao, Lingli; Sun, Weidong; Zhao, Jinqi. Ship Detection and Feature Visualization Analysis Based on Lightweight CNN in VH and VV Polarization Images. REMOTE SENSING, 2021, 13 (06).
• [6] Kang, Shaopeng; Qiang, Hongbin; Yang, Jing; Liu, Kailei; Qian, Wenbin; Li, Wenpeng; Pan, Yanfei. Research on a Feature Point Detection Algorithm for Weld Images Based on Deep Learning. ELECTRONICS, 2024, 13 (20).
• [7] van de Weijer, J.; Gevers, T. Tensor based feature detection for color images. 12TH COLOR IMAGING CONFERENCE: COLOR SCIENCE AND ENGINEERING SYSTEMS, TECHNOLOGIES, APPLICATIONS, 2004: 100-105.
• [8] Liang, Dongdong; Wu, Xu. Research on Tourist Map Visualization Based on Color Images. 2010 18TH INTERNATIONAL CONFERENCE ON GEOINFORMATICS, 2010.
• [9] Selamat, Nur Asmiza; Ali, Sawal Hamid Md; Minhad, Khairun Nisa'; Sampe, Jahariah. Feature Selection Analysis of Chewing Activity Based on Contactless Food Intake Detection. INTERNATIONAL JOURNAL OF INTEGRATED ENGINEERING, 2021, 13 (05): 38-48.
• [10] Song, Yiyun; Luo, Xin; Chen, Yanyang; Adugna, Tesfaye; Wei, Xufeng. Research on object detection in near-infrared remote sensing images based on feature transfer. 2024 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2024), 2024: 7752-7755.