Experimental Study on Intelligent Pushing Method of Feeding Assistant Robot

Cited: 0
Authors
Zhang Q. [1 ]
Hu J. [1 ]
Ren H. [1 ]
Affiliations
[1] School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou
Keywords
Deep learning; Feeding cows; Intelligent pushing; Robot; Two-dimensional code
DOI
10.12141/j.issn.1000-565X.210621
Abstract
In the feeding of dairy cows on medium- and large-scale pastures, existing feed-pushing robots perform only a single, uniform pushing function and cannot meet the individualized feeding needs of each cow. To solve this problem, this paper proposes an intelligent pushing method for a feeding assistant robot. Firstly, two-dimensional code labels are introduced as positioning labels on the cow neck rails, and the detection-box regions of the two-dimensional code labels and the cows' heads are obtained with a YOLOv4 deep learning model. Then, the detection-box regions of the two-dimensional code labels are identified and tracked in real time through preprocessing and prediction algorithms, and each two-dimensional code label is matched with a cow head to determine the neck-rail position of the foraging cow. Finally, based on the cow-rail position matching information and the residual feed distribution, the robot's push plate is controlled to change its push angle and realize personalized pushing, so as to meet the individual feeding needs of the dairy cows. The experimental results show that the recognition rate of the proposed intelligent pushing method is 96.0%; when a two-dimensional code label is lost for 60 consecutive frames, the tracking-prediction error for the two-dimensional code stays within ±2.85%; the processing time per frame on the graphics processor is 34.4 ms; and the accuracy of intelligent pushing is 100%, which meets the real-time requirements of robot intelligent pushing in the complex environment of a cowshed. © 2022, Editorial Department, Journal of South China University of Technology. All rights reserved.
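The abstract describes a pipeline of YOLOv4 detection, tracking/prediction of temporarily lost label boxes, label-to-head matching, and per-rail push-plate control. The Python sketch below illustrates the latter steps under stated assumptions: the Detection fields, the constant-velocity prediction, the horizontal-distance matching criterion, and all thresholds and angles are illustrative placeholders, since the paper specifies only the overall method, not these details.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

@dataclass
class Detection:
    """One YOLOv4 detection box; field names are illustrative assumptions."""
    x1: float
    y1: float
    x2: float
    y2: float
    cls: str            # "qr_label" or "cow_head" (class names assumed)
    rail_id: int = -1   # decoded neck-rail number, meaningful only for QR labels

def x_center(d: Detection) -> float:
    return 0.5 * (d.x1 + d.x2)

def predict_center(prev: Tuple[float, float], curr: Tuple[float, float],
                   frames_ahead: int) -> Tuple[float, float]:
    """Extrapolate the center of a QR box lost for several frames.
    The paper reports tracking a label lost for up to 60 consecutive frames;
    its prediction algorithm is not given, so this constant-velocity
    extrapolation is only a plausible placeholder."""
    vx = curr[0] - prev[0]
    vy = curr[1] - prev[1]
    return (curr[0] + vx * frames_ahead, curr[1] + vy * frames_ahead)

def match_heads_to_rails(qr_boxes: List[Detection],
                         head_boxes: List[Detection],
                         max_dx: float = 80.0) -> Dict[int, Detection]:
    """Match each cow head to the nearest QR neck-rail label by horizontal
    center distance (criterion assumed; the paper states only that labels
    and heads are matched)."""
    matches: Dict[int, Detection] = {}
    for head in head_boxes:
        nearest: Optional[Detection] = min(
            qr_boxes,
            key=lambda q: abs(x_center(q) - x_center(head)),
            default=None)
        if nearest is not None and abs(x_center(nearest) - x_center(head)) <= max_dx:
            matches[nearest.rail_id] = head
    return matches

def push_angle(rail_occupied: bool, residual_feed_kg: float,
               retracted_deg: float = 0.0, pushing_deg: float = 25.0) -> float:
    """Per-rail push-plate angle; thresholds and angles are placeholders,
    not values from the paper."""
    if rail_occupied and residual_feed_kg < 1.0:
        return pushing_deg   # push feed toward an occupied rail running low
    return retracted_deg     # otherwise keep the plate retracted
```

In such a scheme, each frame would be processed by decoding rail IDs from the detected QR boxes, substituting predicted centers for labels lost in recent frames, matching heads to rails, and then feeding the resulting occupancy map together with the measured residual-feed distribution into push_angle for each rail position.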
Pages: 111-120
References
28 references in total
  • [1] CHAPINAL N, VEIRA D M, WEARY D M, Et al., Validation of a system for monitoring individual feeding and drinking behavior and intake in group-housed cattle[J], Journal of Dairy Science, 90, 12, pp. 5732-5736, (2007)
  • [2] RUMBA R, NIKITENKO A., Development of free-flowing pile pushing algorithm for autonomous mobile feed-pushing robots in cattle farms[C], Proceedings of the 17th International Scientific Conference: Engineering for Rural Development, pp. 958-963, (2018)
  • [3] YANG S, LIANG S, ZHENG Y, Et al., Integrated navigation models of a mobile fodder-pushing robot based on a standardized cow husbandry environment[J], Transactions of the ASABE, 63, 2, pp. 221-230, (2020)
  • [4] RIC feed-weight trough
  • [5] BROWN-BRANDL T M, ADRION F, GALLMANN E, Et al., Development and validation of a low-frequency RFID system for monitoring grow-finish pig feeding and drinking behavior, Proceedings of the 10th International Livestock Environment Symposium, pp. 11-19, (2018)
  • [6] ADRION F, KELLER M, BOZZOLINI G B, Et al., Setup, test and validation of a UHF RFID system for monitoring feeding behaviour of dairy cows[J], Sensors, 20, 24, (2020)
  • [7] GAO Zhenjiang, GUO Yuehu, MENG Hewei, Et al., Design of self-propelled precise feeding machine control system for single dairy cow, Transactions of the Chinese Society for Agricultural Machinery, 43, 11, pp. 226-230, (2012)
  • [8] GAO Zhenjiang, LI Hui, MENG Hewei, Study on concentrated precise feeding pattern based on feeding technology of TMR, Transactions of the Chinese Society of Agricultural Engineering, 29, 7, pp. 148-154, (2013)
  • [9] YANG Q, XIAO D, LIN S., Feeding behavior recognition for group-housed pigs with the faster R-CNN[J], Computers and Electronics in Agriculture, 155, pp. 453-460, (2018)
  • [10] ACHOUR B, BELKADI M, FILALI I, Et al., Image analysis for individual identification and feeding behaviour monitoring of dairy cows based on convolutional neural networks (CNN)[J], Biosystems Engineering, 198, pp. 31-49, (2020)