Design and Experiment of Intelligent Feed-pushing Robot Based on Information Fusion

ZHANG Qin, REN Hailin, HU Jiahui

Abstract

Regularly pushing feed back toward the feed fence is an essential part of the dairy cow feeding process. To address the problem that existing feed-pushing robots are single-function and cannot collect and transport feed according to the cows' positions to meet their needs, an intelligent feed-pushing robot for dairy cows was developed. Firstly, a YOLACT instance segmentation model was used to identify cows, feed, and the square rod, and to obtain their masks. Secondly, dynamic objects were removed in ORB-SLAM3 by using the masks, improving positioning accuracy and yielding the robot position in real time. Thirdly, the locations of foraging cows were calculated by combining the masks, the stereo camera depth image, and the robot position, and the distance between the robot and the cattle barn was calculated from the mask and depth image with the square rod as a reference. Finally, during operation the robot kept its distance from the cattle barn unchanged and made independent decisions according to the positions of foraging cows and the feeding time, realizing the multi-mode feeding functions of pushing, collecting-transporting, and cleaning, thereby improving feed utilization efficiency and meeting the cows' free-feeding needs. Experimental results showed that, on the TUM RGB-D dataset, the proposed algorithm effectively reduced positioning error in dynamic environments compared with ORB-SLAM3; the calculation accuracy of the foraging cow positions was ±0.1 m, and every cow was recognized; the calculation accuracy of the distance between the robot and the cattle barn was ±0.8 cm; the working-mode selection accuracy was 100%; and the algorithm ran at 12 f/s. The robot met the requirements of intelligent feeding in complex environments.
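The dynamic-object removal step described in the abstract — discarding ORB feature points that fall on segmented cows before pose estimation — can be illustrated with a minimal sketch. The function name, array shapes, and boolean-mask interface below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def filter_dynamic_keypoints(keypoints, dynamic_mask):
    """Keep only keypoints lying outside the dynamic-object mask.

    keypoints    : (N, 2) array of (u, v) pixel coordinates
    dynamic_mask : (H, W) boolean array, True where an instance
                   segmentation model flagged a dynamic object (e.g. a cow)
    """
    u = keypoints[:, 0].astype(int)
    v = keypoints[:, 1].astype(int)
    keep = ~dynamic_mask[v, u]  # image arrays are indexed (row, col) = (v, u)
    return keypoints[keep]

# Toy example: a 4x4 image whose upper-left quadrant is marked dynamic.
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
pts = np.array([[0, 0], [3, 3], [1, 1], [2, 0]])  # (u, v) pairs
static_pts = filter_dynamic_keypoints(pts, mask)
# static_pts -> [[3, 3], [2, 0]]: points on the masked region are dropped
```

Only the surviving static points would then feed into tracking and bundle adjustment, which is why masking reduces positioning error in dynamic scenes.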
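The cow-localization step — combining a mask pixel, the stereo depth image, and the robot pose — amounts to pinhole back-projection followed by a rigid transform into the world frame. A minimal sketch under assumed intrinsics and a placeholder SLAM pose (all numeric values below are hypothetical):

```python
import numpy as np

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel with known depth into camera coordinates
    using the pinhole model: x = (u - cx) * Z / fx, y = (v - cy) * Z / fy."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics and a pixel taken at a cow-mask centroid.
fx = fy = 500.0
cx, cy = 320.0, 240.0
p_cam = pixel_to_camera(420.0, 240.0, 2.0, fx, fy, cx, cy)  # cow 2 m away

# Transform into the world frame with the robot pose (R, t) from SLAM;
# identity rotation and a unit translation stand in for the real pose.
R = np.eye(3)
t = np.array([1.0, 0.0, 0.0])
p_world = R @ p_cam + t
# p_cam -> [0.4, 0.0, 2.0]; p_world -> [1.4, 0.0, 2.0]
```

The same geometry, applied to pixels on the square-rod mask, would give the robot-to-barn distance that the controller holds constant.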


Keywords: cows, intelligent feed-pushing, robot, instance segmentation, dynamic object elimination, visual SLAM



References


MILLER-CUSHON E K, DEVRIES T J. Feed sorting in dairy cattle: causes, consequences, and management[J]. Journal of Dairy Science, 2017, 100(5): 4172-4183.

ALBRIGHT J L. Feeding behavior of dairy cattle[J]. Journal of Dairy Science, 1993, 76(2): 485-498.

NABOKOV V I. Applications of feed pusher robots on cattle farmings and its economic efficiency[J]. International Transaction Journal of Engineering Management & Applied Sciences & Technologies, 2020, 14(11): 14-20.

JIAO Pande, HE Chengzhu, YANG Junping. Development and manufacture of intelligent push feed robot for cows[J]. Journal of Chinese Agricultural Mechanization, 2018, 39(1): 74-77. (in Chinese)

YUAN Yuhao, TIAN Yuhui, LI Ao, et al. The design of scattering feed and pushing grass robot[J]. Mechanical & Electrical Engineering Technology, 2020, 49(12): 101-103. (in Chinese)

STANKOVSKI S, OSTOJIC G, SENK I, et al. Dairy cow monitoring by RFID[J]. Scientia Agricola, 2012, 69: 75-80.

YU Xiao. Study on precise feeding control system for dairy cattle[D]. Changchun: Jilin University, 2016. (in Chinese)

PORTO S M C, ARCIDIACONO C, GIUMMARRA A, et al. Localisation and identification performances of a real-time location system based on ultra wide band technology for monitoring and tracking dairy cow behaviour in a semi-open free-stall barn[J]. Computers and Electronics in Agriculture, 2014, 108: 221-229.

YANG Zhen. Study on the statistics of cow activity and the detection method of estrus based on UWB positioning[D]. Hohhot: Inner Mongolia University, 2019. (in Chinese)

GUO Qing. Positioning system for cows based on principle of ZigBee[D]. Ji'nan: Shandong University, 2014. (in Chinese)

ACHOUR B, BELKADI M, FILALI I, et al. Image analysis for individual identification and feeding behaviour monitoring of dairy cows based on convolutional neural networks (CNN)[J]. Biosystems Engineering, 2020, 198: 31-49.

ZHANG Qin, HU Jiahui, REN Hailin. Experimental study on intelligent pushing method of feeding assistant robot[J]. Journal of South China University of Technology (Natural Science Edition), 2022, 50(6): 111-120. (in Chinese)

CAMPOS C, ELVIRA R, RODRIGUEZ J J G, et al. ORB-SLAM3: an accurate open-source library for visual, visual-inertial, and multimap SLAM[J]. IEEE Transactions on Robotics, 2021, 37(6): 1874-1890.

ENGEL J, SCHOPS T, CREMERS D. LSD-SLAM: large-scale direct monocular SLAM[C]//European Conference on Computer Vision. Springer, Cham, 2014: 834-849.

PIRE T, FISCHER T, CASTRO G, et al. S-PTAM: stereo parallel tracking and mapping[J]. Robotics and Autonomous Systems, 2017, 93: 27-42.

ZHANG Z, ZHANG J, TANG Q. Mask R-CNN based semantic RGB-D SLAM for dynamic scenes[C]//2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM). IEEE, 2019: 1151-1156.

ZHAO X, ZUO T, HU X. OFM-SLAM: a visual semantic SLAM for dynamic indoor environments[J]. Mathematical Problems in Engineering, 2021, 2021: 1-16.

BESCOS B, FACIL J M, CIVERA J, et al. DynaSLAM: tracking, mapping, and inpainting in dynamic scenes[J]. IEEE Robotics and Automation Letters, 2018, 3(4): 4076-4083.

BLOCH V, LEVIT H, HALACHMI I. Assessing the potential of photogrammetry to monitor feed intake of dairy cows[J]. Journal of Dairy Research, 2019, 86(1): 34-39.

SHELLEY A N, LAU D L, STONE A E, et al. Short communication: measuring feed volume and weight by machine vision[J]. Journal of Dairy Science, 2016, 99(1): 386-391.

BEZEN R, EDAN Y, HALACHMI I. Computer vision system for measuring individual cow feed intake using RGB-D camera and deep learning algorithms[J]. Computers and Electronics in Agriculture, 2020, 172: 105345.

YANG Cunzhi, LI Yuanyuan, YANG Xu, et al. The development of cow intelligent precise feeding robot of FR-200[J]. Journal of Agricultural Mechanization Research, 2014, 36(2): 120-122. (in Chinese)

WAN Chang, TAN Yu, ZHENG Yongjun, et al. Automatic charging of forage pushing robot by magnetic stripe navigation[J]. Transactions of the Chinese Society for Agricultural Machinery, 2018, 49(Supp.): 1-7. (in Chinese)

HAN Zhenhao, LI Jia, YUAN Yanwei, et al. Path recognition of orchard visual navigation based on U-Net[J]. Transactions of the Chinese Society for Agricultural Machinery, 2021, 52(1): 30-39. (in Chinese)

MENG Qingkuan, ZHANG Man, QIU Ruicheng, et al. Navigation line detection for farm machinery based on improved genetic algorithm[J]. Transactions of the Chinese Society for Agricultural Machinery, 2014, 45(10): 39-46. (in Chinese)

BOLYA D, ZHOU C, XIAO F, et al. YOLACT: real-time instance segmentation[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision, 2019: 9157-9166.

STURM J, ENGELHARD N, ENDRES F, et al. A benchmark for the evaluation of RGB-D SLAM systems[C]//2012 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2012: 573-580.

