Multi-cattle tracking with appearance and motion models in closed barns using deep learning

  • Han, Shujie (Department of Electronics Engineering, Jeonbuk National University) ;
  • Fuentes, Alvaro (Department of Electronics Engineering, Jeonbuk National University) ;
  • Yoon, Sook (Department of Computer Engineering, Mokpo National University) ;
  • Park, Jongbin (Department of Electronics Engineering, Jeonbuk National University) ;
  • Park, Dong Sun (Department of Electronics Engineering, Jeonbuk National University)
  • Received : 2022.09.01
  • Accepted : 2022.09.29
  • Published : 2022.09.30

Abstract

Precision livestock monitoring promises greater management efficiency for farmers and higher welfare standards for animals. Recent studies on video-based animal activity recognition and tracking have shown promising solutions for understanding animal behavior. In a typical cattle farm setup, surveillance cameras are installed diagonally above the barn to monitor the animals constantly. Under these conditions, tracking individuals requires addressing challenges such as occlusion and similar visual appearance, which are the main causes of track breakage and misidentification. This paper presents a framework for multi-cattle tracking in closed barns using appearance and motion models. To overcome these challenges, we modify the DeepSORT algorithm to achieve higher tracking accuracy through three contributions. First, we reduce the weight of appearance information in the association cost. Second, we use an Ensemble Kalman Filter to predict the random motion of cattle. Third, we propose a supplementary matching algorithm that compares absolute cattle positions in the barn to reassign lost tracks. The matching algorithm assumes that the number of cattle in the barn is fixed, so new trajectories are most likely to emerge at the edge of the barn. Experiments are performed on our dataset collected from two cattle farms. Our algorithm achieves 70.37% HOTA, 77.39% AssA, and 81.74% IDF1, an improvement of 1.53%, 4.17%, and 0.96%, respectively, over the original method.
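The association logic described above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the appearance weight, barn dimensions, and edge margin are hypothetical values, and the reassignment is a simplified greedy nearest-position match.

```python
import numpy as np

# Illustrative only: the paper reduces the weight of appearance information
# relative to motion; the value 0.2 below is an assumed example, not the
# authors' actual setting.
APPEARANCE_WEIGHT = 0.2

def combined_cost(motion_cost, appearance_cost, w=APPEARANCE_WEIGHT):
    """Blend motion and appearance distances, letting motion dominate."""
    return (1.0 - w) * motion_cost + w * appearance_cost

def reassign_lost_tracks(lost_tracks, new_dets, barn_w, barn_h,
                         edge_margin=50.0):
    """Supplementary matching sketch: since the herd size is fixed, a
    detection well inside the barn cannot be a genuinely new animal, so it
    is reassigned to the nearest lost track by absolute position.
    Detections near the barn edge are left free to start new trajectories.

    lost_tracks: {track_id: (x, y)} last known positions
    new_dets:    {det_id: (x, y)} unmatched detections
    """
    assignments = {}
    for det_id, (x, y) in new_dets.items():
        near_edge = (x < edge_margin or y < edge_margin or
                     x > barn_w - edge_margin or y > barn_h - edge_margin)
        if near_edge or not lost_tracks:
            continue  # edge detections may legitimately be new tracks
        # greedy nearest lost track by Euclidean distance in barn coordinates
        track_id = min(lost_tracks,
                       key=lambda t: np.hypot(x - lost_tracks[t][0],
                                              y - lost_tracks[t][1]))
        assignments[det_id] = track_id
    return assignments
```

For example, with two lost tracks and two unmatched detections, a detection in the barn interior is snapped back to the closest lost track, while a detection near the edge is left unassigned.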

Acknowledgement

This work was supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture and Forestry (IPET) and the Korea Smart Farm R&D Foundation (KosFarm) through the Smart Farm Innovation Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA), the Ministry of Science and ICT (MSIT), and the Rural Development Administration (RDA) (421044-04).
