Tillage boundary detection based on RGB imagery classification for an autonomous tractor

  • Kim, Gookhwan (Department of Agricultural Engineering, National Institute of Agricultural Sciences) ;
  • Seo, Dasom (Department of Agricultural Engineering, National Institute of Agricultural Sciences) ;
  • Kim, Kyoung-Chul (Department of Agricultural Engineering, National Institute of Agricultural Sciences) ;
  • Hong, Youngki (Department of Agricultural Engineering, National Institute of Agricultural Sciences) ;
  • Lee, Meonghun (Department of Agricultural Engineering, National Institute of Agricultural Sciences) ;
  • Lee, Siyoung (Department of Agricultural Engineering, National Institute of Agricultural Sciences) ;
  • Kim, Hyunjong (Department of Agricultural Engineering, National Institute of Agricultural Sciences) ;
  • Ryu, Hee-Seok (Department of Agricultural Engineering, National Institute of Agricultural Sciences) ;
  • Kim, Yong-Joo (Department of Biosystems Mechanical Engineering, Chungnam National University) ;
  • Chung, Sun-Ok (Department of Biosystems Mechanical Engineering, Chungnam National University) ;
  • Lee, Dae-Hyun (Department of Biosystems Mechanical Engineering, Chungnam National University)
  • Received : 2020.01.08
  • Accepted : 2020.02.21
  • Published : 2020.06.01

Abstract

In this study, a deep learning-based tillage boundary detection method for autonomous tillage by a tractor was developed, consisting of image cropping, object classification, area segmentation, and boundary detection steps. Full HD (1920 × 1080) images were obtained using an RGB camera installed on the hood of a tractor and were cropped into 112 × 112 pixel images to generate a dataset for training the classification model. The classification model was constructed based on convolutional neural networks, and the path boundary was detected using a probability map generated by integrating the softmax outputs. The results show that the F1-score of the classification was approximately 0.91, comparable to the performance of other deep learning-based classification tasks in agriculture. The path boundary was determined with edge detection and the Hough transform and was compared to the actual path boundary. The average lateral error was approximately 11.4 cm, and the average angle error was approximately 8.9°. The proposed technique performs as well as other approaches, yet it requires only a small amount of memory to execute, unlike other deep learning-based approaches. An autonomous farm robot could therefore be developed easily with the proposed technique using a simple hardware configuration.
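A minimal sketch of the pipeline described above is given below. This is an illustration, not the authors' implementation: OpenCV and NumPy are assumed, the CNN classifier is omitted (only its per-tile softmax scores are taken as input), and the function names, the 0.5 binarization threshold, the upsampling step, and the Canny/Hough parameters are all assumptions for demonstration purposes.

```python
import numpy as np
import cv2

TILE = 112  # crop size reported in the abstract


def tile_image(frame):
    """Split a full-HD BGR frame into non-overlapping 112 x 112 tiles."""
    h, w = frame.shape[:2]
    rows, cols = h // TILE, w // TILE
    tiles = [frame[r * TILE:(r + 1) * TILE, c * TILE:(c + 1) * TILE]
             for r in range(rows) for c in range(cols)]
    return np.stack(tiles), rows, cols


def boundary_from_scores(scores, rows, cols, frame_hw, thresh=0.5):
    """Assemble per-tile softmax scores (e.g., P(tilled)) into a coarse
    probability map, then recover the boundary line with edge detection
    and the Hough transform."""
    prob_map = scores.reshape(rows, cols).astype(np.float32)
    # Upsample the coarse map back to frame resolution (assumed step).
    full = cv2.resize(prob_map, (frame_hw[1], frame_hw[0]))
    binary = (full > thresh).astype(np.uint8) * 255
    edges = cv2.Canny(binary, 50, 150)
    # HoughLines(image, rho resolution, theta resolution, vote threshold)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 100)
    if lines is None:
        return None
    return tuple(lines[0][0])  # strongest line as (rho, theta)
```

In this sketch, the paper's CNN would supply one softmax score per 112 × 112 tile; the returned (rho, theta) line could then be compared against the surveyed path boundary to compute lateral and angular errors such as those reported above.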

Cited by

  1. Weakly Supervised Crop Area Segmentation for an Autonomous Combine Harvester. Sensors 21(14):4801, 2021. https://doi.org/10.3390/s21144801