http://dx.doi.org/10.7744/kjoas.20200006

Tillage boundary detection based on RGB imagery classification for an autonomous tractor  

Kim, Gookhwan (Department of Agricultural Engineering, National Institute of Agricultural Sciences)
Seo, Dasom (Department of Agricultural Engineering, National Institute of Agricultural Sciences)
Kim, Kyoung-Chul (Department of Agricultural Engineering, National Institute of Agricultural Sciences)
Hong, Youngki (Department of Agricultural Engineering, National Institute of Agricultural Sciences)
Lee, Meonghun (Department of Agricultural Engineering, National Institute of Agricultural Sciences)
Lee, Siyoung (Department of Agricultural Engineering, National Institute of Agricultural Sciences)
Kim, Hyunjong (Department of Agricultural Engineering, National Institute of Agricultural Sciences)
Ryu, Hee-Seok (Department of Agricultural Engineering, National Institute of Agricultural Sciences)
Kim, Yong-Joo (Department of Biosystems Mechanical Engineering, Chungnam National University)
Chung, Sun-Ok (Department of Biosystems Mechanical Engineering, Chungnam National University)
Lee, Dae-Hyun (Department of Biosystems Mechanical Engineering, Chungnam National University)
Publication Information
Korean Journal of Agricultural Science / v.47, no.2, 2020, pp. 205-217
Abstract
In this study, a deep learning-based tillage boundary detection method for autonomous tillage by a tractor was developed, consisting of image cropping, object classification, area segmentation, and boundary detection steps. Full HD (1920 × 1080) images were obtained using an RGB camera installed on the hood of a tractor and were cropped into 112 × 112 pixel images to generate a dataset for training the classification model. The classification model was constructed based on convolutional neural networks, and the path boundary was detected using a probability map generated by integrating the softmax outputs. The results show that the F1-score of the classification was approximately 0.91, comparable to that of other deep learning-based classification tasks in the agricultural field. The path boundary was determined with edge detection and the Hough transform and was compared to the actual path boundary. The average lateral error was approximately 11.4 cm, and the average angle error was approximately 8.9°. The proposed technique performs as well as other approaches but, unlike other deep learning-based approaches, requires only low-cost memory to execute. An autonomous farm robot could therefore be developed easily with this technique using a simple hardware configuration.
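The pipeline described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the frame is tiled into 112 × 112 patches, each patch is scored by a classifier (here a brightness-based stub named `classify_stub` stands in for the paper's CNN), the per-patch softmax outputs are assembled into a probability map, and the tillage edge is located where the class flips along each patch row (the paper refines this line further with edge detection and the Hough transform).

```python
import numpy as np

PATCH = 112  # patch size used in the paper (112 x 112 crops of a Full HD frame)

def crop_patches(frame):
    """Tile a (H, W, 3) frame into non-overlapping 112 x 112 patches."""
    h, w = frame.shape[:2]
    rows, cols = h // PATCH, w // PATCH
    patches = (frame[:rows * PATCH, :cols * PATCH]
               .reshape(rows, PATCH, cols, PATCH, 3)
               .transpose(0, 2, 1, 3, 4))  # -> (rows, cols, 112, 112, 3)
    return patches, rows, cols

def classify_stub(patch):
    """Placeholder for the paper's CNN: P(tilled) from mean brightness.
    Hypothetical rule for illustration only (dark soil -> 'tilled')."""
    return float(patch.mean() < 128)

def probability_map(frame):
    """Assemble per-patch classifier outputs into a coarse probability map."""
    patches, rows, cols = crop_patches(frame)
    probs = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            probs[r, c] = classify_stub(patches[r, c])
    return probs

def boundary_columns(probs, thresh=0.5):
    """Per patch row, the first column where the class flips: the tillage edge.
    The paper then fits a line to this edge via the Hough transform."""
    return np.argmax(probs < thresh, axis=1)
```

For a 1920 × 1080 frame this yields a 9 × 17 probability map; on a synthetic frame with dark soil on the left and bright untilled ground on the right, `boundary_columns` returns the patch column of the transition in every row.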
Keywords
autonomous tractor; convolutional neural networks; deep learning; path detection; tractor tillage;