http://dx.doi.org/10.5389/KSAE.2022.64.1.015

Development of 3D Crop Segmentation Model in Open-field Based on Supervised Machine Learning Algorithm  

Jeong, Young-Joon (Department of Rural Systems Engineering, Global Smart Farm Convergence Major, Seoul National University)
Lee, Jong-Hyuk (Department of Rural Systems Engineering, Global Smart Farm Convergence Major, Seoul National University)
Lee, Sang-Ik (Department of Rural Systems Engineering, Global Smart Farm Convergence Major, Seoul National University)
Oh, Bu-Yeong (Division of Soil and Fertilizer, National Institute of Agricultural Sciences, Rural Development Administration)
Ahmed, Fawzy (Department of Rural Systems Engineering, Global Smart Farm Convergence Major, Seoul National University)
Seo, Byung-Hun (Department of Rural Systems Engineering, Global Smart Farm Convergence Major, Seoul National University)
Kim, Dong-Su (Department of Rural Systems Engineering, Global Smart Farm Convergence Major, Seoul National University)
Seo, Ye-Jin (Department of Rural Systems Engineering, Global Smart Farm Convergence Major, Seoul National University)
Choi, Won (Department of Landscape Architecture and Rural Systems Engineering, Research Institute of Agriculture and Life Sciences, Global Smart Farm Convergence Major, College of Agriculture and Life Sciences, Seoul National University)
Publication Information
Journal of The Korean Society of Agricultural Engineers, v.64, no.1, 2022, pp. 15-26
Abstract
A 3D open-field farm model developed from UAV (Unmanned Aerial Vehicle) data can make crop monitoring easier and can also serve as an important dataset for fields such as remote sensing and precision agriculture. Automatically separating crops from non-crop areas is essential because manual labeling is extremely laborious and unsuitable for continuous monitoring. We therefore built a 3D open-field farm model from UAV images and developed a crop segmentation model using a supervised machine learning algorithm. We compared the performance of models trained on different data features, such as color and geographic coordinates, and with two supervised learning algorithms: SVM (Support Vector Machine) and KNN (K-Nearest Neighbors). The best model was trained on two-dimensional data, the ExGR (Excess Green minus Excess Red) index and the z-coordinate value, using the KNN algorithm; its accuracy, precision, recall, and F1 score were 97.85%, 96.51%, 88.54%, and 92.35%, respectively. We also compared our model with similar previous work: our approach showed slightly better accuracy and detected actual crops better than the previous approach, although it also classified some actual non-crop points (e.g., weeds) as crops.
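To make the reported pipeline concrete, the following is a minimal sketch (not the authors' implementation) of the best-performing configuration described above: each point of the colored point cloud is reduced to a two-dimensional feature vector of ExGR and the z-coordinate, and a KNN crop/non-crop classifier is trained and scored with accuracy, precision, recall, and F1. The scikit-learn API, the file names, the neighbor count, and the train/test split are assumptions for illustration; only the feature choice and the evaluation metrics come from the abstract.

# Minimal sketch, not the authors' code: KNN crop segmentation on ExGR + z features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# points: N x 6 array of (x, y, z, R, G, B); labels: N array of 0 (non-crop) / 1 (crop).
# Both file names are hypothetical placeholders.
points = np.load("farm_point_cloud.npy")
labels = np.load("crop_labels.npy")

# Chromatic coordinates and the ExGR index (ExG - ExR),
# with ExG = 2g - r - b and ExR = 1.4r - g.
rgb = points[:, 3:6].astype(float)
r, g, b = (rgb / rgb.sum(axis=1, keepdims=True)).T
exgr = (2 * g - r - b) - (1.4 * r - g)

# Two-dimensional feature vector per point: ExGR and elevation (z).
X = np.column_stack([exgr, points[:, 2]])

# Assumed split and neighbor count for illustration only.
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
y_pred = knn.predict(X_test)

print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
print("recall   :", recall_score(y_test, y_pred))
print("F1 score :", f1_score(y_test, y_pred))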
Keywords
Open-field farm; point cloud model; supervised learning; precision agriculture; 3D image segmentation