http://dx.doi.org/10.7848/ksgpc.2020.38.6.671

Use of Unmanned Aerial Vehicle Imagery and Deep Learning UNet to Classify Upland Crops in Small-Scale Agricultural Land

Choi, Seokkeun (Dept. of Civil Engineering, Chungbuk National University)
Lee, Soungki (Terrapix)
Kang, Yeonbin (Dept. of Civil Engineering, Chungbuk National University)
Choi, Do Yeon (Terrapix)
Choi, Juweon (Dept. of Civil Engineering, Chungbuk National University)
Publication Information
Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, Vol. 38, No. 6, 2020, pp. 671-679
Abstract
To increase the food self-sufficiency rate, monitoring and analysis of crop conditions in cultivated areas are important. Existing methods, in which agricultural personnel perform measurement and sampling analysis in the field, are time-consuming and labor-intensive, and therefore inefficient. To overcome this limitation, an efficient method is needed for monitoring crop information across the many small agricultural parcels that exist. In this study, RGB images acquired from an unmanned aerial vehicle, together with vegetation indices calculated from those images, were used as deep learning input data to classify mixed upland crops in small farmland. Classification using RGB images alone showed an overall accuracy of 80.23% and a Kappa coefficient of 0.65. When the RGB images were supplemented with three vegetation indices (ExG, ExR, VDVI), overall accuracy was 89.51% and the Kappa coefficient was 0.80; with six vegetation indices (ExG, ExR, VDVI, RGRI, NGRDI, ExGR), overall accuracy was 90.35% and the Kappa coefficient was 0.82. Thus, the data to which vegetation indices were added achieved higher accuracy than the method using RGB images alone, showing a significant improvement in classifying complex crops.
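The six vegetation indices named above are computed from the visible bands alone. A minimal per-pixel sketch, assuming the widely cited published formulas (ExG from Woebbecke et al. 1995, ExR/ExGR from Meyer et al., VDVI from Xiaoqin et al. 2015); the exact variants used in the paper may differ:

```python
import numpy as np

def vegetation_indices(rgb):
    """Stack the six RGB-based vegetation indices as extra channels.

    rgb: float array of shape (H, W, 3), values scaled to [0, 1].
    Returns an (H, W, 6) array ordered ExG, ExR, VDVI, RGRI, NGRDI, ExGR.
    """
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-8                                  # guard against division by zero
    total = R + G + B + eps
    r, g, b = R / total, G / total, B / total   # chromatic coordinates

    exg = 2 * g - r - b                         # Excess Green
    exr = 1.4 * r - g                           # Excess Red
    vdvi = (2 * G - R - B) / (2 * G + R + B + eps)  # Visible-band Difference VI
    rgri = R / (G + eps)                        # Red-Green Ratio Index
    ngrdi = (G - R) / (G + R + eps)             # Normalized Green-Red Difference
    exgr = exg - exr                            # Excess Green minus Excess Red
    return np.stack([exg, exr, vdvi, rgri, ngrdi, exgr], axis=-1)
```

Concatenating this output with the original RGB channels yields the 6- or 9-channel network input described in the abstract (3 or 6 indices added).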
Keywords
Unmanned Aerial Vehicle; Vegetation Index; Deep Learning; Upland Crops; Classification;
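Both evaluation metrics reported in the abstract, overall accuracy and the Kappa coefficient, follow from a class confusion matrix. A minimal sketch using the standard definition of Cohen's kappa (the helper name is illustrative, not from the paper):

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                 # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
    kappa = (po - pe) / (1 - pe)                          # undefined when pe == 1
    return po, kappa
```

For example, a two-class matrix [[45, 5], [10, 40]] gives an overall accuracy of 0.85 and a kappa of 0.70, illustrating how kappa discounts agreement expected by chance.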
Citations & Related Records
Times Cited By KSCI : 2  (Citation Analysis)
연도 인용수 순위
1 Gallego, J., Kravchenko, A.N., Kussul, N.N., Skakun, S.V., Shelestov, A.Y., and Grypych, Y.A. (2012), Efficiency assessment of different approaches to crop classification based on satellite and ground observations, Journal of Automation and Information Sciences, Vol. 44, No. 5, pp. 67-80.   DOI
2 Gamon, J.A. and Surfus, J.S. (1999), Assessing leaf pigment content and activity with a reflectometer. The New Phytologist, Vol. 143, No. 1, pp. 105-117.   DOI
3 Ghosh, A., Ehrlich, M., Shah, S., Davis, L.S., and Chellappa, R. (2018), Stacked U-Nets for Ground Material Segmentation in Remote Sensing Imagery, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 18-22 June, Salt Lake City, United States, pp. 257-261.
4 Gu, Z., Cheng, J., Fu, H., Zhou, K., Hao, H., Zhao, Y., Zhang, T., Gao, S., and Liu, J. (2019), Ce-net: Context encoder network for 2d medical image segmentation. IEEE transactions on medical imaging, Vol. 38, NO. 10, pp. 2281-2292.   DOI
5 Huang, H., Lan, Y., Yang, A., Zhang, Y., Wen, S., and Deng, J. (2020), Deep learning versus Object-based Image Analysis (OBIA) in weed mapping of UAV imagery. International Journal of Remote Sensing, Vol. 41, No. 9, pp. 3446-3479.   DOI
6 Hunt, E.R., Cavigelli, M., Daughtry, C.S., Mcmurtrey, J.E., and Walthall, C.L. (2005), Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precision Agriculture, Vol. 6, No. 4, pp. 359-378.   DOI
7 Hutt, C., Koppe, W., Miao, Y., and Bareth, G. (2016), Best accuracy land use/land cover (LULC) classification to derive crop types using multitemporal, multisensor, and multi-polarization SAR satellite images, Remote sensing, Vol. 8, No. 8, 684p.   DOI
8 Jaeger, P.F., Kohl, S.A., Bickelhaupt, S., Isensee, F., Kuder, T. A., Schlemmer, H.P., and Maier-Hein, K.H. (2020), Retina U-Net: Embarrassingly simple exploitation of segmentation supervision for medical object detection. Machine Learning for Health Workshop, PMLR, 11 December, pp. 171-183.
9 MAFRA. (2016), https://www.mafra.go.kr/mafra/293/subview.do?enc=Zm5jdDF8QEB8JTJGYmJzJTJGbWFmcmElMkY2OCUyRjMxMzA0NyUyRmFydGNsVmlldy5kbyUzRg%3D%3D (last date accessed: 27 August 2020).
10 Lottes, P., Khanna, R., Pfeifer, J., Siegwart, R., and Stachniss, C. (2017), UAV-based crop and weed classification for smart farming, 2017 IEEE International Conference on Robotics and Automation (ICRA), 29 May-3 June, Singapore, pp. 3024-3031.
11 Meyer, G.E., Neto, J.C., Jones, D.D., and Hindman, T.W. (2004), Intensified fuzzy clusters for classifying plant, soil, and residue regions of interest from color images. Computers and electronics in agriculture, Vol. 42, No. 3, pp. 161-180.   DOI
12 NCIS. (2020), http://www.nics.go.kr/oneStopIndex/index.do (last date accessed: 27 August 2020).
13 Sishodia, R.P., Ray, R.L., and Singh, S.K. (2020), Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sensing, Vol. 12, No. 19, 3136p.   DOI
14 Neto, J.C. (2004), A combined statistical-soft computing approach for classification and mapping weed species in minimum-tillage systems, ProQuest, Michigan, U.S.
15 Ronneberger, O., Fischer, P., and Brox, T. (2015), U-net: Convolutional networks for biomedical image segmentation, International Conference on Medical image computing and computer-assisted intervention, 5-9 October, Munich, Germany, pp. 234-241.
16 RuBwurm, M. and Korner, M. (2017), Temporal vegetation modelling using long short-term memory networks for crop identification from medium-resolution multi-spectral satellite images, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 11-19.
17 Woebbecke, D.M., Meyer, G.E., Von Bargen, K., and Mortensen, D.A. (1995), Color indices for weed identification under various soil, residue, and lighting conditions. Transactions of the ASAE, Vol. 38, No. 1, pp. 259-269.   DOI
18 Chew, R., Rineer, J., Beach, R., O'Neil, M., Ujeneza, N., Lapidus, D., Miano, T., Hegarty-Craver, M., Polly, J., and Temple, D. S. (2020), Deep Neural Networks and Transfer Learning for Food Crop Identification in UAV Images. Drones, Vol. 4, No. 1, 7p.   DOI
19 Xiaoqin, W., Miaomiao, W., Shaoqiang, W., and Yundong, W. (2015), Extraction of vegetation information from visible unmanned aerial vehicle images. Transactions of the Chinese Society of Agricultural Engineering, Vol. 31, No. 5.
20 Barrero, O. and Perdomo, S.A. (2018), RGB and multispectral UAV image fusion for Gramineae weed detection in rice fields, Precision Agriculture, Vol. 19, No. 5, pp. 809-822.   DOI
21 Choi, S.K., Lee, S.K., Kang, Y.B., Seong, S.K., Choi, D.Y., and Kim, G.H. (2020), Applicability of Image Classification Using Deep Learning in Small Area: Case of Agricultural Lands Using UAV Image. Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, Vol. 38, No. 1, pp. 23-33.   DOI