Detection of Cropland in Reservoir Area by Using Supervised Classification of UAV Imagery Based on GLCM

  • Received : 2018.08.02
  • Accepted : 2018.12.05
  • Published : 2018.12.31

Abstract

The reservoir area is defined as the area enclosed by the planned flood level of a dam, or the land lying below that planned flood level. In this study, supervised classification based on RF (Random Forest), a representative machine learning technique, was performed to detect cropland in the reservoir area. To classify cropland in the reservoir area efficiently, the GLCM (Gray Level Co-occurrence Matrix), a representative technique for quantifying texture information, was used together with the NDWI (Normalized Difference Water Index) and the NDVI (Normalized Difference Vegetation Index) as additional features in the classification process. In particular, we analyzed the effect of the texture information according to the window size used to generate the GLCM, and proposed a methodology for detecting cropland in the reservoir area. The experimental results showed that cropland in the reservoir area could be detected efficiently from the UAV multispectral, NDVI, NDWI, and GLCM images. The window size of the GLCM was, in particular, an important parameter for improving classification accuracy.
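
For reference, the NDVI and NDWI layers used as additional classification features can be computed directly from the UAV multispectral bands. The sketch below is a minimal illustration of the standard index definitions, assuming the green, red, and near-infrared bands are available as NumPy arrays; the function names and the epsilon guard are illustrative and not taken from the paper.

    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        # NDVI = (NIR - Red) / (NIR + Red)
        return (nir - red) / (nir + red + eps)

    def ndwi(green, nir, eps=1e-9):
        # NDWI = (Green - NIR) / (Green + NIR)
        return (green - nir) / (green + nir + eps)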

Keywords

Fig. 1. Description of UAV and camera used for experiment
Fig. 2. Experimental area
Fig. 3. Workflow of RF algorithm
Fig. 4. Example of NDWI image by UAV
Fig. 5. Example of cropland and grass area
Fig. 6. GLCM result by study area
Fig. 7. Example of ground truth data
Fig. 8. Classification results according to window size of GLCM image
Fig. 9. Detection results of cropland by classification according to window size of GLCM
Fig. 10. 1st Detailed images of detection results by classification according to window size of GLCM
Fig. 11. 2nd Detailed images of detection results by classification according to window size of GLCM
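
As a rough guide to how the GLCM layers behind Fig. 6 and the window-size comparison of Fig. 8 can be produced, the following is one possible sliding-window implementation using scikit-image; it is not the authors' code. The quantization to 32 gray levels, the single distance and angle, and the homogeneity property are assumptions, while the window sizes (3, 5, 7, 15, 31, 63) mirror those compared in the tables below.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops  # greycomatrix/greycoprops in scikit-image < 0.19

    def glcm_texture(band, window=7, levels=32, prop="homogeneity"):
        """Per-pixel GLCM texture for one band and a given odd window size."""
        # Quantize the band to a small number of gray levels so each local GLCM stays compact.
        edges = np.linspace(band.min(), band.max(), levels)
        q = (np.digitize(band, edges) - 1).clip(0, levels - 1).astype(np.uint8)
        half = window // 2
        out = np.zeros(band.shape, dtype=np.float32)
        for r in range(half, band.shape[0] - half):
            for c in range(half, band.shape[1] - half):
                patch = q[r - half:r + half + 1, c - half:c + half + 1]
                glcm = graycomatrix(patch, distances=[1], angles=[0],
                                    levels=levels, symmetric=True, normed=True)
                out[r, c] = graycoprops(glcm, prop)[0, 0]
        return out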

Table 1. Specification of eBee and multiSPEC 4C
Table 2. Confusion matrix of classification results using multispectral image
Table 3. Confusion matrix of classification results using multispectral, NDVI and NDWI images
Table 4. Confusion matrix of classification results using multispectral, NDVI, NDWI and GLCM images (window size of GLCM: 3)
Table 5. Confusion matrix of classification results using multispectral, NDVI, NDWI and GLCM images (window size of GLCM: 5)
Table 6. Confusion matrix of classification results using multispectral, NDVI, NDWI and GLCM images (window size of GLCM: 7)
Table 7. Confusion matrix of classification results using multispectral, NDVI, NDWI and GLCM images (window size of GLCM: 15)
Table 8. Confusion matrix of classification results using multispectral, NDVI, NDWI and GLCM images (window size of GLCM: 31)
Table 9. Confusion matrix of classification results using multispectral, NDVI, NDWI and GLCM images (window size of GLCM: 63)
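
Tables 2-9 report confusion matrices for RF classifications of different feature stacks. A minimal sketch of this kind of evaluation is given below, assuming a stacked feature image (multispectral bands plus NDVI, NDWI, and GLCM texture) and a labeled ground-truth raster; the variable names, the random train/test split, and the number of trees are illustrative assumptions rather than the authors' settings.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score, confusion_matrix

    def classify_and_evaluate(stack, labels, train_fraction=0.5, n_trees=500):
        """Train an RF on labeled pixels of an (H, W, D) feature stack and report accuracy."""
        rows, cols = np.nonzero(labels)                    # labeled pixels only (0 = unlabeled)
        order = np.random.permutation(len(rows))
        split = int(len(order) * train_fraction)
        tr, te = order[:split], order[split:]
        X, y = stack[rows, cols], labels[rows, cols]
        rf = RandomForestClassifier(n_estimators=n_trees, n_jobs=-1, random_state=0)
        rf.fit(X[tr], y[tr])
        pred = rf.predict(X[te])
        print("Overall accuracy:", accuracy_score(y[te], pred))
        print(confusion_matrix(y[te], pred))
        # Classify every pixel to map cropland across the whole reservoir area.
        return rf.predict(stack.reshape(-1, stack.shape[-1])).reshape(labels.shape)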

