http://dx.doi.org/10.7780/kjrs.2022.38.6.1.37

Assessment of Lodged Damage Rate of Soybean Using Support Vector Classifier Model Combined with Drone Based RGB Vegetation Indices  

Lee, Hyun-jung (Department of Rural and Agricultural Engineering, Chungbuk National University)
Go, Seung-hwan (Department of Rural and Agricultural Engineering, Chungbuk National University)
Park, Jong-hwa (Department of Rural and Agricultural Engineering, Chungbuk National University)
Publication Information
Korean Journal of Remote Sensing, v.38, no.6_1, 2022, pp. 1489-1503
Abstract
Drone and sensor technologies enable the digitalization of crop growth information and are accelerating the development of precision agriculture. These technologies can be used to assess crop damage when a natural disaster occurs and can make crop insurance loss assessment, which is currently conducted through field surveys, more scientific. This study aimed to estimate the lodged damage rate of soybean from vegetation indices extracted from drone-based RGB images. Support Vector Classifier (SVC) models were built by adding the vegetation indices to the Crop Surface Model (CSM)-based lodged damage rate. Classification based on the Visible Atmospherically Resistant Index (VARI) and the Green Red Vegetation Index (GRVI) showed the highest accuracy scores, 0.709 and 0.705, respectively. These results confirm that drone-based RGB images are a useful tool for estimating the lodged damage rate. The findings can also be applied to satellite imagery such as Sentinel-2 and RapidEye when damage from natural disasters occurs.
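To make the index-to-classifier workflow concrete, the minimal Python sketch below computes VARI and GRVI from drone RGB band values and trains a Support Vector Classifier on them. It is an illustration only, not the authors' implementation: the random placeholder pixel values, the binary lodged/non-lodged labels standing in for the CSM-derived damage rate, and the RBF kernel choice are all assumptions.

# Illustrative sketch only (not the authors' code): VARI/GRVI features + SVC.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def vari(r, g, b):
    # Visible Atmospherically Resistant Index (Gitelson et al., 2002)
    return (g - r) / (g + r - b + 1e-9)

def grvi(r, g):
    # Green Red Vegetation Index (Tucker, 1979)
    return (g - r) / (g + r + 1e-9)

# Placeholder data: per-pixel RGB values scaled to [0, 1] and hypothetical
# binary labels (1 = lodged) that would, in the study, come from thresholding
# the CSM-based lodged damage rate.
rng = np.random.default_rng(0)
r, g, b = rng.random(1000), rng.random(1000), rng.random(1000)
labels = rng.integers(0, 2, 1000)

features = np.column_stack([vari(r, g, b), grvi(r, g)])
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf")  # kernel and hyperparameters are assumptions
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

In practice the features and labels would be derived from orthomosaic pixels (or plot-level means) and the CSM-based damage rate described in the paper, rather than random values.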
Keywords
Drone; Support vector classifier; Vegetation index; Lodging; Soybean