References
- Barrero, O. and Perdomo, S.A. (2018), RGB and multispectral UAV image fusion for Gramineae weed detection in rice fields. Precision Agriculture, Vol. 19, No. 5, pp. 809-822. https://doi.org/10.1007/s11119-017-9558-x
- Chew, R., Rineer, J., Beach, R., O'Neil, M., Ujeneza, N., Lapidus, D., Miano, T., Hegarty-Craver, M., Polly, J., and Temple, D. S. (2020), Deep Neural Networks and Transfer Learning for Food Crop Identification in UAV Images. Drones, Vol. 4, No. 1, 7p. https://doi.org/10.3390/drones4010007
- Choi, S.K., Lee, S.K., Kang, Y.B., Seong, S.K., Choi, D.Y., and Kim, G.H. (2020), Applicability of Image Classification Using Deep Learning in Small Area: Case of Agricultural Lands Using UAV Image. Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, Vol. 38, No. 1, pp. 23-33. https://doi.org/10.7848/KSGPC.2020.38.1.23
- Gallego, J., Kravchenko, A.N., Kussul, N.N., Skakun, S.V., Shelestov, A.Y., and Grypych, Y.A. (2012), Efficiency assessment of different approaches to crop classification based on satellite and ground observations. Journal of Automation and Information Sciences, Vol. 44, No. 5, pp. 67-80. https://doi.org/10.1615/JAutomatInfScien.v44.i5.70
- Gamon, J.A. and Surfus, J.S. (1999), Assessing leaf pigment content and activity with a reflectometer. The New Phytologist, Vol. 143, No. 1, pp. 105-117. https://doi.org/10.1046/j.1469-8137.1999.00424.x
- Ghosh, A., Ehrlich, M., Shah, S., Davis, L.S., and Chellappa, R. (2018), Stacked U-Nets for Ground Material Segmentation in Remote Sensing Imagery, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 18-22 June, Salt Lake City, United States, pp. 257-261.
- Gu, Z., Cheng, J., Fu, H., Zhou, K., Hao, H., Zhao, Y., Zhang, T., Gao, S., and Liu, J. (2019), CE-Net: Context encoder network for 2D medical image segmentation. IEEE Transactions on Medical Imaging, Vol. 38, No. 10, pp. 2281-2292. https://doi.org/10.1109/TMI.2019.2903562
- Huang, H., Lan, Y., Yang, A., Zhang, Y., Wen, S., and Deng, J. (2020), Deep learning versus Object-based Image Analysis (OBIA) in weed mapping of UAV imagery. International Journal of Remote Sensing, Vol. 41, No. 9, pp. 3446-3479. https://doi.org/10.1080/01431161.2019.1706112
- Hunt, E.R., Cavigelli, M., Daughtry, C.S., Mcmurtrey, J.E., and Walthall, C.L. (2005), Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precision Agriculture, Vol. 6, No. 4, pp. 359-378. https://doi.org/10.1007/s11119-005-2324-5
- Hütt, C., Koppe, W., Miao, Y., and Bareth, G. (2016), Best accuracy land use/land cover (LULC) classification to derive crop types using multitemporal, multisensor, and multi-polarization SAR satellite images. Remote Sensing, Vol. 8, No. 8, 684p. https://doi.org/10.3390/rs8080684
- Jaeger, P.F., Kohl, S.A., Bickelhaupt, S., Isensee, F., Kuder, T. A., Schlemmer, H.P., and Maier-Hein, K.H. (2020), Retina U-Net: Embarrassingly simple exploitation of segmentation supervision for medical object detection. Machine Learning for Health Workshop, PMLR, 11 December, pp. 171-183.
- Lottes, P., Khanna, R., Pfeifer, J., Siegwart, R., and Stachniss, C. (2017), UAV-based crop and weed classification for smart farming, 2017 IEEE International Conference on Robotics and Automation (ICRA), 29 May-3 June, Singapore, pp. 3024-3031.
- MAFRA. (2016), https://www.mafra.go.kr/mafra/293/subview.do?enc=Zm5jdDF8QEB8JTJGYmJzJTJGbWFmcmElMkY2OCUyRjMxMzA0NyUyRmFydGNsVmlldy5kbyUzRg%3D%3D (last date accessed: 27 August 2020).
- Meyer, G.E., Neto, J.C., Jones, D.D., and Hindman, T.W. (2004), Intensified fuzzy clusters for classifying plant, soil, and residue regions of interest from color images. Computers and Electronics in Agriculture, Vol. 42, No. 3, pp. 161-180. https://doi.org/10.1016/j.compag.2003.08.002
- NCIS. (2020), http://www.nics.go.kr/oneStopIndex/index.do (last date accessed: 27 August 2020).
- Neto, J.C. (2004), A combined statistical-soft computing approach for classification and mapping weed species in minimum-tillage systems, ProQuest, Michigan, U.S.
- Ronneberger, O., Fischer, P., and Brox, T. (2015), U-net: Convolutional networks for biomedical image segmentation, International Conference on Medical image computing and computer-assisted intervention, 5-9 October, Munich, Germany, pp. 234-241.
- Rußwurm, M. and Körner, M. (2017), Temporal vegetation modelling using long short-term memory networks for crop identification from medium-resolution multi-spectral satellite images. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 11-19.
- Sishodia, R.P., Ray, R.L., and Singh, S.K. (2020), Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sensing, Vol. 12, No. 19, 3136p. https://doi.org/10.3390/rs12193136
- Woebbecke, D.M., Meyer, G.E., Von Bargen, K., and Mortensen, D.A. (1995), Color indices for weed identification under various soil, residue, and lighting conditions. Transactions of the ASAE, Vol. 38, No. 1, pp. 259-269. https://doi.org/10.13031/2013.27838
- Wang, X., Wang, M., Wang, S., and Wu, Y. (2015), Extraction of vegetation information from visible unmanned aerial vehicle images. Transactions of the Chinese Society of Agricultural Engineering, Vol. 31, No. 5.