Strawberry Pests and Diseases Detection Technique Optimized for Symptoms Using Deep Learning Algorithm

  • Choi, Young-Woo (Department of Smartfarm, Graduate School of Gyeongsang National University) ;
  • Kim, Na-eun (Department of Bio-Systems Engineering, Graduate School of Gyeongsang National University) ;
  • Paudel, Bhola (Department of Bio-Systems Engineering, Graduate School of Gyeongsang National University) ;
  • Kim, Hyeon-tae (Department of Bio-Industrial Machinery Engineering, Gyeongsang National University (Institute of Smart Farm))
  • Received : 2022.04.19
  • Accepted : 2022.07.27
  • Published : 2022.07.31

Abstract

This study aimed to develop a service model that uses a deep learning algorithm to detect diseases and pests in strawberries from image data. In addition, the detection performance of the deep learning model was further improved by proposing a segmented image dataset specialized for disease and pest symptoms. A CNN-based YOLO model was selected to improve on the slow training and inference speeds of existing R-CNN-based models. A general image dataset and the proposed segmented image dataset were prepared to train the pest and disease detection model. When the model was trained on the general dataset, the detection rate was 81.35% and the detection reliability was 73.35%. When trained on the segmented image dataset, the detection rate increased to 91.93% and the detection reliability to 83.41%. These results indicate that training on a segmented image dataset, rather than a general image dataset, can improve the performance of a deep learning detection model.
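The segmented dataset described above can be pictured as cropping each annotated symptom region out of the full plant image, so the model trains on patches dominated by the symptom rather than on whole-plant scenes. The following is a minimal, hypothetical sketch of that preprocessing step; the function names, the `(x, y, w, h)` box format, and the 2D-list image representation are illustrative assumptions, not the authors' implementation (real code would operate on image files with an imaging library).

```python
# Hypothetical sketch: build a symptom-segmented training set by cropping
# annotated symptom regions out of full images. An image is modeled here as
# a row-major 2D list of pixel values for self-containment.

def crop(image, box):
    """Crop an (x, y, w, h) region from a row-major 2D image."""
    x, y, w, h = box
    return [row[x:x + w] for row in image[y:y + h]]

def build_segmented_dataset(samples):
    """Turn (image, annotations) pairs into per-symptom labeled crops.

    Each annotation is (label, (x, y, w, h)); the output pairs every
    symptom crop with its disease/pest label.
    """
    dataset = []
    for image, annotations in samples:
        for label, box in annotations:
            dataset.append((label, crop(image, box)))
    return dataset

# Toy 6x6 "image" whose pixels record their own (row, col) coordinates,
# with one annotated 3-wide by 2-high symptom patch.
image = [[(r, c) for c in range(6)] for r in range(6)]
annotations = [("powdery_mildew", (1, 2, 3, 2))]  # hypothetical label
data = build_segmented_dataset([(image, annotations)])
label, patch = data[0]
```

In this sketch each crop inherits the label of its annotation, so a single plant image with several symptom boxes contributes several focused training samples.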

Acknowledgement

This work was supported by the Agri-Food Technology Convergence Creative Talent Development Program of the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture and Forestry, funded by the Ministry of Agriculture, Food and Rural Affairs (717001-7).
