Towards Improved Performance on Plant Disease Recognition with Symptoms Specific Annotation

  • Dong, Jiuqing (Department of Electronics Engineering, Jeonbuk National University) ;
  • Fuentes, Alvaro (Department of Electronics Engineering, Jeonbuk National University) ;
  • Yoon, Sook (Department of Computer Engineering, Mokpo National University) ;
  • Kim, Taehyun (National Institute of Agricultural Sciences) ;
  • Park, Dong Sun (Department of Electronics Engineering, Jeonbuk National University)
  • Received : 2022.05.13
  • Accepted : 2022.06.02
  • Published : 2022.05.31

Abstract

Object detection models have become the tool of choice for plant disease detection in precision agriculture. Most existing research improves performance by refining network architectures and optimizing loss functions; however, the data-centric side of a project also deserves more investigation. In this paper, we propose a systematic strategy with three different annotation methods for plant disease detection: local, semi-global, and global labels. Experimental results on our paprika disease dataset show that single-class annotation with semi-global boxes may improve accuracy. In addition, we study the noise introduced during the labeling process. An ablation study shows that annotation noise of up to 10% is acceptable for maintaining good performance. Overall, this data-centric numerical analysis helps us understand the significance of annotation methods and gives practitioners a way to obtain higher performance and reduce annotation costs in plant disease detection tasks. Our work encourages researchers to pay more attention to label quality and the essential issues of labeling methods.
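
As a rough illustration of the two data-centric ideas in the abstract (annotation granularity and label noise), the sketch below shows how symptom-level (local) boxes might be merged into an enclosing semi-global box, and how bounding-box noise of roughly 10% might be simulated for an ablation. This is a minimal Python sketch under assumed Pascal VOC-style [x1, y1, x2, y2] coordinates; the function names and the jitter parameterization are hypothetical and are not taken from the paper.

```python
# Hypothetical illustration only -- not the paper's implementation.
# Assumes Pascal VOC-style boxes [x1, y1, x2, y2] in pixel coordinates.
import random
from typing import List

Box = List[float]  # [x1, y1, x2, y2]

def merge_to_semi_global(local_boxes: List[Box]) -> Box:
    """Merge several symptom-level (local) boxes into one enclosing
    semi-global box, e.g. covering a whole infected leaf region."""
    x1 = min(b[0] for b in local_boxes)
    y1 = min(b[1] for b in local_boxes)
    x2 = max(b[2] for b in local_boxes)
    y2 = max(b[3] for b in local_boxes)
    return [x1, y1, x2, y2]

def jitter_box(box: Box, noise_ratio: float = 0.10) -> Box:
    """Simulate annotation noise by shifting each corner by up to
    `noise_ratio` of the box width/height (here 10%, the level the
    abstract reports as still acceptable)."""
    w, h = box[2] - box[0], box[3] - box[1]
    return [
        box[0] + random.uniform(-noise_ratio, noise_ratio) * w,
        box[1] + random.uniform(-noise_ratio, noise_ratio) * h,
        box[2] + random.uniform(-noise_ratio, noise_ratio) * w,
        box[3] + random.uniform(-noise_ratio, noise_ratio) * h,
    ]

if __name__ == "__main__":
    symptom_boxes = [[10, 10, 40, 40], [35, 30, 80, 90]]  # two local symptom boxes
    print("semi-global:", merge_to_semi_global(symptom_boxes))
    print("noisy box:  ", jitter_box([10, 10, 40, 40]))
```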

Keywords

Acknowledgement

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (No. 2019R1A6A1A09031717); by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, and Forestry (IPET) and the Korea Smart Farm R&D Foundation (KosFarm) through the Smart Farm Innovation Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA), the Ministry of Science and ICT (MSIT), and the Rural Development Administration (RDA) (421005-04); and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (NRF-2021R1A2C1012174).
