
Key-point detection of fruit for automatic harvesting of oriental melon


  • Seung-Woo Kang (Department of Biosystem Machinery Engineering, Chungnam National University) ;
  • Jung-Hoon Yun (SolarPos Inc.) ;
  • Yong-Sik Jeong (Okcheon-gun Agricultural Technology Center) ;
  • Kyung-Chul Kim (Department of Agricultural Engineering, National Institute of Agricultural Science) ;
  • Dae-Hyun Lee (Department of Biosystem Machinery Engineering, Chungnam National University)
  • Received : 2024.05.10
  • Accepted : 2024.05.27
  • Published : 2024.06.01

Abstract

In this study, we propose a key-point detection method for robotic harvesting of oriental melon. The proposed method detects the detachment point and the major structural parts of the fruit. Four key-points (harvesting point, calyx, center, bottom) were defined by analogy with tomato, whose fruit structure is similar to that of oriental melon. The estimated key-points were evaluated using pixel error and the PDK (percentage of detected key-points) index. The average pixel error was 18.26 ± 16.62 pixels for the x coordinate and 17.74 ± 18.07 pixels for the y coordinate; given the resolution of the raw images, errors of this magnitude are not expected to have a serious impact on harvesting. The average PDK score was 89.5% at PDK@0.5, showing that key-points specific to oriental melon can be estimated. We believe the proposed method can contribute to the application of robotic harvesting systems.
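The abstract reports two evaluation measures, per-axis pixel error and the percentage of detected key-points (PDK@0.5). As a minimal illustration only, the Python sketch below computes both measures under the assumption that PDK follows the usual PCK-style definition, in which a key-point counts as detected when its Euclidean distance to the ground truth falls within a fraction alpha of a normalization length; the normalization length, function names, and toy coordinates are illustrative assumptions, not taken from the paper.

import numpy as np

def pixel_error(pred, gt):
    # Mean and standard deviation of the absolute per-axis pixel error
    # between predicted and ground-truth key-points.
    # pred, gt: arrays of shape (N, 2) holding (x, y) pixel coordinates.
    diff = np.abs(np.asarray(pred, float) - np.asarray(gt, float))
    return (diff[:, 0].mean(), diff[:, 0].std(),
            diff[:, 1].mean(), diff[:, 1].std())

def pdk(pred, gt, norm_length, alpha=0.5):
    # Percentage of detected key-points (PDK@alpha): a key-point counts
    # as detected when its Euclidean distance to the ground truth is at
    # most alpha * norm_length. norm_length is a per-key-point
    # normalization length (assumed here; the paper's exact choice is
    # not stated in the abstract).
    pred = np.asarray(pred, float)
    gt = np.asarray(gt, float)
    dist = np.linalg.norm(pred - gt, axis=1)
    return 100.0 * np.mean(dist <= alpha * np.asarray(norm_length, float))

# Toy usage with the four key-points named in the abstract
# (harvesting point, calyx, center, bottom); coordinates are made up.
pred = [[105, 210], [300, 118], [200, 205], [198, 310]]
gt   = [[100, 200], [310, 120], [205, 200], [200, 300]]
norm = [150, 150, 150, 150]   # assumed normalization length per key-point

print(pixel_error(pred, gt))        # per-axis mean/std pixel error
print(pdk(pred, gt, norm, 0.5))     # PDK@0.5 in percent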

Keywords

Acknowledgement

This work was supported by the Smart Farm Multi-Ministry Package Innovation Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA), the Ministry of Science and ICT (MSIT), and the Rural Development Administration (RDA), through the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture and Forestry (IPET) and the Korea Smart Farm R&D Foundation (Project No. 421031-04).
