Comparison of Convolutional Neural Network (CNN) Models for Lettuce Leaf Width and Length Prediction

  • Ji Su Song (Department of Bio-Industrial Machinery Engineering, Pusan National University)
  • Dong Suk Kim (Department of Bio-Industrial Machinery Engineering, Pusan National University)
  • Hyo Sung Kim (Department of Bio-Industrial Machinery Engineering, Pusan National University)
  • Eun Ji Jung (Department of Bio-Industrial Machinery Engineering, Pusan National University)
  • Hyun Jung Hwang (Department of Bio-Industrial Machinery Engineering, Pusan National University)
  • Jaesung Park (Department of Bio-Industrial Machinery Engineering, Pusan National University)
  • Received : 2023.08.31
  • Accepted : 2023.10.26
  • Published : 2023.10.31

Abstract

Determining the size or area of a plant's leaves is an important factor in predicting plant growth and improving the productivity of indoor farms. In this study, we developed a convolutional neural network (CNN)-based model that accurately predicts the length and width of lettuce leaves from photographs of the leaves. A callback function was applied to overcome data limitations and overfitting, and K-fold cross-validation was used to improve the generalization ability of the model. In addition, the ImageDataGenerator function was used to increase the diversity of the training data through data augmentation. To compare model performance, we evaluated pre-trained models such as VGG16, ResNet152, and NASNetMobile. NASNetMobile showed the highest performance, with an R² of 0.9436 and an RMSE of 0.5659 for width prediction, and an R² of 0.9537 and an RMSE of 0.8713 for length prediction. The optimized model adopted the NASNetMobile architecture, the RMSprop optimizer, the MSE loss function, and the ELU activation function. Training took an average of 73 minutes per epoch, and the model processed a single lettuce leaf photograph in an average of 0.29 seconds. The CNN-based model developed in this study to predict leaf length and width in indoor farms is expected to enable rapid and accurate assessment of plant growth status from simple image capture. It is also expected to contribute to increasing farm productivity and resource efficiency by supporting timely agricultural measures such as real-time nutrient solution adjustment.
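For illustration, the following is a minimal sketch of the training pipeline described above, assuming a TensorFlow/Keras implementation. The input size, regression-head width, learning rate, batch size, number of folds, and augmentation ranges are illustrative assumptions rather than values reported in the paper, and the two-output head (width and length predicted together) is likewise an assumption, as the study may have trained separate models per dimension.

```python
import tensorflow as tf
from sklearn.model_selection import KFold

def build_model(input_shape=(224, 224, 3)):
    """NASNetMobile backbone with an ELU regression head, compiled with RMSprop and MSE."""
    backbone = tf.keras.applications.NASNetMobile(
        include_top=False, weights="imagenet",
        input_shape=input_shape, pooling="avg")
    x = tf.keras.layers.Dense(128, activation="elu")(backbone.output)
    outputs = tf.keras.layers.Dense(2, activation="linear")(x)  # [width, length]
    model = tf.keras.Model(backbone.input, outputs)
    model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=1e-4),
                  loss="mse", metrics=["mae"])
    return model

# Data augmentation to increase training-data diversity (ranges are assumptions).
datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rotation_range=20, width_shift_range=0.1, height_shift_range=0.1,
    zoom_range=0.1, horizontal_flip=True)

# Early-stopping callback to limit overfitting on the small dataset.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True)

def train_kfold(images, targets, n_splits=5, epochs=100, batch_size=16):
    """K-fold cross-validation; `images` are assumed already preprocessed to the
    range NASNetMobile expects. Returns the per-fold validation MSE."""
    fold_scores = []
    for train_idx, val_idx in KFold(n_splits, shuffle=True, random_state=42).split(images):
        model = build_model()
        model.fit(datagen.flow(images[train_idx], targets[train_idx], batch_size=batch_size),
                  validation_data=(images[val_idx], targets[val_idx]),
                  epochs=epochs, callbacks=[early_stop], verbose=0)
        fold_scores.append(model.evaluate(images[val_idx], targets[val_idx], verbose=0)[0])
    return fold_scores
```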

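The R² and RMSE figures quoted in the abstract can be computed from a fitted model with scikit-learn as sketched below; `test_images` and `test_targets` are hypothetical held-out arrays of leaf photographs and measured [width, length] values, not data from the paper.

```python
import numpy as np
from sklearn.metrics import r2_score, mean_squared_error

preds = model.predict(test_images)  # predicted [width, length] per leaf photograph
r2_width = r2_score(test_targets[:, 0], preds[:, 0])
rmse_width = np.sqrt(mean_squared_error(test_targets[:, 0], preds[:, 0]))
r2_length = r2_score(test_targets[:, 1], preds[:, 1])
rmse_length = np.sqrt(mean_squared_error(test_targets[:, 1], preds[:, 1]))
```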

Acknowledgement

This study was supported by the National Research Foundation of Korea (NRF) Creative Challenge Research Foundation Support Program funded by the government (Ministry of Education) in 2021 (No. 2021R1I1A1A01058373) and by a Pusan National University new faculty research settlement grant for the 2022 academic year.
