Estimation of fresh weight for Chinese cabbage using the Kinect sensor

  • Lee, Sukin (Department of Plant Science, Seoul National University)
  • Kim, Kwang Soo (Department of Plant Science, Seoul National University)
  • Received : 2018.05.02
  • Accepted : 2018.06.21
  • Published : 2018.06.30

Abstract

Development and validation of crop models often require biomass measurements for the crop of interest. Because destructive sampling of a given crop is usually used, considerable effort is needed to obtain a reasonable amount of biomass data. The Kinect sensor, which combines an image sensor with a depth sensor, can be used to estimate crop biomass without destructive sampling, and this approach could provide more data sets for model development and validation. The objective of this study was to examine the applicability of the Kinect sensor to estimating the fresh weight of Chinese cabbage. The fresh weights of five Chinese cabbage heads were measured and compared with estimates obtained using the Kinect sensor. The estimates were derived by scanning each head to create a point cloud, removing noise, and building a three-dimensional model with a set of free software tools. The 3D model created using the Kinect sensor explained about 98.7% of the variation in the fresh weight of Chinese cabbage. Furthermore, the correlation coefficient between estimates and measurements was highly significant, suggesting that the Kinect sensor is applicable to fresh weight estimation for Chinese cabbage. Our results demonstrate that a depth sensor allows a non-destructive sampling approach, which makes it possible to collect observations of crop fresh weight over time. This would support development and validation of a crop model with a large number of reliable data sets, which merits further studies on applying various depth sensors to crop dry weight measurements.
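As a rough illustration of the statistical relationship reported above, the sketch below (not the authors' code) fits an ordinary least-squares line between a 3D-model-derived proxy and measured fresh weights and reports the coefficient of determination, the correlation coefficient, and its p-value; the variable names and numbers are hypothetical placeholders.

```python
# Minimal sketch: regressing measured fresh weight on a 3D-model-derived
# proxy and reporting R^2 and the correlation, as described in the abstract.
# The arrays below are hypothetical placeholders, not data from the study.
import numpy as np
from scipy import stats

proxy = np.array([12.3, 18.7, 25.1, 31.4, 40.2])          # model-derived values per head
fresh_weight = np.array([0.85, 1.32, 1.78, 2.21, 2.90])   # measured fresh weights (kg)

slope, intercept, r, p, stderr = stats.linregress(proxy, fresh_weight)
print(f"R^2 = {r**2:.3f}, Pearson r = {r:.3f}, p-value = {p:.4f}")
print(f"fresh_weight ~= {slope:.4f} * proxy + {intercept:.4f}")
```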

Fresh weight data used for the development and validation of crop models have conventionally been obtained through destructive sampling. To address the drawbacks of destructive sampling, a method for estimating fresh weight was developed using the Kinect sensor, a low-cost 3D sensor, together with freely available software. In particular, three-dimensional images were generated for Chinese cabbage, for which relatively few crop models have been developed, and the reliability of the volume and fresh weight estimates derived from them was analyzed. Heads of Chinese cabbage of different sizes were scanned using the Kinect sensor and the Kinect Fusion Explorer program included in the Software Development Kit provided free of charge by Microsoft. To generate a 3D image of each cabbage, the background and unnecessary objects were removed manually with Meshlab, a 3D graphics editing program. In addition, to estimate fresh weight from the incomplete 3D models, the amount of plastic filament required to print each cabbage was estimated with Makerbot Desktop, a 3D printer software. When volume was estimated with Blender, a 3D model editing program, there were considerable errors ranging from 17.6% to 2160.6% relative to the actual volume. In contrast, the filament requirement explained 98.7% of the variation in the actual fresh weight of the cabbages, and their correlation was significant at the 5% level. These results confirmed that fresh weight can be estimated conveniently even without an explicit volume calculation step. Although it was confirmed that the fresh weight of Chinese cabbage can be estimated using the Kinect sensor, the sensor has lower resolution than conventional high-end 3D sensors and is difficult to use in daylight. Nevertheless, time-series validation data for Chinese cabbage growth models can be obtained easily and quickly with the Kinect sensor, which should contribute to reducing model uncertainty. Therefore, follow-up studies should examine the feasibility of measuring crop fresh weight with lower-priced 3D sensors under outdoor and daytime conditions and develop these techniques further for crop model development and improvement.
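As a companion to the volume step described above, the following is a minimal sketch, assuming a cleaned, watertight triangle mesh with consistently outward-oriented faces exported from the scanning and editing steps; it computes the enclosed volume as a sum of signed tetrahedron volumes, the same quantity a mesh editor such as Blender reports. The function name and the cube example are illustrative only.

```python
# Minimal sketch: enclosed volume of a watertight triangle mesh via the
# divergence theorem (sum of signed tetrahedra formed with the origin).
import numpy as np

def mesh_volume(vertices: np.ndarray, faces: np.ndarray) -> float:
    """vertices: (N, 3) float array; faces: (M, 3) int array of triangle indices."""
    v0 = vertices[faces[:, 0]]
    v1 = vertices[faces[:, 1]]
    v2 = vertices[faces[:, 2]]
    # Signed volume of the tetrahedron (origin, v0, v1, v2) for every triangle.
    signed = np.einsum("ij,ij->i", v0, np.cross(v1, v2)) / 6.0
    return abs(signed.sum())

# Sanity check with a unit cube (12 outward-oriented triangles): volume ~ 1.0.
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                  [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], dtype=float)
tris = np.array([[0, 2, 1], [0, 3, 2], [4, 5, 6], [4, 6, 7],
                 [0, 1, 5], [0, 5, 4], [1, 2, 6], [1, 6, 5],
                 [2, 3, 7], [2, 7, 6], [3, 0, 4], [3, 4, 7]])
print(mesh_volume(verts, tris))  # prints 1.0
```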

