Estimation of the Dimensions of Horticultural Products and the Mean Plant Height of Plug Seedlings Using Three-Dimensional Images

  • Jang, Dong Hwa (Dept. of Agricultural Machinery Engineering, Graduate School, Jeonbuk National University) ;
  • Kim, Hyeon Tae (Dept. of Biosystems Engineering, College of Agriculture & Life Sciences, Gyeongsang National University (Institute of Agriculture & Life Science)) ;
  • Kim, Yong Hyeon (Dept. of Bioindustrial Machinery Engineering, College of Agriculture & Life Sciences, Jeonbuk National University)
  • Received : 2019.08.22
  • Accepted : 2019.10.01
  • Published : 2019.10.30

Abstract

This study was conducted to estimate the dimensions of horticultural products and the mean plant height of plug seedlings using three-dimensional (3D) images. Two types of camera, a time-of-flight (ToF) camera and a stereo-vision camera, were used to acquire 3D images of horticultural products and plug seedlings. The errors in the dimensions of horticultural products and in the mean plant height of plug seedlings calculated from ToF images were lower than those obtained from stereo-vision images. A new indicator was defined for determining the mean plant height of plug seedlings. Except for watermelon with the stem attached, the errors in the circumference and height of the horticultural products were 0.0-3.0% and 0.0-4.7%, respectively. The error in the mean plant height of plug seedlings was 0.0-5.5%. These results revealed that 3D images can be used to accurately estimate the dimensions of horticultural products and the plant height of plug seedlings. Moreover, the proposed method is potentially applicable to segmenting objects and removing outliers from point cloud data based on 3D images of horticultural crops.
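As an illustration of how a dimension estimate can be derived from such a point cloud, the sketch below computes the height and the largest horizontal girth of a segmented product. It is a minimal NumPy example under the assumptions that the cloud is already segmented from the background, the z-axis points upward, and each cross-section is roughly circular; the function name `estimate_dimensions`, the slice thickness, and the girth approximation are illustrative choices, not the procedure used in the paper.

```python
import numpy as np

def estimate_dimensions(points, slice_thickness=0.005):
    """Estimate the height and the maximum circumference of a roughly
    round horticultural product from its segmented 3D point cloud.

    points          : (N, 3) array in metres, z pointing up, background removed
    slice_thickness : thickness of each horizontal slice in metres
    """
    z = points[:, 2]
    height = z.max() - z.min()          # top-to-bottom extent of the product

    # Scan thin horizontal slices and keep the girth of the widest one,
    # assuming each cross-section is approximately circular.
    circumference = 0.0
    for z0 in np.arange(z.min(), z.max(), slice_thickness):
        sl = points[(z >= z0) & (z < z0 + slice_thickness)]
        if len(sl) < 10:                # skip slices with too few points
            continue
        xy = sl[:, :2]
        mean_radius = np.linalg.norm(xy - xy.mean(axis=0), axis=1).mean()
        circumference = max(circumference, 2.0 * np.pi * mean_radius)

    return height, circumference
```

For a point cloud sampled from the surface of a 0.1 m-radius sphere, for example, the estimates come out close to 0.2 m and 0.63 m, which makes the sketch easy to sanity-check against a tape measurement.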

This study was conducted to determine the dimensions of horticultural products and the mean plant height of plug seedlings using 3D images. A ToF camera and a stereo-vision camera were used to acquire the 3D images. Watermelon, apple, pear, kabocha squash, and orange were used as the horticultural products, and watermelon, tomato, and pepper plug seedlings were used as the plant materials for 3D image acquisition. A corrected plant height was proposed as the indicator for determining the mean plant height of plug seedlings, in place of the conventional measurement standard. The errors in the dimensions of horticultural products and in the mean plant height of plug seedlings were smaller with ToF images than with stereo-vision images. Excluding the products with the stem attached, the errors in circumference and height obtained from ToF images were 0.0-3.0% and 0.0-4.7%, respectively, and the error in the mean plant height of plug seedlings was 0.0-5.5%. These results confirmed the feasibility of estimating the dimensions of horticultural products and the plant height of plug seedlings from 3D images. Moreover, the method attempted in this study can be applied to effective separation of objects from the background and removal of outliers in 3D images.
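The corrected plant height is not defined in the abstract, so the sketch below only illustrates one plausible way to turn a seedling-tray point cloud into a single mean height: heights are measured from the substrate surface, coarse outliers are rejected, and the upper portion of the canopy is averaged. The function name `mean_plant_height`, the 3-sigma filter, and the `top_fraction` parameter are assumptions made for this example, not the indicator defined in the paper.

```python
import numpy as np

def mean_plant_height(points, tray_z, top_fraction=0.1):
    """Illustrative tray-level mean plant height from a plug-seedling point cloud.

    points       : (N, 3) canopy point cloud in metres, z pointing up
    tray_z       : z-coordinate of the substrate surface
    top_fraction : fraction of the highest canopy points averaged for the height
    """
    heights = points[:, 2] - tray_z
    heights = heights[heights > 0.0]            # keep only points above the substrate

    # Crude outlier rejection: drop heights more than 3 sigma from the mean,
    # e.g. stray returns from the tray edge or specular highlights.
    mu, sigma = heights.mean(), heights.std()
    heights = heights[np.abs(heights - mu) < 3.0 * sigma]

    # Average the top slice of the canopy as a stand-in for the mean plant height.
    n_top = max(1, int(round(len(heights) * top_fraction)))
    return np.sort(heights)[-n_top:].mean()
```

Averaging only the upper canopy points is one way to keep lower leaves from dominating the estimate; whichever definition of corrected plant height is adopted, the same point-cloud pipeline applies.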
