• Title/Summary/Keyword: Image Crop

Estimation of two-dimensional position of soybean crop for developing weeding robot (제초로봇 개발을 위한 2차원 콩 작물 위치 자동검출)

  • SooHyun Cho;ChungYeol Lee;HeeJong Jeong;SeungWoo Kang;DaeHyun Lee
    • Journal of Drive and Control
    • /
    • v.20 no.2
    • /
    • pp.15-23
    • /
    • 2023
  • In this study, the two-dimensional locations of crops were detected for automatic weeding using deep learning. To construct a dataset for soybean detection, an image-capturing system was developed using a mono camera and a single-board computer, and the system was mounted on a weeding robot to collect soybean images. The dataset was constructed by extracting regions of interest (RoIs) from the raw images, and each sample was labeled as soybean or background for classification learning. The deep learning model consisted of four convolutional layers and was trained with a weakly supervised learning method that provides object localization using only image-level labels. Localization of the soybean area was visualized via a class activation map (CAM), and the two-dimensional position of the soybean was estimated by clustering the pixels associated with the soybean area and transforming the pixel coordinates to world coordinates. The estimates were evaluated against the actual positions, determined manually as pixel coordinates in the image; in world coordinates, MSE was 6.6 (X-axis) and 5.1 (Y-axis), and RMSE was 1.2 (X-axis) and 2.2 (Y-axis). From the results, we confirmed that the center position of the soybean area derived through deep learning is sufficiently accurate for use in automatic weeding systems.
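The CAM-to-position step described above can be sketched minimally as follows. This is an illustrative simplification, not the paper's code: it takes the centroid of all above-threshold CAM pixels (rather than clustering per plant) and assumes a calibrated 3×3 pixel-to-world homography is available.

```python
import numpy as np

def soybean_position_from_cam(cam, homography, threshold=0.5):
    """Estimate a 2-D world position from a class activation map (CAM).

    cam        : (H, W) array of activation values in [0, 1]
    homography : 3x3 pixel-to-world transform (assumed known from calibration)
    """
    ys, xs = np.nonzero(cam >= threshold)       # pixels attributed to soybean
    if xs.size == 0:
        return None                             # no soybean activation found
    u, v = xs.mean(), ys.mean()                 # centroid in pixel coordinates
    p = homography @ np.array([u, v, 1.0])      # homogeneous transform
    return p[:2] / p[2]                         # world (X, Y)

# Toy example with an identity transform (hypothetical values)
cam = np.zeros((4, 4))
cam[1:3, 1:3] = 1.0
print(soybean_position_from_cam(cam, np.eye(3)))  # centroid of the active block
```

A real system would replace the single centroid with per-cluster centroids (e.g., connected components) so that multiple plants in one frame yield multiple positions.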

Estimating vegetation index for outdoor free-range pig production using YOLO

  • Sang-Hyon Oh;Hee-Mun Park;Jin-Hyun Park
    • Journal of Animal Science and Technology
    • /
    • v.65 no.3
    • /
    • pp.638-651
    • /
    • 2023
  • The objective of this study was to quantitatively estimate the level of grazing-area damage in outdoor free-range pig production using an unmanned aerial vehicle (UAV) with an RGB image sensor. Ten corn-field images were captured by the UAV over approximately two weeks, during which gestating sows were allowed to graze freely on a corn field measuring 100 × 50 m. The images were corrected to a bird's-eye view and then divided into 32 segments, which were sequentially input to the YOLOv4 detector to detect the corn according to its condition. The 43 raw training images, selected randomly from the 320 segmented images, were flipped to create 86 images, and these were further augmented by rotating them in 5-degree increments, yielding 6,192 images. Applying three random color transformations to each of these produced a final dataset of 24,768 images. The occupancy rate of corn in the field was estimated efficiently using You Only Look Once (YOLO). Compared with the first day of observation (day 2), almost all the corn had disappeared by the ninth day. When grazing 20 sows in a 50 × 100 m cornfield (250 m2/sow), it appears that the animals should be rotated to other grazing areas after no more than about five days to protect the cover crop. In agricultural technology, most research using machine and deep learning concerns the detection of fruits and pests, and research on other application fields is needed. In addition, large-scale image data collected by field experts are required as training data for deep learning; when such data are insufficient, extensive data augmentation is required.
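The augmentation arithmetic stated above (43 → 86 → 6,192 → 24,768) can be checked with a small bookkeeping sketch. The factor of 72 comes from rotating in 5-degree increments (360/5), and the factor of 4 from keeping each image plus its three random color variants; these factorizations are inferred from the stated totals, not taken from the paper's code.

```python
def augmented_count(n_raw, flip_factor=2, rot_step_deg=5, color_variants=3):
    """Dataset sizes after flip, rotation, and color augmentation stages."""
    n_flipped = n_raw * flip_factor                 # original + horizontal flip
    n_rotated = n_flipped * (360 // rot_step_deg)   # 5-degree steps -> 72 poses
    n_color = n_rotated * (1 + color_variants)      # original + 3 color variants
    return n_flipped, n_rotated, n_color

print(augmented_count(43))  # -> (86, 6192, 24768)
```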

Use of Unmanned Aerial Vehicle for Multi-temporal Monitoring of Soybean Vegetation Fraction

  • Yun, Hee Sup;Park, Soo Hyun;Kim, Hak-Jin;Lee, Wonsuk Daniel;Lee, Kyung Do;Hong, Suk Young;Jung, Gun Ho
    • Journal of Biosystems Engineering
    • /
    • v.41 no.2
    • /
    • pp.126-137
    • /
    • 2016
  • Purpose: The overall objective of this study was to evaluate the vegetation fraction of soybeans grown under different cropping conditions, using an unmanned aerial vehicle (UAV) equipped with a red, green, and blue (RGB) camera. Methods: Test plots were prepared under different cropping treatments, i.e., soybean single-cropping with and without herbicide application, and soybean-barley cover cropping with and without herbicide application. The UAV flights were manually controlled using a remote flight controller on the ground with 2.4 GHz radio-frequency communication. For pre-processing, the acquired images were corrected and georeferenced using a fisheye-distortion removal function, and ground control points were collected using Google Maps. Tarpaulin panels of different colors were used to calibrate the multi-temporal images, converting the RGB digital-number values into RGB reflectance with a linear regression method. Excess Green (ExG) vegetation indices for the test plots were then compared using the M-statistic method to quantitatively evaluate the greenness of soybean fields under the different cropping systems. Results: The reflectance calibration showed high coefficients of determination, ranging from 0.8 to 0.9, indicating the feasibility of linear regression fitting for monitoring multi-temporal RGB images of soybean fields. As expected, the ExG vegetation indices changed with soybean growth stage, showing clear differences among the test plots in the early season of < 60 days after sowing (DAS). With the M-statistic method, the test plots under different treatments could be discriminated in the early season of < 41 DAS, showing values of M > 1.
Conclusion: Therefore, multi-temporal images obtained with a UAV and an RGB camera can be applied to quantify overall vegetation fraction and crop growth status, and this information can contribute to determining proper treatments for the vegetation fraction.
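The ExG index used above has a standard RGB-only formulation on normalized chromatic coordinates, ExG = 2g − r − b. A minimal sketch (assuming calibrated reflectance or DN values in a float array; not the study's own implementation):

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index ExG = 2g - r - b on chromatic coordinates.

    rgb : (H, W, 3) float array of calibrated RGB reflectance (or DN) values.
    """
    total = rgb.sum(axis=2, keepdims=True)
    total[total == 0] = 1.0                    # avoid division by zero
    r, g, b = np.moveaxis(rgb / total, 2, 0)   # normalized chromatic coordinates
    return 2.0 * g - r - b                     # high for green vegetation
```

A pure-green pixel gives ExG = 2, a gray (soil-like) pixel gives ExG = 0, which is why ExG separates canopy from background.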

Extraction of Agricultural Land Use and Crop Growth Information using KOMPSAT-3 Resolution Satellite Image (KOMPSAT-3급 위성영상을 이용한 농업 토지이용 및 작물 생육정보 추출)

  • Lee, Mi-Seon;Kim, Seong-Joon;Shin, Hyoung-Sub;Park, Jin-Ki;Park, Jong-Hwa
    • Korean Journal of Remote Sensing
    • /
    • v.25 no.5
    • /
    • pp.411-421
    • /
    • 2009
  • This study aims to develop semi-automatic extraction of agricultural land use and vegetation information from high-resolution satellite images. IKONOS-2 images (May 25, 2001; December 25, 2001; and October 23, 2003), QuickBird-2 images (May 1, 2006 and November 17, 2004), and a KOMPSAT-2 image (September 17, 2007), which resemble KOMPSAT-3 in spatial resolution and spectral characteristics, were used. Precise agricultural land-use classification was attempted using the ISODATA unsupervised classification technique, and the result was compared with on-screen digitized land use accompanied by field investigation. For the extraction of crop growth information, three crops (paddy rice, corn, and red pepper) were selected, and their spectral characteristics were collected during each growing period using a ground spectroradiometer. The vegetation indices RVI, NDVI, ARVI, and SAVI for the crops were evaluated. The evaluation process was implemented using the ERDAS IMAGINE Spatial Modeler Tool.
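The four indices evaluated above have standard band-ratio definitions from NIR, red, and blue reflectance. A sketch with common parameter choices (SAVI soil factor L = 0.5, ARVI gamma = 1.0); the exact parameters used in the study are not stated, so these are assumptions:

```python
def vegetation_indices(nir, red, blue, L=0.5, gamma=1.0):
    """Standard band-ratio vegetation indices, evaluated per pixel.

    nir, red, blue : reflectance values (floats or numpy arrays).
    """
    rvi = nir / red                                   # Ratio Vegetation Index
    ndvi = (nir - red) / (nir + red)                  # Normalized Difference VI
    rb = red - gamma * (blue - red)                   # atmosphere-resistant red
    arvi = (nir - rb) / (nir + rb)                    # Atmospherically Resistant VI
    savi = (1.0 + L) * (nir - red) / (nir + red + L)  # Soil-Adjusted VI
    return rvi, ndvi, arvi, savi
```

For a healthy-canopy pixel with NIR = 0.5, red = 0.1, blue = 0.1, this yields RVI = 5 and NDVI ≈ 0.67, consistent with the high-vegetation regime these indices are designed to highlight.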

Use of Unmanned Aerial Vehicle Imagery and Deep Learning UNet to Classification Upland Crop in Small Scale Agricultural Land (무인항공기와 딥러닝(UNet)을 이용한 소규모 농지의 밭작물 분류)

  • Choi, Seokkeun;Lee, Soungki;Kang, Yeonbin;Choi, Do Yeon;Choi, Juweon
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.38 no.6
    • /
    • pp.671-679
    • /
    • 2020
  • To increase the food self-sufficiency rate, monitoring and analysis of crop conditions in cultivated areas is important, but the existing methods, in which agricultural personnel perform measurement and sampling analysis in the field, are time-consuming, labor-intensive, and therefore inefficient. To overcome this limitation, an efficient method is needed for monitoring crop information across the many existing small-scale plots. In this study, RGB images acquired from an unmanned aerial vehicle, together with vegetation indices calculated from those images, were used as deep learning input data to classify complex upland crops in small farmland. Classification using RGB images alone showed an overall accuracy of 80.23% and a Kappa coefficient of 0.65. Adding three vegetation indices (ExG, ExR, VDVI) to the RGB input gave an overall accuracy of 89.51% and a Kappa coefficient of 0.80, and adding six vegetation indices (ExG, ExR, VDVI, RGRI, NRGDI, ExGR) gave 90.35% and a Kappa coefficient of 0.82. Thus the accuracy of the inputs with added vegetation indices was relatively high compared with using RGB images only, and the added indices brought a significant improvement in accuracy when classifying complex crops.
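Feeding indices alongside RGB amounts to stacking extra channels onto the network input. A sketch of the three-index variant, using common RGB-only formulations of ExG, ExR, and VDVI; the exact formulas and preprocessing in the study are assumptions here:

```python
import numpy as np

def stack_input_channels(rgb):
    """Append ExG, ExR, and VDVI channels to an RGB image as a UNet input.

    rgb : (H, W, 3) array; returns an (H, W, 6) float stack.
    """
    r, g, b = np.moveaxis(rgb.astype(float), 2, 0)
    s = r + g + b
    s[s == 0] = 1.0                                   # avoid division by zero
    rn, gn, bn = r / s, g / s, b / s                  # chromatic coordinates
    exg = 2 * gn - rn - bn                            # Excess Green
    exr = 1.4 * rn - gn                               # Excess Red
    denom = np.where(2 * g + r + b == 0, 1.0, 2 * g + r + b)
    vdvi = (2 * g - r - b) / denom                    # Visible-band Difference VI
    return np.dstack([rgb, exg, exr, vdvi])           # (H, W, 6) network input
```

The six-index variant would append RGRI, NRGDI, and ExGR channels in the same way.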

Development of Score-based Vegetation Index Composite Algorithm for Crop Monitoring (농작물 모니터링을 위한 점수기반 식생지수 합성기법의 개발)

  • Kim, Sun-Hwa;Eun, Jeong
    • Korean Journal of Remote Sensing
    • /
    • v.38 no.6_1
    • /
    • pp.1343-1356
    • /
    • 2022
  • Clouds and shadows are the most problematic factors when monitoring crops with optical satellite images. To reduce their effect, a composite algorithm that selects the maximum Normalized Difference Vegetation Index (NDVI) over a certain period has been used. This Maximum NDVI Composite (MNC) method reduces the influence of clouds, but because only the maximum NDVI value in the period is kept, decreases in NDVI cannot be shown promptly. As a way to preserve the spectral information of crops as much as possible while minimizing the influence of clouds, a Score-Based Composite (SBC) algorithm was proposed, which selects the most suitable pixels by defining various environmental factors and assigning scores to them during compositing. In this study, Sentinel-2A/B Level 2A reflectance images and the cloud, shadow, Aerosol Optical Thickness (AOT), acquisition date, and sensor zenith angle provided as additional information were used for the SBC algorithm. Applying the SBC algorithm with 15-day and monthly periods to Dangjin rice fields and Taebaek highland cabbage fields in 2021, the 15-day composites captured detailed changes in NDVI faster than the monthly composites, except during the rainy season affected by clouds. In certain images, spatially heterogeneous patches appear in the composited NDVI image due to partial date-by-date differences, which is attributed to inaccuracy in the cloud and shadow information used. In the future, we plan to improve the accuracy of the input information and perform a quantitative comparison with the MNC-based composite algorithm.
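The SBC idea above can be sketched as a per-pixel argmax over dates of a penalty score built from the auxiliary layers. The scoring weights and the linear form below are hypothetical placeholders; the paper defines its own factors and scores (including acquisition date, which is omitted here):

```python
import numpy as np

def score_based_composite(ndvi, cloud, shadow, aot, view_zenith,
                          weights=(4.0, 3.0, 1.0, 1.0)):
    """Score-Based Composite (SBC) sketch over a compositing period.

    ndvi, cloud, shadow, aot, view_zenith : (T, H, W) stacks of the
    candidate dates; cloud/shadow are 0/1 masks. Higher score = better pixel.
    """
    w_cloud, w_shadow, w_aot, w_angle = weights
    score = (-w_cloud * cloud            # penalize cloudy observations
             - w_shadow * shadow         # penalize shadowed observations
             - w_aot * aot               # hazier observations score lower
             - w_angle * np.abs(view_zenith))  # prefer near-nadir views
    best = np.argmax(score, axis=0)      # date index of best pixel, per pixel
    return np.take_along_axis(ndvi, best[None], axis=0)[0]
```

Unlike MNC, the selected pixel can carry a lower NDVI than other dates in the window, so genuine declines (e.g., harvest) show up in the composite immediately.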

A Study on the AI Analysis of Crop Area Data in Aquaponics (아쿠아포닉스 환경에서의 작물 면적 데이터 AI 분석 연구)

  • Eun-Young Choi;Hyoun-Sup Lee;Joo Hyoung Cha;Lim-Gun Lee
    • The Journal of the Convergence on Culture Technology
    • /
    • v.9 no.3
    • /
    • pp.861-866
    • /
    • 2023
  • Unlike conventional smart farms that require chemical fertilizers and large spaces, aquaponics, which uses the symbiotic relationship between aquatic organisms and crops to grow crops even under abnormal conditions such as environmental pollution and climate change, is being actively researched. Different crops require different environments and nutrients for growth, so the ratio of aquatic organisms must be configured to optimize crop growth. This study proposes a method to measure the degree of growth based on area and volume using image-processing techniques in an aquaponics environment. Tilapia, carp, and catfish, aquatic organisms that produce organic matter through excrement, were tested together with lettuce crops in an aquaponics environment. Through 2D and 3D image analysis of the lettuce and real-time data analysis, the degree of growth was evaluated using the area and volume information of the lettuce. The experimental results proved that cultivation can be managed using the area and volume information of lettuce. It is expected that production-prediction services can be provided to farmers using aquatic-life and growth information, and that this work will be a starting point for solving problems in the changing agricultural environment.

Estimation of Rice Canopy Height Using Terrestrial Laser Scanner (레이저 스캐너를 이용한 벼 군락 초장 추정)

  • Dongwon Kwon;Wan-Gyu Sang;Sungyul Chang;Woo-jin Im;Hyeok-jin Bak;Ji-hyeon Lee;Jung-Il Cho
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.25 no.4
    • /
    • pp.387-397
    • /
    • 2023
  • Plant height is a growth parameter that provides visible insight into a plant's growth status and is highly correlated with yield, so it is widely used in crop breeding and cultivation research. Growth characteristics such as plant height have generally been investigated directly by humans using a ruler, but with the recent development of sensing and image-analysis technology, research is under way to digitize growth-measurement techniques so that crop growth can be investigated efficiently. In this study, the canopy height of rice grown at various nitrogen fertilization levels was measured using a laser scanner capable of precise measurement over a wide range, and compared with the actual plant height. Comparing the point-cloud data collected with the laser scanner against the actual plant height confirmed that the estimated height based on the average height of the top 1% of points showed the highest correlation with the actual plant height (R2 = 0.93, RMSE = 2.73). Based on this, a linear regression equation was derived and used to convert the canopy height measured by the laser scanner to actual plant height. The rice growth curves drawn by combining actual and estimated plant heights collected under various nitrogen fertilization conditions and growth periods show that laser-scanner-based canopy-height measurement can be effectively used to assess rice plant height and growth. In the future, 3D images derived from laser scanners are expected to be applicable to crop biomass estimation, plant-shape analysis, etc., and can serve as a technology for digitizing conventional crop growth assessment methods.
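The top-1% estimator and the regression conversion described above can be sketched as follows. The slope/intercept names are hypothetical placeholders for the fitted coefficients, which the abstract does not report:

```python
import numpy as np

def canopy_height_top1(z):
    """Canopy height estimate: mean of the top 1% of point-cloud heights.

    z : iterable of per-point heights (e.g., metres above ground).
    """
    z = np.sort(np.asarray(z, dtype=float))
    k = max(1, int(round(0.01 * z.size)))   # at least one point
    return z[-k:].mean()

def to_plant_height(h_canopy, slope, intercept):
    """Convert scanner canopy height to actual plant height via the fitted
    linear regression (slope and intercept are placeholders here)."""
    return slope * h_canopy + intercept
```

Averaging the top percentile rather than taking the single maximum makes the estimate robust to stray points above the canopy (insects, noise returns), which is presumably why it correlated best with the measured heights.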

Towards Real Time Detection of Rice Weed in Uncontrolled Crop Conditions (통제되지 않는 농작물 조건에서 쌀 잡초의 실시간 검출에 관한 연구)

  • Umraiz, Muhammad;Kim, Sang-cheol
    • Journal of Internet of Things and Convergence
    • /
    • v.6 no.1
    • /
    • pp.83-95
    • /
    • 2020
  • Precisely detecting weeds in a practical crop-field environment is a dense and complex task, and previous approaches fall short in the speed at which image frames can be processed accurately. Although much attention has been given to classifying plant diseases, the crop-weed detection problem has remained comparatively unaddressed. Previous approaches report fast algorithms, but their inference times are not even close to real time, making them impractical for use in uncontrolled conditions. We therefore propose a detection model for the complex task of rice weed detection. Experimental results show that the inference time of our approach is reduced by a significant margin in the weed detection task, making it practically deployable in real conditions. The samples were collected at two different growth stages of rice and annotated manually.

Application of Highland Kimchi Cabbage Status Map for Growth Monitoring based on Unmanned Aerial Vehicle

  • Na, Sang-Il;Park, Chan-Won;Lee, Kyung-Do
    • Korean Journal of Soil Science and Fertilizer
    • /
    • v.49 no.5
    • /
    • pp.469-479
    • /
    • 2016
  • Kimchi cabbage is one of the most important vegetables in Korea and a target crop for market stabilization as well. In particular, Kimchi cabbages in highland areas are very sensitive to fluctuations in supply and demand, and yield variability due to growth conditions drives the market fluctuations of the Kimchi cabbage price. This study was carried out to understand the distribution of highland Kimchi cabbage growth status in Anbandeok. The Anbandeok area in Gangneung, Gangwon-do, Korea is one of the main producing districts of highland Kimchi cabbage. A status map for each growth factor was obtained from unmanned aerial vehicle (UAV) imagery and field survey data. The six status maps comprise a UAV RGB image map, a normalized difference vegetation index (NDVI) distribution/anomaly map, a crop distribution map, a planting/harvest distribution map, a growth parameter map, and a growth disorder map. As a result, the highland Kimchi cabbage status maps from May 31 to Sep. 6, 2016 were presented to show spatial variability in the field. The benefits of the status maps can be summarized as follows: crop growth monitoring; a reference for field observations and surveys; relative comparison of growth conditions at field scale; evaluation of growth in comparison with an average year; change detection of annual crops or planting areas; monitoring of abandoned fields; prediction of the harvest season; etc.