• Title/Summary/Keyword: Plant Village Dataset

An Efficient Disease Inspection Model for Untrained Crops Using VGG16 (VGG16을 활용한 미학습 농작물의 효율적인 질병 진단 모델)

  • Jeong, Seok Bong; Yoon, Hyoup-Sang
    • Journal of the Korea Society for Simulation, v.29 no.4, pp.1-7, 2020
  • Early detection and classification of crop diseases play a significant role in helping farmers reduce disease spread and increase agricultural productivity. Recently, many researchers have used deep learning techniques such as convolutional neural network (CNN) classifiers for crop disease inspection on datasets of crop leaf images (e.g., the PlantVillage dataset). These studies report over 90% classification accuracy for crop diseases, but they can detect only the diseases they were trained on. This paper proposes an efficient disease inspection CNN model for new crops not included in the pre-trained model. First, we present a benchmark crop disease classifier (CDC) for the crops in the PlantVillage dataset using VGG16. Then we build a modified crop disease classifier (mCDC) to inspect diseases of untrained crops. The performance evaluation results show that the proposed model outperforms the benchmark classifier.
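For context, a minimal PyTorch sketch of the kind of VGG16 transfer-learning classifier this abstract describes; the class count, frozen layers, and optimizer settings are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch: VGG16 transfer learning for PlantVillage-style leaf images.
# Class count and training hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 38  # PlantVillage covers 38 crop-disease classes; adjust as needed

model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)

# Freeze the convolutional feature extractor and retrain only the classifier head.
for param in model.features.parameters():
    param.requires_grad = False
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
criterion = nn.CrossEntropyLoss()
```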

Unsupervised Transfer Learning for Plant Anomaly Recognition

  • Xu, Mingle; Yoon, Sook; Lee, Jaesu; Park, Dong Sun
    • Smart Media Journal, v.11 no.4, pp.30-37, 2022
  • Disease threatens plant growth, and recognizing the type of disease is essential to applying a remedy. In recent years, deep learning has brought significant improvement to this task; however, a large volume of labeled images is required to obtain decent performance, and annotated images are difficult and expensive to obtain in the agricultural field. Therefore, designing an efficient and effective strategy that works with few labeled images is one of the challenges in this area. Transfer learning, which carries knowledge from a source domain to a target domain, has been borrowed to address this issue and has shown comparable results. However, current transfer learning strategies can be regarded as supervised methods, since they assume that many labeled images are available in the source domain. In contrast, unsupervised transfer learning uses only unlabeled images in the source domain, which is more convenient because collecting images is much easier than annotating them. In this paper, we leverage unsupervised transfer learning to perform plant disease recognition and achieve better performance than supervised transfer learning in many cases. In addition, a vision transformer, which has a larger model capacity than a convolutional network, is used to obtain a better pre-trained feature space. With vision transformer-based unsupervised transfer learning, we achieve better results than current works on two datasets. In particular, we obtain 97.3% accuracy with only 30 training images per class on the Plant Village dataset. We hope that our work encourages the community to pay attention to vision transformer-based unsupervised transfer learning in the agricultural field when few labeled images are available.
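A minimal sketch of the idea of fine-tuning a self-supervised (unsupervised) pre-trained vision transformer with only a few labeled images per class; the DINO ViT-S/16 checkpoint, linear head, and learning rate below are assumptions for illustration, not the authors' exact setup.

```python
# Sketch: attach a linear head to a self-supervised ViT backbone and fine-tune it
# on a small labeled set. Checkpoint choice and hyperparameters are assumptions.
import torch
import torch.nn as nn

backbone = torch.hub.load('facebookresearch/dino:main', 'dino_vits16')  # self-supervised ViT-S/16
head = nn.Linear(backbone.embed_dim, 38)  # 38 PlantVillage classes (assumption)

optimizer = torch.optim.AdamW(
    list(backbone.parameters()) + list(head.parameters()), lr=1e-5
)
criterion = nn.CrossEntropyLoss()

def forward(images):
    features = backbone(images)  # [B, embed_dim] CLS-token features
    return head(features)        # class logits
```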

Plants Disease Phenotyping using Quinary Patterns as Texture Descriptor

  • Ahmad, Wakeel; Shah, S.M. Adnan; Irtaza, Aun
    • KSII Transactions on Internet and Information Systems (TIIS), v.14 no.8, pp.3312-3327, 2020
  • Plant diseases are a significant yield and quality constraint for farmers around the world due to their severe impact on agricultural productivity. Such losses can have a substantial impact on the economy, reducing farmers' income and raising prices for consumers. They may also result in severe food shortages, hunger, and starvation, especially in less-developed countries where access to disease prevention methods is limited. This research investigates Directional Local Quinary Patterns (DLQP) as a feature descriptor for plant leaf disease detection, with a Support Vector Machine (SVM) as the classifier. This is the first time DLQP has been used as a feature descriptor for disease detection in horticulture. DLQP captures directional edge information around the reference pixel by computing its grey-level difference with neighboring pixels and quantizing it into a quinary value (-2, -1, 0, 1, 2) along the 0°, 45°, 90°, and 135° directions of a selected window of the plant leaf image. To assess the robustness of DLQP as a texture descriptor, we used the research-oriented Plant Village dataset: tomato (3,900 leaf images, 6 disease classes), potato (1,526 leaf images), and apple (2,600 leaf images), the latter two comprising 3 disease classes each. Accuracies of 95.6%, 96.2%, and 97.8% were achieved for these crops, respectively, which are higher than classification on the same dataset using other standard feature descriptors such as Local Binary Patterns (LBP) and Local Ternary Patterns (LTP). Further, the effectiveness of the proposed method is demonstrated by comparison with existing algorithms for plant disease phenotyping.
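A simplified Python illustration of quinary quantization of directional grey-level differences followed by an SVM, in the spirit of the descriptor described above; the thresholds, histogram pooling, and window handling are assumptions and not the paper's exact DLQP construction.

```python
# Simplified quinary-pattern-style features: quantize directional grey-level
# differences into {-2,-1,0,1,2}, histogram them, and classify with an SVM.
# Thresholds and pooling are illustrative; this is not the exact DLQP of the paper.
import numpy as np
from sklearn.svm import SVC

def quinary_codes(img, t1=5, t2=15):
    """Quinary code of the difference to one neighbor per direction (0, 45, 90, 135 deg)."""
    img = img.astype(np.int32)
    offsets = [(0, 1), (-1, 1), (-1, 0), (-1, -1)]  # neighbor offsets per direction
    codes = []
    for dy, dx in offsets:
        d = np.roll(np.roll(img, dy, axis=0), dx, axis=1) - img
        q = np.zeros_like(d)
        q[d > t1] = 1
        q[d > t2] = 2
        q[d < -t1] = -1
        q[d < -t2] = -2
        codes.append(q)
    return codes

def quinary_histogram_features(img):
    """Concatenate per-direction histograms of the quinary codes (normalized)."""
    hists = [np.bincount((q + 2).ravel(), minlength=5) for q in quinary_codes(img)]
    feat = np.concatenate(hists).astype(np.float64)
    return feat / (feat.sum() + 1e-9)

# X: list of grayscale leaf images, y: disease labels
# clf = SVC(kernel='rbf').fit([quinary_histogram_features(x) for x in X], y)
```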

Novel Category Discovery in Plant Species and Disease Identification through Knowledge Distillation

  • Jiuqing Dong; Alvaro Fuentes; Mun Haeng Lee; Taehyun Kim; Sook Yoon; Dong Sun Park
    • Smart Media Journal, v.13 no.7, pp.36-44, 2024
  • Identifying plant species and diseases is crucial for maintaining biodiversity and achieving optimal crop yields, making it a topic of significant practical importance. Recent studies have extended plant disease recognition from traditional closed-set scenarios to open-set environments, where the goal is to reject samples that do not belong to known categories. However, in open-world tasks, it is essential not only to flag unknown samples as "unknown" but also to classify them further. This task assumes that images and labels of known categories are available and that samples of unknown categories can be accessed; the model classifies unknown samples by leveraging prior knowledge of the known categories. To the best of our knowledge, there is no existing research on this topic in plant-related recognition tasks. To address this gap, this paper utilizes knowledge distillation to model the category-space relationships between known and unknown categories. Specifically, we exploit similarities between different species or diseases: using a model fine-tuned on known categories, we generate pseudo-labels for unknown categories. Additionally, we enhance the baseline method's performance by using a larger pre-trained model, dino-v2. We evaluate the effectiveness of our method on the large plant specimen dataset Herbarium 19 and the disease dataset Plant Village. Notably, our method outperforms the baseline by 1% to 20% in accuracy for novel category classification. We believe this study will contribute to the community.
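A minimal sketch of the pseudo-labeling step for novel categories: extract features for unlabeled images with a pre-trained backbone (dino-v2, per the abstract) and derive pseudo-labels from them, here via k-means clustering; the cluster count and the use of k-means are assumptions, not the paper's exact procedure.

```python
# Sketch: pseudo-labels for unknown categories from a pre-trained backbone.
# k-means and the cluster count are illustrative assumptions.
import torch
from sklearn.cluster import KMeans

backbone = torch.hub.load('facebookresearch/dinov2', 'dinov2_vits14')  # dino-v2 ViT-S/14
backbone.eval()

@torch.no_grad()
def extract_features(loader):
    feats = [backbone(images) for images, _ in loader]  # [B, embed_dim] per batch
    return torch.cat(feats).cpu().numpy()

# unlabeled_loader yields images from the unknown categories
# pseudo_labels = KMeans(n_clusters=20).fit_predict(extract_features(unlabeled_loader))
# The pseudo-labels can then supervise (distill into) a student classifier for novel classes.
```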

Tomato Crop Disease Classification Using an Ensemble Approach Based on a Deep Neural Network (심층 신경망 기반의 앙상블 방식을 이용한 토마토 작물의 질병 식별)

  • Kim, Min-Ki
    • Journal of Korea Multimedia Society, v.23 no.10, pp.1250-1257, 2020
  • The early detection of diseases is important in agriculture because diseases are a major threat to farmers' crop yields. The shape and color of a plant leaf change differently depending on the disease, so a disease can be detected and estimated by inspecting the visual features of the leaf. This study presents a vision-based leaf classification method for detecting diseases of the tomato crop. A ResNet-50 model was used to extract visual features from the leaf and classify tomato crop diseases, since it showed higher accuracy than ResNet models of other depths. We propose a new ensemble approach using several DCNN classifiers that have the same structure but have been trained over different ranges of the DCNN layers. The experiments achieved an accuracy of 97.19% on the PlantVillage dataset, validating that the proposed method effectively classifies tomato crop diseases.
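A brief sketch of the ensemble idea: several ResNet-50 classifiers that share one architecture but were trained differently, combined by averaging their softmax outputs; the member count, head, and averaging rule are assumptions for illustration.

```python
# Sketch: average the softmax outputs of several ResNet-50 classifiers that share
# the same architecture but were trained with different layer ranges unfrozen.
import torch
import torch.nn.functional as F
from torchvision import models

def make_resnet50(num_classes=10):
    m = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    m.fc = torch.nn.Linear(m.fc.in_features, num_classes)
    return m

ensemble = [make_resnet50() for _ in range(3)]  # assume 3 members, trained separately

@torch.no_grad()
def ensemble_predict(images):
    probs = torch.stack([F.softmax(m(images), dim=1) for m in ensemble])
    return probs.mean(dim=0).argmax(dim=1)  # averaged class prediction
```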

Performance Comparison of Base CNN Models in Transfer Learning for Crop Diseases Classification (농작물 질병분류를 위한 전이학습에 사용되는 기초 합성곱신경망 모델간 성능 비교)

  • Yoon, Hyoup-Sang; Jeong, Seok-Bong
    • Journal of Korean Society of Industrial and Systems Engineering, v.44 no.3, pp.33-38, 2021
  • Recently, transfer learning with a base convolutional neural network (CNN) model has gained wide acceptance for the early detection and classification of crop diseases, increasing agricultural productivity while reducing disease spread. Transfer-learning-based classifiers generally achieve over 90% classification accuracy for crop diseases on datasets of crop leaf images (e.g., the PlantVillage dataset), but they can classify only the diseases they were trained on. This paper provides an evaluation scheme for selecting an effective base CNN model for crop disease transfer learning, with regard to accuracy on trained target crops as well as on untrained target crops. First, we present transfer learning models, called the CDC (crop disease classification) architecture, built on widely used base (pre-trained) CNN models. We then evaluate the performance of seven base CNN models on four untrained crops. The performance evaluation shows that DenseNet201 is one of the best base CNN models.
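To make the comparison concrete, a small sketch of swapping different base CNN models into one common transfer-learning pipeline; the backbone list, head, and class count are illustrative assumptions rather than the paper's CDC architecture.

```python
# Sketch: build the same transfer-learning classifier on top of different base CNNs
# so they can be compared under one protocol. Backbones and head are assumptions.
import torch.nn as nn
from torchvision import models

def build_classifier(base_name, num_classes=38):
    if base_name == 'densenet201':
        base = models.densenet201(weights=models.DenseNet201_Weights.IMAGENET1K_V1)
        base.classifier = nn.Linear(base.classifier.in_features, num_classes)
    elif base_name == 'resnet50':
        base = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
        base.fc = nn.Linear(base.fc.in_features, num_classes)
    elif base_name == 'vgg16':
        base = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
        base.classifier[6] = nn.Linear(base.classifier[6].in_features, num_classes)
    else:
        raise ValueError(f'unknown base model: {base_name}')
    return base

# for name in ['vgg16', 'resnet50', 'densenet201']:
#     model = build_classifier(name)  # train and evaluate each under the same protocol
```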

Tomato Crop Diseases Classification Models Using Deep CNN-based Architectures (심층 CNN 기반 구조를 이용한 토마토 작물 병해충 분류 모델)

  • Kim, Sam-Keun; Ahn, Jae-Geun
    • Journal of the Korea Academia-Industrial cooperation Society, v.22 no.5, pp.7-14, 2021
  • Tomato crops are highly affected by tomato diseases, and if a disease is not prevented, it can cause severe losses for the agricultural economy. Therefore, there is a need for a system that quickly and accurately diagnoses various tomato diseases. In this paper, we propose a system that classifies nine diseases as well as healthy tomato plants by applying various deep learning-based CNN models pre-trained on the ImageNet dataset. The tomato leaf image dataset obtained from PlantVillage is provided as input to ResNet, Xception, and DenseNet, which are deep learning-based CNN architectures. The proposed models were constructed by adding a top-level classifier to the base CNN model and were trained with a 5-fold cross-validation strategy. All three proposed models were trained in two stages: transfer learning (which freezes the layers of the base CNN model and trains only the top-level classifier) and fine-tuning (which sets the learning rate to a very small value and continues training after unfreezing the base CNN layers). SGD, RMSprop, and Adam were applied as optimization algorithms. The experimental results show that the DenseNet CNN model trained with the RMSprop algorithm produced the best results, with 98.63% accuracy.
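A minimal sketch of the two-stage schedule described above (freeze the backbone and train the new head, then unfreeze and continue with a much smaller learning rate); the DenseNet variant, class count, and learning rates are illustrative assumptions.

```python
# Sketch: two-stage transfer learning then fine-tuning, with RMSprop as in the abstract.
# DenseNet variant, class count, and learning rates are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

model = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
model.classifier = nn.Linear(model.classifier.in_features, 10)  # 9 diseases + healthy

# Stage 1: transfer learning - freeze the backbone, train only the new classifier head.
for p in model.features.parameters():
    p.requires_grad = False
opt_stage1 = torch.optim.RMSprop(model.classifier.parameters(), lr=1e-3)
# ... train for a few epochs ...

# Stage 2: fine-tuning - unfreeze everything and continue with a very small learning rate.
for p in model.parameters():
    p.requires_grad = True
opt_stage2 = torch.optim.RMSprop(model.parameters(), lr=1e-5)
```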

A Study on the Deep Learning-Based Tomato Disease Diagnosis Service (딥러닝기반 토마토 병해 진단 서비스 연구)

  • Jo, YuJin; Shin, ChangSun
    • Smart Media Journal, v.11 no.5, pp.48-55, 2022
  • Tomato crops are easily exposed to disease, which can spread in a short period of time, so delayed countermeasures directly affect production and sales and can cause serious damage. Therefore, there is a need for a service that enables early prevention by simply and accurately diagnosing tomato diseases in the field. In this paper, we construct a system that applies deep learning-based models pre-trained on ImageNet (transfer learning) to classify and serve nine tomato disease classes plus the normal case. We use MobileNet and ResNet, deep learning-based CNN architectures in which MobileNet builds a lighter convolutional neural network, on the leaf image set from the Plant Village dataset labeled as diseased or normal tomato. By training the two proposed models, it is possible to provide a fast and convenient service using MobileNet, which shows high accuracy and training speed.
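A small sketch of how such a diagnosis service might serve predictions from a trained MobileNet on a single leaf photo; the checkpoint path, preprocessing, and class count are hypothetical and for illustration only.

```python
# Sketch: single-image inference for a tomato-disease diagnosis service.
# The weight file path, preprocessing, and class count are hypothetical.
import torch
from PIL import Image
from torchvision import models, transforms

model = models.mobilenet_v2(weights=None, num_classes=10)  # 9 diseases + normal
# model.load_state_dict(torch.load('tomato_mobilenet.pt'))  # hypothetical trained weights
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

@torch.no_grad()
def diagnose(image_path):
    x = preprocess(Image.open(image_path).convert('RGB')).unsqueeze(0)
    return model(x).softmax(dim=1).argmax(dim=1).item()  # predicted class index
```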