
Leveraging Deep Learning and Farmland Fertility Algorithm for Automated Rice Pest Detection and Classification Model

  • Hussain. A (Department of Networking and Communications, College of Engineering and Technology (CET), SRM Institute of Science and Technology);
  • Balaji Srikaanth. P (Department of Networking and Communications, College of Engineering and Technology (CET), SRM Institute of Science and Technology)
  • Received : 2023.11.22
  • Accepted : 2024.03.17
  • Published : 2024.04.30

Abstract

Rice pest identification is essential in modern agriculture for the health of rice crops. As global rice consumption rises, yields and quality must be maintained. Various methodologies have been employed to identify pests, encompassing sensor-based technologies, deep learning, and remote sensing models. Visual inspection by professionals and farmers remains essential, but integrating technologies such as satellites, IoT-based sensors, and drones enhances efficiency and accuracy. A computer vision system processes images to detect pests automatically and provides real-time data for proactive, targeted pest management. With this motive in mind, this research presents a novel farmland fertility algorithm with a deep learning-based automated rice pest detection and classification (FFADL-ARPDC) technique. The FFADL-ARPDC approach classifies rice pests from rice plant images. Before processing, FFADL-ARPDC removes noise and enhances contrast using bilateral filtering (BF) and contrast-limited adaptive histogram equalization. The rice crop images are then processed using the NASNetLarge deep learning architecture to extract image features. The farmland fertility algorithm (FFA) is used for hyperparameter tuning to optimise the model performance of NASNetLarge, which aids in enhancing classification performance. Using an Elman recurrent neural network (ERNN), the model accurately categorises 14 types of pests. The FFADL-ARPDC approach is thoroughly evaluated using a benchmark dataset available in a public repository. With an accuracy of 97.58%, the FFADL-ARPDC model exceeds existing pest detection methods.

Keywords

1. Introduction

Rice is the world's most important food crop for human consumption and remains the keystone of global food security. The rice growth cycle is accompanied by the incidence of various pests, which cause severe yield losses [1]. Accurate detection of rice pests enables early preventive measures that avoid financial losses [2]. Earlier rice pest monitoring techniques mostly used light traps to capture pests, which were then retrieved, counted, and identified the following day. The conventional approach to detecting rice disease relies on visual analysis by plant protection experts [3]. However, this technique is time-consuming and depends heavily on subjective assessment, making it hard to control and prevent rice diseases quickly. With the fast development of software and hardware technologies, the application of artificial intelligence (AI), image identification, big data, and deep learning (DL) has become more frequent in the domain of agriculture, especially in crop disease detection and identification. When diseases affect rice plants, their morphological characteristics and physiological structures are altered, leading to signs such as decay, deformation, and leaf discoloration [4]. This laborious method of assessing the current pest condition is performed manually and leaves room for misdiagnosis and delays in diagnosis [5]. With the steady growth of DL and machine learning (ML), a growing number of researchers are attempting to apply these new methods to pest detection.

Conventionally, the recognition and classification of rice pests and diseases depend on manual visual analysis, which is time-consuming, inaccurate, and inefficient [6]. Such inefficient manual analysis prevents rice production from meeting societal demand, necessitating new identification methods [7]. Developments in computer technology have improved conventional image-processing pipelines, comprising image pre-processing, feature extraction, dimensionality reduction, and detection [8]. These techniques are applied to agricultural images at different resolutions, for instance, to classify the sites and varieties of crop pests and diseases. DL-based target detection and classification algorithms have recently been developed, transforming classical image detection methods [9]. These techniques enable machines to learn image features adaptively without manual feature extraction, allowing identification and classification tasks to be performed proficiently. Deep convolutional networks have also shown effective transfer across rice disease and pest recognition tasks [10].

This article offers the design of a new farmland fertility algorithm with deep learning-based automated rice pest detection and classification (FFADL-ARPDC) technique. The FFADL-ARPDC technique performs bilateral filtering (BF)-based noise removal and contrast enhancement as a pre-processing stage. In addition, the NASNetLarge DL architecture is employed to extract meaningful features from rice crop images. To optimize the model performance of NASNetLarge, the FFA is utilized for the hyperparameter tuning process, which helps in improving the classification performance. Finally, the Elman recurrent neural network (ERNN) is used for pest classification, enabling the model to recognize 14 diverse pest species accurately. The FFADL-ARPDC approach is thoroughly evaluated using the IP102 dataset from the Kaggle repository.

2. Related Works

Gong et al. [11] presented a method for automatic rice pest classification and detection based on fully convolutional networks (FCNs), selecting 10 rice pests for experiments. First, the authors present a novel encoder–decoder FCN with a series of sub-networks connected by jump paths that combine long jumps and shortcut connections for accurate, fine-grained insect boundary recognition. Second, the network incorporates a CRF module for insect contour refinement and boundary localization. Finally, a new DenseNet design that introduces an attention mechanism (ECA) is presented to concentrate on extracting insect edge features for effective rice pest classification. Rahman et al. [12] constructed a DL-based model for identifying pests and diseases in images of rice plants. First, state-of-the-art large-scale architectures such as InceptionV3 and VGG16 are fine-tuned and adopted for detecting and recognizing rice plant diseases and pests. A two-stage small CNN approach is then presented and compared with modern memory-efficient CNN methods such as SqueezeNet, MobileNet, and NASNet Mobile.

Thenmozhi and Reddy [13] introduced a deep convolutional neural network (DCNN) method for classifying insect species. Transfer learning (TL) is applied to fine-tune the pre-trained models, and data augmentation methods such as rotation, translation, reflection, and scaling are used to prevent the network from overfitting. The presented method is compared with pre-trained DL architectures such as GoogLeNet, AlexNet, VGGNet, and ResNet for insect classification and is evaluated on the NBAIR dataset. Prasath and Akila [14] introduced a YOLOv3-based approach for recognizing pests in plants, in which the concerned neurons are optimized by the Adaptive Energy-based Harris Hawks Optimizer (AE-HHO). Deep feature extraction is performed with the VGG16 and ResNet50 models, and classification is accomplished by a Weight-Optimized DNN (WO-DNN), whose weight features are tuned using the AE-HHO method.

Malathi and Gopinath [15] proposed a model for classifying ten insect types in rice crops using deep convolutional neural networks (DCNNs). Different DCNN architectures were used to create the models, and the results were interpreted based on their accuracy and performance. By adjusting the hyperparameters and layers of the ResNet-50 model, the authors applied a transfer learning strategy to the pest dataset; the fine-tuned ResNet-50 outperformed the other models with an accuracy of 95.012%. Data augmentation was employed to expand the dataset, and pre-trained architecture models for pest recognition were also considered. The loss function, determined by comparing predicted and observed outputs, was minimized using ADAM optimization with fine-tuned parameters. Drawing on findings from related articles, the authors underlined the usefulness of deep learning models and transfer learning for categorizing insect pests.

In [16], an IoT-based pest classification and recognition model is presented. First, IoT sensors are employed for image acquisition, and object recognition is performed with the YOLOv3 model. The recognized images are then given as input to a CNN, whose deep features are passed to a Convolutional Neural LSTM (CNLSTM) model; several hyperparameters are optimally tuned by the Adaptive Honey Badger Algorithm (AHBA). Ayan et al. [17] modified and re-trained seven diverse pre-trained CNN models (Xception, VGG-16, SqueezeNet, ResNet50, InceptionV3, MobileNet, and VGG-19) using suitable fine-tuning policies and transfer learning (TL) on the public dataset D0. Three of the CNN models (MobileNet, Inception-V3, and Xception) were then combined through a sum-of-maximum-probabilities policy, termed SMPEnsemble, and the models were subsequently assembled by weighted voting. Peng et al. [18] created HQIP102, a large insect pest dataset that includes 47,393 photos over 102 pest classes on eight crop types; it was obtained by meticulously filtering the IP102 dataset to remove incorrect pest classifications and images lacking pest subjects. To enhance DenseNet's representation capacity, the MADN framework was proposed. It introduces the Selective Kernel unit to adapt the size of the receptive field, the Representative Batch Normalization module to keep features on a consistent distribution, and the ACON activation function to activate neurons adaptively; the MADN model is built through ensemble learning. Several metrics assessed MADN's efficacy, including precision, recall, accuracy, and F1-score. The MADN model outperformed competing models in accuracy and F1-score and used fewer parameters than ResNet-101, although it required more GPU RAM and a longer training period. The summary of the related studies is shown in Table 1.

Table 1. Summary of the related study


2.1 Problem Statement

i. Several approaches have large computing needs because their designs are complicated.

ii. Model performance, especially when using transfer learning methods, is highly dependent on the quality and variety of the datasets used in such methods.

iii. Research shows that the employment of a wide variety of models and optimisation approaches presents difficulties in model selection and parameter tweaking.

iv. Higher GPU memory and extended training durations are the consequences of using ensemble learning techniques.

v. Precise parameter adjustment is required when utilising complicated optimisation techniques.

3. The Proposed Model

In this study, we focus on the design and development of automated rice pest detection and classification using the FFADL-ARPDC technique. The major intention of the FFADL-ARPDC model is to exploit the DL model with a hyperparameter tuning strategy for the identification and categorization of rice pests into different classes. To accomplish this, the FFADL-ARPDC method encompasses pre-processing, NASNetLarge feature extraction, FFA-based hyperparameter tuning, and ERNN-based classification.

3.1 Image Pre-processing

For pre-processing the input images, BF-based noise removal and Contrast Limited Adaptive Histogram Equalization (CLAHE)-based contrast enhancement are applied. BF is designed to smooth images while preserving the fine details and edges in the image [19]. It takes into account both the intensity similarity and spatial proximity between pixels when applying the filter, which makes it efficient in decreasing noise without blurring edges. Fig. 1 depicts the entire flow of the FFADL-ARPDC algorithm.


Fig. 1. Overall flow of FFADL-ARPDC algorithm.

The fundamental notion behind BF is to calculate a weighted average of adjacent pixel values, where each weight is determined by the intensity difference and the spatial distance between a neighbour and the center pixel. Next, CLAHE is an image processing method used to enhance image contrast while avoiding the problem of over-amplifying noise in uniform regions. It is an extension of the classical Histogram Equalization (HE) technique: CLAHE addresses the limitation of classical HE by performing the equalization on smaller, overlapping image regions, called blocks or tiles. This local strategy helps avoid excessive artifacts and the intensification of noise. Every tile's histogram is equalized, but a restriction is imposed on how much the histogram can be stretched; this constraint is the "contrast limiting" aspect of CLAHE. When a histogram bin count exceeds the limit, the excess values are redistributed to the other bins, which prevents drastic amplification.

The CLAHE approach operates on small regions of the image called tiles [20]. The contrast limit within each tile can be fine-tuned so that the histogram computed from that region matches a given histogram shape. Neighboring tiles are then combined using bilinear interpolation so that the merged result looks smooth.

\(\begin{align}\beta=\frac{M}{N}\left(1+\frac{\alpha}{100}\left(S_{\max }-1\right)\right)\end{align}\)       (1)

In Eq. (1), 𝛽 denotes the limit value (clip limit), 𝑆max stands for the maximum allowable slope, and the clip factor 𝛼 represents the addition to the histogram limit on a scale from 1 to 100. The variable 𝑀 represents the region size, and 𝑁 denotes the number of grey levels (256).
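
As a minimal sketch of this pre-processing stage, the snippet below chains bilateral filtering and CLAHE using OpenCV, assuming a BGR input image; the filter parameters and the 8×8 tile grid are illustrative assumptions rather than values reported by the authors, and OpenCV's clipLimit plays the role of the clip limit 𝛽 in Eq. (1), though its normalization differs.

```python
import cv2
import numpy as np

def preprocess(image_bgr: np.ndarray) -> np.ndarray:
    # Bilateral filtering: weights neighbours by both spatial distance and
    # intensity similarity, so noise is smoothed without blurring edges.
    denoised = cv2.bilateralFilter(image_bgr, 9, 75, 75)

    # Apply CLAHE on the luminance channel only, so colours are preserved.
    lab = cv2.cvtColor(denoised, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    l_eq = clahe.apply(l)  # per-tile equalization with contrast limiting
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
```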

3.2 Feature Extraction using NASNetLarge

In this work, the NASNetLarge deep learning architecture is employed to extract meaningful features from rice crop images. The Neural Architecture Search Network (NASNet) is a family of neural network architectures created by an automated method called Neural Architecture Search (NAS) [21]. The objective of NAS is to discover the optimal network architecture for a given task automatically, relieving human engineers of the labour-intensive work of designing architectures manually. NASNetLarge is one such architecture designed by NAS. It is well known for its performance on image classification tasks and its ability to achieve high accuracy on different benchmark datasets. Rather than being designed by hand, NASNetLarge was discovered by an automated process that searched a large space of possible architectures, optimizing both model accuracy and computational efficiency. The network comprises repeating cells, each containing a stack of connected operations, organized hierarchically to form the full network. It also contains skip connections, otherwise called residual connections, which aid gradient flow during training and facilitate the learning of complicated features. The last layers of NASNetLarge include global average pooling, which reduces the spatial dimensions of the feature maps and creates a fixed-size representation for classification. NASNetLarge also exploits different pooling functions, including max and average pooling, to capture information at various scales from the feature maps.
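
As a sketch of this stage, the snippet below builds a NASNetLarge backbone in TensorFlow/Keras with the ImageNet classifier head removed and global average pooling as the output, yielding one 4032-dimensional feature vector per image; the ImageNet weights and the 331×331 input size (the Keras default for NASNetLarge) are assumptions rather than settings reported by the authors.

```python
import numpy as np
from tensorflow.keras.applications import NASNetLarge
from tensorflow.keras.applications.nasnet import preprocess_input

# include_top=False drops the ImageNet classifier; pooling="avg" applies the
# global average pooling described above, giving a fixed-size representation.
extractor = NASNetLarge(weights="imagenet", include_top=False,
                        input_shape=(331, 331, 3), pooling="avg")

def extract_features(images: np.ndarray) -> np.ndarray:
    """images: (N, 331, 331, 3) RGB batch with pixel values in [0, 255]."""
    return extractor.predict(preprocess_input(images.astype("float32")))
```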

3.3 Hyperparameter Tuning using FFA

The FFA is exploited to tune the hyperparameters of the NASNetLarge model. FFA is inspired by the natural fertility of farmland and attempts to enhance every solution through the separation of the farmland into sections and the desirable productivity of internal and external memory [22]. The FFA offers a strong exploration and convergence strategy owing to its external memory (global solutions) and internal memory (local solutions) in optimization problems such as data clustering. Furthermore, the FFA exploits a global solution that ensures model convergence; thus, unlike the approaches studied in the background, the algorithm does not need to be changed dramatically. The results also show that its solutions to data clustering problems are of better quality than those of other techniques. However, a shortcoming of the FFA in data clustering problems is that, as with other metaheuristic systems, increasing the dimensionality of the problem may reduce its efficacy to a certain degree, so the parameters of the FFA should be selected carefully. The step-by-step formulation of the FFA is given below. At first, an initial population is generated using the following expression:

𝑁 = 𝑘 ∗ 𝑛       (2)

where 𝑘 defines the number of sections of the space or land, and 𝑛 denotes the number of solutions existing in each section of the farmland. The optimal values of 𝑘 are considered to lie within [2, 8]. The quality of each area of farmland is obtained as the average of the solutions existing in that section. Eq. (3) first assigns solutions to all the areas.

Sections = 𝑥(𝑎𝑗), 𝑎 = 𝑛 ∗ (𝑠 − 1) : 𝑛 ∗ 𝑠, 𝑠 = 1, 2, …, 𝑘, 𝑗 = 1, 2, …, 𝐷       (3)

where 𝑠 denotes the section number and 𝑗 = 1, 2, …, 𝐷 indexes the dimensions of each solution 𝑥(𝑎𝑗) in the search space. This formula simply partitions the present solutions into segments so that each section's average can be evaluated separately. Eq. (4) defines the quality of every section after segmentation.

Fit_Sections = Mean(all Fit(𝑥𝑖𝑗) in Sections), 𝑠 = 1, 2, …, 𝑘, 𝑖 = 1, 2, …, 𝑛       (4)

where Fit_Sections defines the quality of the corresponding section of solutions. The global and local memories of all the sections are then determined: the number of best overall solutions is defined by Eq. (5), and the number of best local solutions by Eq. (6).

𝑀_Global = round(𝑡 ∗ 𝑁), 0.1 < 𝑡 < 1       (5)

𝑀_Local = round(𝑡 ∗ 𝑛), 0.1 < 𝑡 < 1       (6)

where 𝑀_Global and 𝑀_Local indicate the number of solutions in the global and local memories, respectively; both memories are updated as the search proceeds. According to Eqs. (7) and (8), every solution in the worst section of farmland is hybridized with a solution from the global memory.

h = 𝛼 ∗ rand(−1,1)       (7)

Xnew = ℎ ∗ (Xij − 𝑋𝑀_Global) + Xij       (8)

In Eqs. (7) and (8), 𝛼 refers to a parameter within [0, 1], 𝑋𝑀_Global is a randomly chosen solution among those in the global memory, 𝑋𝑖𝑗 denotes the solution selected from the worst section for modification, and ℎ is a decimal scaling factor. Based on Eqs. (9) and (10), the available solutions in the other sections are also changed.

ℎ = 𝛽 ∗ rand(−1,1)       (9)

Xnew = ℎ ∗ (Xij − 𝑋uj) + Xij       (10)

In Eq. (10), 𝑋𝑢𝑗 denotes a randomly selected solution from the same section, and 𝛽 indicates a number within [0, 1]. 𝑋𝑖𝑗 is the solution in the section chosen for modification, and ℎ denotes the decimal number evaluated using Eq. (9). The combination of a solution with BestGlobal or BestLocal, which updates the results based on local and global memory, is defined in Eq. (11).

\(\begin{align}H=\left\{\begin{array}{cc}X_{\text {new }}=X_{i j}+\omega_{1} *\left(X_{i j}-\text { Best }_{\text {Local }}(b)\right): & Q>\operatorname{rand}(0,1) \\ X_{\text {new }}=X_{i j}+\operatorname{rand}(0,1) *\left(X_{i j}-\text { Best }_{\text {Local }}(b)\right): & \text { else }\end{array}\right.\end{align}\)       (11)

where the 𝑄 value, ranging within [0, 1], defines the extent to which the solutions are combined with BestGlobal. 𝜔1 is a parameter associated with FFA; it must be initialized at the start of the algorithm, and its value decreases with the iterations as follows:

𝜔1 = 𝜔1 ∗ 𝑅𝑣, 0 < 𝑅𝑣 < 1       (12)

At the final stage, the termination condition is checked. If it is satisfied, the algorithm stops; otherwise, the process continues until the stopping criteria are met. The FFA method derives a fitness function (FF) to obtain better classification performance. It defines a positive value that characterizes the quality of a candidate solution. The reduction of the classifier error rate is taken as the FF:

\(\begin{align}\begin{aligned} \text { fitness }\left(x_{i}\right) & =\text { ClassifierErrorRate }\left(x_{i}\right) \\ = & \frac{\text { No.of misclassified instances }}{\text { Total No.of instances }} * 100\end{aligned}\end{align}\)       (13)
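
To make these update rules concrete, the following is a condensed sketch of the FFA search loop following Eqs. (2)-(12), with the classifier error rate of Eq. (13) supplied as the `fitness` callable. The default parameter values, the bound clipping, and the greedy replacement step are illustrative assumptions, and the memory bookkeeping of the published algorithm is simplified. In the FFADL-ARPDC pipeline, each position vector would encode NASNetLarge hyperparameters (e.g., learning rate or batch size), and `fitness` would train and evaluate the classifier.

```python
import numpy as np

def ffa(fitness, dim, lb, ub, k=4, n=10, t=0.3, alpha=0.6, beta=0.4,
        Q=0.7, w1=1.0, Rv=0.9, iters=50):
    N = k * n                                              # Eq. (2)
    X = np.random.uniform(lb, ub, (N, dim))
    fit = np.array([fitness(x) for x in X])
    m_glob = max(1, round(t * N))                          # Eq. (5)

    for _ in range(iters):
        sec_fit = fit.reshape(k, n).mean(axis=1)           # Eq. (4): section quality
        glob_mem = X[np.argsort(fit)[:m_glob]].copy()      # global memory (best solutions)
        worst = int(np.argmax(sec_fit))                    # worst section (highest mean error)

        for s in range(k):
            for i in range(n):
                idx = s * n + i
                x = X[idx]
                if s == worst:                             # Eqs. (7)-(8): mix with global memory
                    h = alpha * np.random.uniform(-1, 1)
                    partner = glob_mem[np.random.randint(m_glob)]
                else:                                      # Eqs. (9)-(10): mix with a section peer
                    h = beta * np.random.uniform(-1, 1)
                    partner = X[s * n + np.random.randint(n)]
                x_new = np.clip(h * (x - partner) + x, lb, ub)
                f_new = fitness(x_new)
                if f_new < fit[idx]:                       # keep improvements only
                    X[idx], fit[idx] = x_new, f_new

        best = X[int(np.argmin(fit))].copy()               # Eq. (11): combine with the best solution
        for j in range(N):
            step = w1 if np.random.rand() < Q else np.random.rand()
            cand = np.clip(X[j] + step * (X[j] - best), lb, ub)
            f = fitness(cand)
            if f < fit[j]:
                X[j], fit[j] = cand, f
        w1 *= Rv                                           # Eq. (12): decay omega_1

    best_idx = int(np.argmin(fit))
    return X[best_idx], fit[best_idx]
```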

3.4 Pest Classification using ERNN Model

In the final stage, the ERNN model is applied to the recognition and classification of pest classes. Elman established the ERNN in 1990 as a simple RNN [23]. The model has the benefits of fast convergence, accurate mapping capability, the ability to consume time series, and nonlinear prediction, and it has been applied in diverse areas. In this architecture, the output of the hidden layer (HL) is fed back to itself through a buffer layer called the recurrent layer (RL). Each hidden neuron is connected to a single RL neuron with a constant weight of one; consequently, the number of recurrent neurons equals the number of hidden neurons. Every layer contains neurons that transmit data to the next layer by computing a nonlinear function of the weighted sum of their inputs. Fig. 2 depicts the structure of the ERNN. The multi-input ERNN considered here has 𝑚 neurons in the input layer, 𝑛 neurons in the HL, and a single output unit.


Fig. 2. Infrastructure of ERNN.

Consider 𝑥𝑖𝑡 (𝑖 = 1, 2, …, 𝑚) as the collection of inputs to the network at time 𝑡, 𝑦𝑡+1 as the output of the network at time 𝑡 + 1, 𝑧𝑗𝑡 (𝑗 = 1, 2, …, 𝑛) as the outputs of the HL neurons at time 𝑡, and 𝑢𝑗𝑡 (𝑗 = 1, 2, …, 𝑛) as the outputs of the RL neurons. 𝑤𝑖𝑗 denotes the weight linking the 𝑖th input-layer node to the 𝑗th HL node, and 𝑐𝑗 indicates the weight connecting the 𝑗th RL neuron back to the HL. The HL phase is given in the following: the input of each neuron in the HL is

\(\begin{align}net_{jt}(k)=\sum_{i=1}^{m} w_{ij} x_{it}(k-1)+\sum_{l=1}^{n} c_{l} u_{lt}(k)\end{align}\)       (14)

𝑢𝑗𝑡(𝑘) = 𝑧𝑗𝑡(𝑘 − 1), 𝑖 = 1, 2, …, 𝑚, 𝑗 = 1, 2, …, 𝑛.

The output of each hidden neuron is given as follows:

\(\begin{align}z_{jt}(k)=f_{H}\left(net_{jt}(k)\right)=f_{H}\left(\sum_{i=1}^{m} w_{ij} x_{it}(k-1)+\sum_{l=1}^{n} c_{l} u_{lt}(k)\right)\end{align}\)       (15)

The sigmoid function is selected as the activation function in the HL: 𝑓𝐻(𝑥) = 1/(1 + 𝑒−𝑥). The network output is given as follows:

\(\begin{align}y_{t+1}(k)=f_{T}\left(\sum_{j=1}^{n} v_{j} z_{jt}(k)\right)\end{align}\)       (16)

In Eq. (16), 𝑓𝑇(𝑥) denotes the identity map used as the output activation function, and 𝑣𝑗 is the weight connecting the 𝑗th hidden neuron to the output unit.
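
A minimal classifier sketch, assuming TensorFlow/Keras: the SimpleRNN layer implements the Elman recurrence (the hidden state is fed back as the context input), the sigmoid activation matches 𝑓𝐻 above, and each 4032-dimensional NASNetLarge feature vector is treated as a length-one sequence. The softmax output over the 14 pest classes replaces the identity map of Eq. (16), a common adaptation for multi-class probabilities; the hidden size and training settings are assumptions, not values reported in the paper.

```python
import tensorflow as tf

NUM_CLASSES = 14           # pest categories classified in this work
FEAT_DIM, STEPS = 4032, 1  # one pooled NASNetLarge feature vector per image

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(STEPS, FEAT_DIM)),
    # Elman-style hidden layer: hidden state fed back through the context layer
    tf.keras.layers.SimpleRNN(128, activation="sigmoid"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# Usage: model.fit(features.reshape(-1, STEPS, FEAT_DIM), labels, epochs=20)
```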

4. Experimental Validation

The performance validation of the FFADL-ARPDC approach is carried out on the IP102 dataset from the Kaggle repository [24]. Fig. 3 illustrates sample images, and Table 2 gives a detailed description of the IP102 dataset.


Fig. 3. Sample images.

Table 2. Description of database


Fig. 4 presents the confusion matrices produced by the FFADL-ARPDC algorithm on the two dataset splits. The simulation values indicate that the FFADL-ARPDC approach identifies and classifies all 14 classes accurately.


Fig. 4. Confusion matrices of (a-b) 80:20 of TR set/TS set and (c-d) 70:30 of TR set/TS set
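
The averaged metrics reported in the tables below can be derived from such confusion matrices. As a brief sketch, assuming scikit-learn and integer arrays of true and predicted labels (the paper's exact averaging convention is an assumption here):

```python
from sklearn.metrics import (accuracy_score, confusion_matrix,
                             precision_recall_fscore_support)

def evaluate(y_true, y_pred):
    cm = confusion_matrix(y_true, y_pred)          # 14x14 matrix, as in Fig. 4
    acc = accuracy_score(y_true, y_pred)
    # Macro average: per-class precision/recall/F-score averaged over classes
    prec, rec, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="macro", zero_division=0)
    return {"accuracy": acc, "precision": prec, "recall": rec,
            "f_score": f1, "confusion_matrix": cm}
```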

In Table 3 and Fig. 5, the rice pest detection and classification results of the FFADL-ARPDC technique on the 80:20 TR set/TS set split are highlighted. The results imply that the FFADL-ARPDC technique accomplishes effectual results under all classes. On the 80% TR set, the FFADL-ARPDC technique offers an average accuy of 97.58%, precn of 83.17%, recal of 77.43%, and Fscore of 79.55%. At the same time, on the 20% TS set, the FFADL-ARPDC technique offers an average accuy of 97.10%, precn of 77.98%, recal of 73.47%, and Fscore of 75.07%.

Fig. 6 demonstrates the training accuracy TR_accuy and validation accuracy VL_accuy of the FFADL-ARPDC technique on the 80:20 TR set/TS set split. The TR_accuy is defined by evaluating the FFADL-ARPDC technique on the TR dataset, whereas the VL_accuy is computed by evaluating the performance on a separate testing dataset. The results exhibit that TR_accuy and VL_accuy increase with an upsurge in epochs. Consequently, the performance of the FFADL-ARPDC technique improves on the TR and TS datasets with an increase in the number of epochs.

Table 3. Rice pest detection and classification outcome of FFADL-ARPDC approach on 80:20 of TR set/TS set


Fig. 5. Average of FFADL-ARPDC system on 80:20 of TR set/TS set


Fig. 6. Accuy curve of FFADL-ARPDC system on 80:20 of TR set/TS set

In Fig. 7, the TR_loss and VR_loss outcomes of the FFADL-ARPDC technique on the 80:20 TR set/TS set split are shown. The TR_loss defines the error between the predicted and original values on the TR data, while the VR_loss measures the performance of the FFADL-ARPDC technique on separate validation data. The results indicate that the TR_loss and VR_loss tend to decrease with rising epochs, depicting the enhanced performance of the FFADL-ARPDC system and its capability to produce accurate classifications.


Fig. 7. Loss curve of FFADL-ARPDC system on 80:20 of TR set/TS set

The reduced values of TR_loss and VR_loss demonstrate the enhanced performance of the FFADL-ARPDC methodology in capturing patterns and relationships. A detailed precision-recall (PR) analysis of the FFADL-ARPDC algorithm on the 80:20 TR set/TS set split is depicted in Fig. 8. The simulation values infer that the FFADL-ARPDC approach results in greater PR values across all 14 classes.


Fig. 8. PR curve of FFADL-ARPDC system on 80:20 of TR set/TS set

In Fig. 9, a ROC analysis of the FFADL-ARPDC system on the 80:20 TR set/TS set split is shown. The figure indicates that the FFADL-ARPDC methodology attains superior ROC values across all 14 classes.


Fig. 9. ROC of FFADL-ARPDC system on 80:20 of TR set/TS set

In Table 4 and Fig. 10, the rice pest detection and classification outcomes of the FFADL-ARPDC methodology on the 70:30 TR set/TS set split are highlighted. The results imply that the FFADL-ARPDC system accomplishes effectual outcomes under all classes. On the 70% TR set, the FFADL-ARPDC technique offers an average accuy of 95.87%, precn of 72.33%, recal of 62.97%, and Fscore of 65.87%. Simultaneously, on the 30% TS set, the FFADL-ARPDC method offers an average accuy of 95.58%, precn of 68.21%, recal of 60.86%, and Fscore of 62.92%.

Table 4. Rice pest detection and classification outcome of FFADL-ARPDC method on 70:30 of TR set/TS set


Fig. 10. Average of FFADL-ARPDC method on 70:30 of TR set/TS set

Fig. 11 shows the training accuracy TR_accuy and validation accuracy VL_accuy of the FFADL-ARPDC technique on the 70:30 TR set/TS set split. The TR_accuy is determined by evaluating the FFADL-ARPDC technique on the TR dataset, whereas the VL_accuy is computed by evaluating the performance on a separate testing dataset. The outcomes exhibit that TR_accuy and VL_accuy increase with an upsurge in epochs. Therefore, the performance of the FFADL-ARPDC technique improves on the TR and TS datasets with a rise in the number of epochs.


Fig. 11. Accuy curve of FFADL-ARPDC methodology on 70:30 of TR set/TS set

In Fig. 12, the TR_loss and VR_loss results of the FFADL-ARPDC method on the 70:30 TR set/TS set split are shown. The TR_loss defines the error between the predicted and original values on the TR data, while the VR_loss measures the performance of the FFADL-ARPDC system on separate validation data. The results indicate that the TR_loss and VR_loss tend to reduce with rising epochs, depicting the enhanced performance of the FFADL-ARPDC method and its capability to generate accurate classifications. The reduced values of TR_loss and VR_loss demonstrate the improved performance of the FFADL-ARPDC method in capturing patterns and relationships.


Fig. 12. Loss curve of FFADL-ARPDC system on 70:30 of TR set/TS set

A comprehensive PR curve of the FFADL-ARPDC method on the 70:30 TR set/TS set split is depicted in Fig. 13. The simulation values state that the FFADL-ARPDC methodology results in higher PR values and reaches maximum PR values across all 14 classes.


Fig. 13. PR curve of FFADL-ARPDC system on 70:30 of TR set/TS set

In Fig. 14, a ROC analysis of the FFADL-ARPDC system on the 70:30 TR set/TS set split is shown. The outcome indicates that the FFADL-ARPDC system attains enhanced ROC values across all 14 classes.


Fig. 14. ROC of FFADL-ARPDC system on 70:30 of TR set/TS set

The comparative pest detection results of the FFADL-ARPDC technique are demonstrated in Table 5 and Fig. 15 [25, 26].

Table 5. Comparative outcome of FFADL-ARPDC approach with recent approaches


Fig. 15. Comparative outcome of FFADL-ARPDC approach with recent approaches

The outcomes exhibit that the VGG16, MobileNet, ShuffleNet V2, CTPDR-ECAAM, and CA+INCV techniques offer poor performance, while the ResNet50 and Leaky Relu techniques provide slightly better results. Nevertheless, the FFADL-ARPDC technique illustrates the best results, with precn of 83.17%, recal of 77.43%, Fscore of 79.55%, and accuy of 97.58%.

In Table 6 and Fig. 16, the comparative computation time (CT) results of the FFADL-ARPDC technique are provided. The results indicate that the FFADL-ARPDC technique offers effectual performance with a minimal CT of 0.89s. In contrast, the existing ResNet50, VGG16, MobileNet, ShuffleNet V2, CTPDR-ECAAM, Leaky Relu, and CA+INCV techniques incur higher CT values of 2.02s, 2.48s, 1.08s, 2.43s, 2.53s, 1.07s, and 1.45s, respectively.

Table 6. CT outcome of FFADL-ARPDC approach with recent approaches


Fig. 16. CT outcome of FFADL-ARPDC approach with recent approaches

The FFADL-ARPDC approach represents substantial progress in the domain of agricultural image analysis, especially for the classification of rice pests from plant images. By integrating the NASNetLarge DL architecture for feature extraction, the FFA for tuning model parameters, and the ERNN for pest identification, this method demonstrates a synergy of cutting-edge techniques. The use of FFA drives the search toward optimal solutions by adapting the model to the intricacies of agricultural analytics, improving its robustness and accuracy. The combined components of FFADL-ARPDC not only address the challenges of rice pest classification but also pave the way for tailored solutions to agricultural image-based tasks and offer valuable insights into optimizing DL approaches for real-world agricultural applications. Thus, the FFADL-ARPDC algorithm achieved the best performance among the compared approaches.

5. Conclusion

In this study, we focused on the design and development of automated rice pest detection and classification using the FFADL-ARPDC technique. The major intention of the FFADL-ARPDC model is to exploit the DL model with a hyperparameter tuning strategy for the identification and categorization of rice pests into different classes. To accomplish this, the FFADL-ARPDC method encompasses pre-processing, NASNetLarge feature extraction, FFA-based hyperparameter tuning, and ERNN-based classification. To optimize the model performance of NASNetLarge, the FFA is utilized for the hyperparameter tuning process, which helps in improving the classification performance. The FFADL-ARPDC approach is thoroughly evaluated using the IP102 dataset from the Kaggle repository and compared against traditional detection methods, demonstrating its superior performance in accurately detecting and classifying pests. In future work, a feature fusion approach can be derived to improve the performance of the pest detection process.

Acknowledgement

We, Hussain. A and Balaji Srikaanth. P, state that the content of this article entitled "Leveraging Deep Learning and Farmland Fertility Algorithm for Automated Rice Pest Detection and Classification Model" does not contain any conflict of interest.

References

  1. S. Jain et al., "Automatic Rice Disease Detection and Assistance Framework Using Deep Learning and a Chatbot," Electronics, vol. 11, no. 14, p. 2110, Jul. 2022.
  2. Y. Hu, X. Deng, Y. Lan, X. Chen, Y. Long, and C. Liu, "Detection of Rice Pests Based on SelfAttention Mechanism and Multi-Scale Feature Fusion," Insects, vol. 14, no. 3, p. 280, Mar. 2023.
  3. A. A. J. V. Priyangka and I. M. S. Kumara, "Classification of Rice Plant Diseases Using the Convolutional Neural Network Method," Lontar Komputer: Jurnal Ilmiah Teknologi Informasi, vol. 12, no. 2, pp. 123-129, Aug. 2021.
  4. P. Tejaswini, P. Singh, M. Ramchandani, Y. K. Rathore, and R. R. Janghel, "Rice Leaf Disease Classification Using Cnn," IOP Conference Series: Earth and Environmental Science, vol. 1032, no. 1, p. 012017, Jun. 2022.
  5. S. I. Hassan, M. M. Alam, U. Illahi, and M. Mohd Suud, "A new deep learning-based technique for rice pest detection using remote sensing," PeerJ Computer Science, vol. 9, p. e1167, Mar. 2023.
  6. V. K. Shrivastava, M. K. Pradhan, and M. P. Thakur, "Application of Pre-Trained Deep Convolutional Neural Networks for Rice Plant Disease Classification," in Proc. of 2021 International Conference on Artificial Intelligence and Smart Systems (ICAIS), Mar. 2021.
  7. K. Abdalgader and J. H. Yousif, "Agricultural Irrigation Control using Sensor-enabled Architecture," KSII Transactions on Internet and Information Systems, vol. 16, no. 10, pp. 3275-3298, 2022.
  8. R. Sharma, V. Kukreja, and V. Kadyan, "Hispa Rice Disease Classification using Convolutional Neural Network," in Proc. of 2021 3rd International Conference on Signal Processing and Communication (ICPSC), May 2021.
  9. H. Kang et al., "A Novel Deep Learning Model for Accurate Pest Detection and Edge Computing Deployment," Insects, vol. 14, no. 7, p. 660, Jul. 2023.
  10. H. T. Ung, H. Q. Ung, T. T. Nguyen, and B. T. Nguyen, "An Efficient Insect Pest Classification Using Multiple Convolutional Neural Network Based Models," New Trends in Intelligent Software Methodologies, Tools and Techniques, vol. 355, pp. 584-595, Sep. 2022.
  11. H. Gong et al., "Based on FCN and DenseNet Framework for the Research of Rice Pest Identification Methods," Agronomy, vol. 13, no. 2, p. 410, Jan. 2023.
  12. C. R. Rahman et al., "Identification and recognition of rice diseases and pests using convolutional neural networks," Biosystems Engineering, vol. 194, pp. 112-120, Jun. 2020.
  13. K. Thenmozhi and U. Srinivasulu Reddy, "Crop pest classification based on deep convolutional neural network and transfer learning," Computers and Electronics in Agriculture, vol. 164, p. 104906, Sep. 2019.
  14. B. Prasath and M. Akila, "IoT-based pest detection and classification using deep features with enhanced deep learning strategies," Engineering Applications of Artificial Intelligence, vol. 121, p. 105985, May 2023.
  15. V. Malathi and M. P. Gopinath, "Classification of pest detection in paddy crop based on transfer learning approach," Acta Agriculturae Scandinavica, Section B - Soil & Plant Science, vol. 71, no. 7, pp. 552-559, Feb. 2021.
  16. A. B. Kathole, K. N. Vhatkar, and S. D. Patil, "IoT-Enabled Pest Identification and Classification with New Meta-Heuristic-Based Deep Learning Framework," Cybernetics and Systems, vol. 55, no. 2, pp. 380-408, 2024.
  17. E. Ayan, H. Erbay, and F. Varcin, "Crop pest classification with a genetic algorithm-based weighted ensemble of deep convolutional neural networks," Computers and Electronics in Agriculture, vol. 179, p. 105809, Dec. 2020.
  18. H. Peng et al., "Crop pest image classification based on improved densely connected convolutional network," Frontiers in Plant Science, vol. 14, Apr. 2023.
  19. B. Goyal, A. Dogra, S. Agrawal, B. S. Sohi, and A. Sharma, "Image denoising review: From classical to state-of-the-art approaches," Information Fusion, vol. 55, pp. 220-244, Mar. 2020.
  20. M. Hayati et al., "Impact of CLAHE-based image enhancement for diabetic retinopathy classification through deep learning," Procedia Computer Science, vol. 216, pp. 57-66, 2023.
  21. M. Mehmood et al., "Improved colorization and classification of intracranial tumor expanse in MRI images via hybrid scheme of Pix2Pix-cGANs and NASNet-large," Journal of King Saud University - Computer and Information Sciences, vol. 34, no. 7, pp. 4358-4374, Jul. 2022.
  22. F. S. Gharehchopogh and H. Shayanfar, "Automatic Data Clustering Using Farmland Fertility Metaheuristic Algorithm," in Advances in Swarm Intelligence, Oct. 2022, pp. 199-215.
  23. N. P. Kumar, S. Vijayabaskar, L. Murali, and K. Ramaswamy, "Design of optimal Elman Recurrent Neural Network based prediction approach for biofuel production," Scientific Reports, vol. 13, no. 1, May 2023.
  24. Kaggle Dataset.
  25. Z. Li, X. Jiang, X. Jia, X. Duan, Y. Wang, and J. Mu, "Classification Method of Significant Rice Pests Based on Deep Learning," Agronomy, vol. 12, no. 9, p. 2096, Sep. 2022.
  26. H. Ni et al., "Classification of Typical Pests and Diseases of Rice Based on the ECA Attention Mechanism," Agriculture, vol. 13, no. 5, p. 1066, May 2023.