
Efficient Forest Fire Detection using Rule-Based Multi-color Space and Correlation Coefficient for Application in Unmanned Aerial Vehicles

  • Anh, Nguyen Duc (Department of Fire Engineering and Technology, University of Fire Prevention and Fighting) ;
  • Van Thanh, Pham (Department of Fire Engineering and Technology, University of Fire Prevention and Fighting) ;
  • Lap, Doan Tu (Fire Inspection Division, Vietnam Fire and Rescue Police Department) ;
  • Khai, Nguyen Tuan (Department of Electrical/Electronic and Computer Engineering, University of Ulsan) ;
  • Van An, Tran (Faculty of Basic Sciences and Foreign Languages, University of Fire Prevention and Fighting) ;
  • Tan, Tran Duc (Faculty of Electrical and Electronic Engineering, Phenikaa University) ;
  • An, Nguyen Huu (Department of Fire Engineering and Technology, University of Fire Prevention and Fighting) ;
  • Dinh, Dang Nhu (Department of Fire Engineering and Technology, University of Fire Prevention and Fighting)
  • Received : 2021.02.08
  • Accepted : 2021.12.19
  • Published : 2022.02.28

Abstract

Forest fires inflict great loss of human life and serious damage to ecological systems. Hence, numerous fire detection methods have been proposed, one of which is sensor-based fire detection. However, sensor-based methods reveal several limitations when applied in large spaces like forests, such as high cost, high false alarm rates, and limited battery capacity. In this research, we propose a novel forest fire detection method based on image processing and the correlation coefficient. Firstly, two fire detection conditions are applied in the RGB color space to distinguish fire pixels from the background. Secondly, the image is converted from RGB to YCbCr color space, where two further fire detection conditions are applied. Finally, the correlation coefficient is used to distinguish real fires from objects with fire-like colors. Our proposed algorithm is tested and evaluated on eleven fire and non-fire videos collected from the internet, and achieves an F-score of 95.87% and an accuracy of 97.89% in the performance evaluation.

Keywords

1. Introduction

In recent decades, forest fires have never ceased to be a severe threat to the ecosystem, not only depleting one of the most significant resources for human life, but also disrupting local natural structures. According to the Vietnam Administration of Forestry, there are thousands to hundreds of thousands of forest fire hotspots each year in Vietnam [1], as shown in Fig. 1.


Fig. 1. The Statistics of Forest Fire Hotspots from 2011 to 2019 in Vietnam [1]

Based on Fig. 1, the number of forest fire hotspots in Vietnam has been increasing over the years. Only 8,469 forest fire hotspots were recorded in 2011, but the figure dramatically increased to 436,005 in 2019. Furthermore, Fig. 2 shows that three consecutive months (February to April) witnessed the highest number of hotspots in 2019, and forests with high fire risk levels (level 3 to level 5) were scattered throughout the country in 2020 [1]. This raises a serious concern that forest fires can occur at any time and anywhere.


Fig. 2. The Statistics of Forest Fire Hotspots in 12 months in 2019 in Vietnam [1]

In June 2019, a huge forest fire occurred in Ha Tinh province, Vietnam, destroying more than 65 hectares of forest (as shown in Fig. 3), 70% of which cannot recover naturally [2].


Fig. 3. The forest fire occurring in Ha Tinh province, Vietnam, in June 2019 [3]-[5].

For indoor fire detection, a large proportion of alarm systems are sensor-based (infrared sensors, optical sensors, etc.), which, if used alone, can only indicate whether there is a fire. Other information, such as the fire scale and fire parameters like temperature, smoke concentration, and fire edge, cannot be obtained this way. For outdoor fires, especially forest fires, sensor-based systems have become insufficient for increasingly demanding requirements.

One of the most popular recent approaches for forest fire detection is the rule-based color space method, thanks to its wide detection range, fast response time, high flexibility, and low cost. Many studies on fire detection using traditional image/video processing methods have been conducted [6]-[11], [15], [17]-[20]. However, the majority of them are either kept for the sole use of an organization or government, or published as complete commercial systems/applications without revealing their algorithms. Still, some notable works have been published in recent years, and a large number of them are based on color pixel recognition and/or motion detection. In 2003, T. H. Chen et al. [6] proposed an intelligent real-time video processing method in RGB space with low computational complexity. In this approach, fire pixels are extracted based on color features, which yields promising results in early fire detection. However, Chen's approach is still prone to uncertainties such as fire-colored objects, and requires limited camera mobility to maintain its accuracy. His next study in 2004 [7] shows improvements by using the flame difference measured between two consecutive frames. B. Uğur Töreyin et al.'s works in 2006 [8] and 2007 [9], based on temporal and spatial modeling of flames and representation of fire boundaries in the wavelet domain, gave fruitful results. In 2009, Celik and Demirel [10] proposed using the YCbCr color space due to its illumination advantage over RGB. This approach diminishes the underperformance of RGB-based methods when illumination changes.

Another popular approach is deep learning, which has been widely applied in forest fire detection; the reported experimental results demonstrate its great potential [22].

In [23], the authors proposed a forest fire detection algorithm using aerial images from a UAV and the YOLOv3 algorithm. Image acquisition is performed by visible and infrared cameras built into the UAV. The onboard computer carried by the UAV, running the embedded YOLOv3-tiny algorithm, performs local image processing and mission planning, and the relevant results are transmitted to the ground station for detecting and diagnosing forest fires. In [24], the authors proposed three deep neural network models for fire detection: Adaboost-MLP, Adaboost-LBP, and a convolutional neural network (CNN). In Adaboost-MLP, sensor data are used for fire forecasting, while the Adaboost-LBP model is combined with the CNN model for fire detection based on images taken from surveillance cameras. This method achieved an accuracy of over 99%, but it requires a huge amount of computation.

Pan et al. [25] proposed an additive deep neural network (Addnet) for wildfire detection based on a multiplication-free vector operator, using images taken from surveillance cameras and the internet. Addnet saves time in comparison with a CNN. However, high-resolution images still require long computation times.

Tung Xuan Truong et al. [11] utilized a support vector machine algorithm, in which fire-colored regions are segmented from moving regions and particular parameters are extracted from the fire region's temporal-spatial characteristics, to yield an effective fire detection method with a low false alarm rate. However, this method reveals limitations in distinguishing certain fire-colored objects, such as a flying red flag.

Detecting fires, including forest fires, at an early stage is significant, since it can reduce the potential damage to the ecosystem as well as the loss of human lives. Forest fire detection based on deep learning methods may introduce processing delays because they require a huge amount of computation and consume excessive energy. Such delays may allow the fire to spread or even grow out of control. Therefore, the traditional methods adopted in this study are more suitable for forest fire detection using UAVs, because they can respond in real time with low computation compared with deep learning methods.

Given the aforementioned limitations, we propose a novel algorithm that combines two color spaces, RGB and YCbCr, with the correlation coefficient between two consecutive frames. Firstly, we apply four conditions to detect and recognize the flame area. The image is then converted to binary form, with white pixels representing fire-colored pixels and black pixels representing all others. The correlation coefficient is finally applied to eliminate non-fire pixels. The paper is organized as follows: Section 1 has presented forest fire trends in Vietnam, related works, the shortcomings of these publications, and the novel contributions of this research. Section 2 discusses the design of our proposed algorithm. The experimental results and relevant issues are discussed in Section 3. Finally, the conclusion of this paper and suggested further developments of our research are given in Section 4.

2. Materials and method

Forest fires can be caused by a variety of factors, and they tend to prevail in the dry season, when plants are more combustible. During the daytime, fire spreads faster than at night due to the higher temperature and (usually) lower humidity. Besides, the terrain also affects the speed of fire spread; for example, a fire spreads more quickly up a slope than down it. Other factors that must be considered are wind, air convection, and fuel types [12]-[13]. Therefore, when a forest fire happens, it can be easily detected from above through smoke and fire colors.

2.1 Forest fire Detection in RGB Images

2.1.1 Sign identification

A forest fire can be detected via various signs: heat rise, fire sight, smoke color, etc. Nevertheless, in this research we mainly focus on the properties of fire color. Fire color is chiefly determined by the burning material and the temperature. Through analysis, isolation, and separation, it can be used as a productive aid for fire recognition and identification. Table 1 below presents a list of fire and smoke colors for different materials [14]:

Table 1. The list of fire and smoke colors based on the materials [14]


As can be seen from Table 1, fire color lies mostly in the red-to-yellow band when the burning material is wood. Hence, forest fire color is mainly in this range. An RGB forest fire image can be divided into its R, G, and B channels as in Fig. 4:


Fig. 4. a) The original RGB image and its channels: b) R channel, c) G channel, and d) B channel.

It is obvious that the fire region can be recognized by its highest intensity in the R channel. The intensity of the fire pixels decreases significantly in G and especially in B. These findings are consistent with the results of T. H. Chen et al.'s work [6], i.e. a fire region in an RGB image can be detected where R ≥ G and G > B. Nevertheless, non-fire yellow-red pixels in the image (e.g. a fireman wearing a red or yellow outfit, or a red fire truck) can be wrongly detected as fire pixels. Thus, the forest fire detection in our proposed algorithm combines multiple conditions in different color spaces.

2.1.2 Forest fire detection algorithm in RGB image

As mentioned above, our study only focuses on forest fires, so the targeted color range is from red to yellow. For a pixel (x, y) to be a fire pixel, it must satisfy the following condition (condition 1):

\(\mathrm{P}(\mathrm{x}, \mathrm{y})=\left\{\begin{array}{c} 1, \text { if } \mathrm{R}(\mathrm{x}, \mathrm{y})>G(x, y)>B(x, y) \\ 0, \text { Otherwise } \end{array}\right.\)       (1)
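As an illustration, condition (1) can be expressed as a vectorized mask over the whole frame. The sketch below is ours rather than the paper's MATLAB implementation; it assumes the input frame is an H x W x 3 RGB array, and the function name `condition1_mask` is hypothetical:

```python
import numpy as np

def condition1_mask(rgb):
    """Condition (1): mark pixels whose channels satisfy R > G > B.

    `rgb` is an H x W x 3 uint8 array in RGB channel order; the result is
    a boolean H x W mask where True marks a candidate fire pixel.
    """
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    return (r > g) & (g > b)
```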

Based on this analysis, we tested condition (1) and obtained the results shown in Fig. 5 below:


Fig. 5. The results of applying the condition (1)

The result shows that condition 1 alone is too permissive: pixels in the same color range are incorrectly passed, as shown by the white region inside the red outline in Fig. 5. From the results in Fig. 5, it can be seen that using condition 1 alone is not sufficient for forest fire detection. Hence, we propose combining conditions 1 and 2 to overcome the limitations of condition 1. The details of condition 2 are as follows:

A fire image has been analyzed in the R, G, and B channels as in Fig. 6. Based on the coordinates of the flame area in the original image in Fig. 6a, we can recognize the fire pixels in the R, G, and B channels, as shown by the red circles in Fig. 6b, Fig. 6c, and Fig. 6d. It is plain to see that the intensity of the fire pixels depends on the specific channel. We conclude that a pixel considered part of the fire zone needs to satisfy the following condition (condition 2):


Fig. 6. The analysis of fire image in R, G, B channels: a) Original image, b) R channel, c) G channel, d) B channel

\(P(x, y)=\left\{\begin{array}{c} 1, \text { if }\left(R_{(x, y)}>R_{R}\right) \cap\left(G_{(x, y)}>R_{G}\right) \cap\left(B_{(x, y)}<R_{B}\right) \\ 0, \text { Otherwise } \end{array}\right.\)       (2)

where RR, RG, and RB are threshold coefficients for different fire forms. Based on the analysis and experimental results, we chose the optimal threshold values of RR, RG, and RB for different forest fire and non-fire images, as listed in Table 2 below:

Table 2. The threshold values of RR, RG, and RB

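Condition (2) simply adds per-channel thresholds on top of the color ordering. Because the optimal values of RR, RG, and RB are given in Table 2, the numbers in the sketch below are placeholder assumptions for illustration only, not the paper's values:

```python
import numpy as np

# NOTE: placeholder thresholds for illustration only; the paper's optimal
# values of R_R, R_G and R_B are those listed in Table 2.
R_R, R_G, R_B = 190, 100, 140

def condition2_mask(rgb, r_thr=R_R, g_thr=R_G, b_thr=R_B):
    """Condition (2): R above R_R, G above R_G, and B below R_B."""
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    return (r > r_thr) & (g > g_thr) & (b < b_thr)
```

In this reading, the combined RGB mask is the logical AND of `condition1_mask` and `condition2_mask`.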

With the combination of (1) and (2), we tested again and obtained the result as in Fig. 7:


Fig. 7. The results of applying the condition (1) and condition (2).

This combination overcomes some errors caused by pixels with colors similar to that of a fire; however, in the fourth image (Fig. 7b), it can be seen that areas with high illumination are still mistaken for fire, even though fire-like colored objects were removed in the results shown in Fig. 7a and Fig. 7c. In our study, we therefore use both conditions mentioned above, along with a detection algorithm in the YCbCr color space.

2.2 Forest fire detection in YCbCr Space

2.2.1 Identification signs

The YCbCr space is utilized because it separates brightness (luminance) information from chrominance better than other color spaces. The following formula converts RGB to the YCbCr color space [19]:

\(\left[\begin{array}{c} Y \\ C b \\ C r \end{array}\right]=\left[\begin{array}{ccc} 0.2568 & 0.5041 & 0.0979 \\ -0.1482 & -0.2910 & 0.4392 \\ 0.4392 & -0.3678 & -0.0714 \end{array}\right] *\left[\begin{array}{l} R \\ G \\ B \end{array}\right]+\left[\begin{array}{c} 16 \\ 128 \\ 128 \end{array}\right]\)       (3)
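A direct per-pixel translation of the conversion in (3) is sketched below (a library routine such as MATLAB's rgb2ycbcr implements the same BT.601 mapping; the function name here is ours):

```python
import numpy as np

# Conversion matrix and offset vector from equation (3).
M = np.array([[ 0.2568,  0.5041,  0.0979],
              [-0.1482, -0.2910,  0.4392],
              [ 0.4392, -0.3678, -0.0714]])
OFFSET = np.array([16.0, 128.0, 128.0])

def rgb_to_ycbcr(rgb):
    """Apply equation (3) to every pixel of an H x W x 3 RGB image.

    Returns a float H x W x 3 array whose channels are Y, Cb and Cr.
    """
    return rgb.astype(np.float64) @ M.T + OFFSET
```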

We separate an image into Y, Cb and Cr elements as in Fig. 8:


Fig. 8. An original image separated into Y, Cb, and Cr elements: a) the original RGB image, b) Y channel, c) Cb channel, d) Cr channel respectively.

From our analysis of various images like Fig. 8, consider the example of a fire pixel at coordinates (805, 547): the pixel values in the Y, Cb, and Cr channels are 189, 88, and 162 respectively, i.e. Y(805, 547) > Cr(805, 547) > Cb(805, 547). Analysis of other fire pixels shows the same trend: Y(x, y) ≥ Cr(x, y) > Cb(x, y). Hence, we propose that a pixel is confirmed as a fire pixel when it satisfies condition 3 below:

\(\mathrm{P}(\mathrm{x}, \mathrm{y})=\left\{\begin{array}{c} 1, \text { if } \mathrm{Y}(\mathrm{x}, \mathrm{y}) \geq \operatorname{Cr}(\mathrm{x}, \mathrm{y}) \text { and } \operatorname{Cr}(\mathrm{x}, \mathrm{y}) \geq \operatorname{Cb}(\mathrm{x}, \mathrm{y}) \\ 0, \text { Otherwise } \end{array}\right. \)       (4)
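Continuing the sketch, condition (3) is a pure ordering test on the converted channels (same hypothetical naming as above):

```python
def condition3_mask(ycbcr):
    """Condition (3): Y(x,y) >= Cr(x,y) and Cr(x,y) >= Cb(x,y)."""
    y, cb, cr = ycbcr[..., 0], ycbcr[..., 1], ycbcr[..., 2]
    return (y >= cr) & (cr >= cb)
```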

When the original image is converted from RGB to the YCbCr color space, the fire pixels appear "white" in the Y and Cr channels and "black" in the Cb channel (see Fig. 8 for more details). The images in the Y, Cb, and Cr channels are grayscale images with values in the range [0, 255]; these values create different shades of gray, with 0 and 255 corresponding to black and white respectively. Therefore, the fire pixel values in the Y and Cr channels are greater than the mean values of the Y and Cr components, while the fire pixel values in the Cb channel are smaller than the mean value of the Cb component. The mean values are calculated by the following formulas:

\(Y_{\text {mean }}=\frac{1}{N * M} \sum_{x=1}^{N} \sum_{y=1}^{M} Y(x, y)\)       (5)

\(C b_{\text {mean }}=\frac{1}{N * M} \sum_{x=1}^{N} \sum_{y=1}^{M} C b(x, y)\)       (6)

\(C r_{\text {mean }}=\frac{1}{N * M} \sum_{x=1}^{N} \sum_{y=1}^{M} C r(x, y)\)       (7)

where Ymean, Cbmean, and Crmean are the mean values of the Y, Cb, and Cr channels respectively, and N*M is the total number of pixels. Thus, a pixel is confirmed as a fire pixel when it satisfies the following formula (condition 4):

\(\mathrm{P}(\mathrm{x}, \mathrm{y})=\left\{\begin{array}{c} 1, \text { if } \mathrm{Y}(\mathrm{x}, \mathrm{y})>Y_{\text {mean }} \cap \mathrm{Cr}(\mathrm{x}, \mathrm{y})>C r_{\text {mean }} \cap \mathrm{Cb}(\mathrm{x}, \mathrm{y})<C b_{\text {mean }} \\ 0, \text { Otherwise } \end{array}\right.\)       (8)
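Likewise, condition (4) compares each pixel against its channel mean from equations (5)-(7); a minimal sketch:

```python
def condition4_mask(ycbcr):
    """Condition (4): fire pixels exceed the Y and Cr channel means
    (equations (5) and (7)) and fall below the Cb channel mean (equation (6))."""
    y, cb, cr = ycbcr[..., 0], ycbcr[..., 1], ycbcr[..., 2]
    return (y > y.mean()) & (cr > cr.mean()) & (cb < cb.mean())
```

The overall candidate mask is then the logical AND of all four condition masks.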

We present the result of combining all four previously mentioned conditions (1), (2), (3), and (4) as in Fig. 9 below:


Fig. 9. Forest fire detection using conditions (1), (2), (3), and (4)

From the result above, it can be seen that uncertainties from high-illumination images, such as sunlight and heat-affected areas, are negligible. However, images whose subjects have the same color as a fire, such as a painting of a fire or a sunflower hill, are still not correctly rejected, as shown in Fig. 10b. To tackle this weakness, we propose using an image correlation function to compare consecutive fire-colored frames extracted from the camera recording the forest fire. Our suggestion is to exploit properties of fire, such as its shape and size, which vary between images captured at different times.


Fig. 10. Forest fire detection using conditions (1), (2), (3), and (4)

2.3 The correlation coefficient

As mentioned above, this method is based on the temporal change of fire. Without it, fire-colored subjects, such as a picture of a fire or sunlight, can be incorrectly detected. This step immensely increases the accuracy of fire recognition.

2.3.1 Identification signs

After applying the proposed conditions in the color space models, the algorithm still shows limitations in identifying objects whose color is similar to a fire's. Flickering is a characteristic of the burning process resulting from airflow exchange, and the shape and size of the flame change continuously during this process. Fig. 11 shows frames recorded by an unmanned aerial vehicle (UAV) carrying a camera over a forest fire, with a time difference of 0.2 s between them. It can be seen that the flickering manifests as changes in the height and scale of the flame around the fire base.


Fig. 11. Flickering effect caused by the change of fire shape and size.

2.3.2 Correlation coefficient between consecutive frames

The correlation coefficient is determined by comparing consecutive frames Fk and Fk-1 after applying the processing algorithm that identifies fire color properties. The frames are now in binary form, with white pixels representing fire-colored pixels and black pixels representing all others.

The purpose of this method is to compare consecutive frames that represent a fire once conditions (1), (2), (3), and (4) are satisfied. On that basis, we find an optimal range of correlation coefficients r that can both capture the fire flickering properties and diminish the UAV camera shaking effect.

According to [16], the correlation coefficient between 2 frames A and B is calculated as follows:

\(r=\frac{\sum_{m} \sum_{n}\left(A_{m n}-\bar{A}\right)\left(B_{m n}-\bar{B}\right)}{\sqrt{\sum_{m} \sum_{n}\left(A_{m n}-\bar{A}\right)^{2} \sum_{m} \sum_{n}\left(B_{m n}-\bar{B}\right)^{2}}}\)       (9)

where A and B are the binary images after applying the four proposed conditions (1), (2), (3), and (4); Amn and Bmn are the pixel values at (m, n) in A and B; \(\overline{\mathrm{A}}, \overline{\mathrm{B}}\) are the mean pixel values of A and B; and m, n index the rows and columns of the matrices. Fig. 12 presents the flowchart of our proposed algorithm. When an image satisfies conditions 1 and 2 in the RGB color space and conditions 3 and 4 in the YCbCr color space, it is confirmed as a fire candidate and saved in a stack. The next image is then processed, and if it also satisfies all four proposed conditions, the correlation coefficient between these images is calculated for final confirmation. To propose the correlation coefficient threshold, we collected two sets of videos, comprising forest fire and non-fire videos, applied the four proposed conditions and the correlation coefficient, and derived the threshold from the statistical results.
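Equation (9) is the standard 2-D correlation coefficient of mean-removed images (what MATLAB's corr2 computes); a NumPy sketch, which also reproduces the NaN values reported in Section 3 when a mask is uniform:

```python
import numpy as np

def corr2(a, b):
    """Correlation coefficient of two equally sized binary masks, equation (9)."""
    a = a.astype(np.float64) - a.mean()
    b = b.astype(np.float64) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        # A frame with no (or all) candidate pixels has zero variance, so
        # the coefficient is undefined -- reported as NaN in Section 3.
        return float("nan")
    return float((a * b).sum() / denom)
```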


Fig. 12. Our proposed fire detection algorithm

The final confirmation that a fire has occurred is made when the correlation coefficient value lies within the proposed threshold range; otherwise, it is confirmed that there is no fire.

3. Results and Discussion

This section presents the experimental results of our proposed method. The proposed method and the comparison methods were implemented in the MATLAB environment (R2016a) on an Intel(R) Core(TM) i5-3337U CPU @ 1.80 GHz with 4 GB RAM.

Eleven videos were collected from the Internet [26]-[36] to measure and evaluate the performance of our proposed algorithm. These videos were separated into 18 sets of fire frames and 30 sets of non-fire frames, with 0.2 s between two consecutive frames. Each set contains a different real-world fire or non-fire scenario. We randomly selected 9 sets of fire frames and 16 sets of non-fire frames to measure and determine the threshold values of our proposed method. The details of these sets are presented in Table 3.

Table 3. The details of datasets for experimental testing and performance evaluation


Apart from detecting forest fires, our proposed method focuses heavily on eliminating objects with fire-like colors, such as the sun and environmental noise. As a result, a larger number of non-fire frames and non-fire scenarios were used in this research. The correlation coefficient significantly improves our proposed method's ability to eliminate non-fire frames and fire-like objects.

3.1 The experimental testing

The results of the algorithm applied to consecutive frames, together with their correlation coefficients, are shown in Fig. 13.


Fig. 13. The correlation coefficients between two consecutive frames

The correlation coefficients of two consecutive frames with and without fire, recorded by the UAV camera in Fig. 13, equal 0.2691 and 0.9276 respectively. Based only on the four proposed conditions in RGB and YCbCr, fire-like colored objects such as those shown in Fig. 13b cannot be eliminated. To solve this limitation of the proposed conditions in the RGB and YCbCr color spaces, we applied the correlation coefficient algorithm proposed above. Nevertheless, selecting a suitable range for the correlation coefficient threshold is extremely important, because too wide a range increases both the true positive detections in fire frames and the false detections in non-fire frames. Fig. 14 presents box plots of the correlation coefficient values of the 9 sets of fire frames, and Table 4 lists the ranges of correlation coefficients for these sets. It can be easily seen that the correlation coefficient values of the 9 sets of fire frames in the experimental testing lie mainly in the range of 0.073 to 0.82.


Fig. 14. The correlation coefficient of 9 sets with fire frames

Table 4. The correlation coefficient values of 9 sets with fire frames


Fig. 15 illustrates box plots of the correlation coefficients of the 16 sets of non-fire frames, and the correlation coefficient values of these sets are shown in Table 5. The correlation coefficient values of the non-fire frame sets depend on the specific objects. For example, set 7 of the non-fire frames contains vegetation with fire-like colors, and the correlation coefficient values of this set are NaN (Not-a-Number) or in the range of 0.0038 ~ 0.0694 (median 0.0191). Set 11 of the non-fire frames contains the sun, with correlation coefficient values of NaN or 0.574 ~ 0.978 (median 0.873).


Fig. 15. The correlation coefficient of 16 sets with non-fire frames

Table 5. The correlation coefficient values of 16 sets with non-fire frames


The correlation coefficients calculated between consecutive fire frames with a time difference of 0.2 s are also shown in Table 4. From the testing results in Table 4, the correlation coefficients between two consecutive fire frames of video recorded from a moving UAV are in the range from 0.0695 to 0.82. In the absence of a fire, the correlation coefficient values are NaN (Not-a-Number), smaller than 0.0695, or greater than 0.82. Hence, based on the experimental results, we propose Rt and Rs values of 0.0695 and 0.82 respectively (with a time difference of 0.2 s between consecutive frames) when the camera is mounted on a moving UAV.
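The final decision rule of Fig. 12 can then be written compactly. The sketch below uses the Rt and Rs values proposed above; the function name `confirm_fire` is ours:

```python
import math

R_T, R_S = 0.0695, 0.82  # proposed thresholds for 0.2 s frame spacing on a moving UAV

def confirm_fire(r):
    """Final confirmation: a fire is declared only when the correlation
    coefficient of two consecutive candidate masks lies inside [R_T, R_S];
    NaN or out-of-range values are treated as non-fire."""
    return (not math.isnan(r)) and (R_T <= r <= R_S)
```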

Fig. 16 and Fig. 17 show the test results of applying conditions 1, 2, 3, and 4 with and without the correlation coefficient (condition 5). The results indicate that our proposed method achieved nearly 100% forest fire detection on the 9 sets of fire frames. In particular, applying the correlation coefficient significantly improves the performance compared with not using it, as seen for non-fire sets 6, 7, and 11 in Fig. 17.


Fig. 16. The achieved accuracy with and without using the correlation coefficient of 9 sets with fire frames


Fig. 17. The achieved accuracy with and without using the correlation coefficient of 16 sets with non-fire frames

3.2 Performance Evaluation

To evaluate the performance of our proposed algorithm, we measure four parameters: recall, precision, F-score, and accuracy.

\(\text { recall }=\frac{T P}{T P+F N}\)       (10)

\(\text { precision }=\frac{T P}{T P+F P} \)       (11)

\(F=2 * \frac{\text { precision } * \text { recall }}{\text { precision }+\text { recall }}\)       (12)

\(\text { Accuracy }=\frac{T P+T N}{T P+T N+F P+F N}\)       (13)

where F refers to the F-score used to evaluate the performance of our proposed method; the True Positive (TP) count covers fire images that the proposed algorithm detects as fire; the False Positive (FP) count covers non-fire images that the proposed algorithm identifies as fire; the True Negative (TN) count covers fire-like images that the proposed algorithm correctly verifies as non-fire; and the False Negative (FN) count covers fire images that the proposed algorithm fails to detect [21].
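For reference, a direct sketch of equations (10)-(13) computed from the confusion counts (the function name is ours):

```python
def evaluate(tp, tn, fp, fn):
    """Recall, precision, F-score and accuracy per equations (10)-(13)."""
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f_score = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return recall, precision, f_score, accuracy
```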

Table 6 compares our proposed method with the correlation coefficient, our proposed method without the correlation coefficient, and Chen et al.'s method [7]. It can be clearly seen that the proposed method using the correlation coefficient achieved the highest performance, with an F-score of 95.87% and an accuracy of 97.89%.

Table 6. Evaluations of the three tested fire detection methods


The accuracy achieved by our proposed algorithm and the datasets are publicly available at: https://github.com/Pham-Van-Thanh/Forest-Fire-Detection

4. Conclusion

In this paper, we combine fire detection techniques in the RGB and YCbCr color spaces based on fire properties. We also propose using the correlation coefficient between consecutive frames to eliminate objects with fire-like colors and to diminish the vehicle shaking effect, ensuring the accuracy of the proposed method. Building on these results, in future research we will propose an algorithm for edge separation in order to calculate flight trajectories that support firefighting along the edge of the fire zone. We also plan to develop a model to simulate the effect of wind speed, flight speed, and altitude on the dropping of extinguishing agents and on the flight trajectory, using ANSYS/CFD, for automatic firefighting by helicopter or airplane instead of manual firefighting by firefighters.

Acknowledgement

This work was supported by the Domestic Master/PhD Scholarship Programme of Vingroup Innovation Foundation.

References

  1. The Statistics of Forest Fire Hotspots in Vietnam, Accessed on: May. 26, 2020. [Online]. Available: http://firewatchvn.kiemlam.org.vn/thong-ke
  2. The forest fire occurring in Ha Tinh province, Vietnam, Accessed on: May. 26, 2020. [Online]. Available: https://thanhtra.com.vn/xa-hoi/moi-truong/Vu-chay-rung-kinh-hoang-tai-Ha-Tinh-gay-thiet-haiden-65-ha-rung-150735.html
  3. The forest fire occurring in Ha Tinh province, Vietnam, Accessed on: May. 26, 2020. [Online]. Available: https://thanhnien.vn/thoi-su/chay-rung-du-doi-tai-ha-tinh-1098288.html
  4. The forest fire occurring in Ha Tinh province, Vietnam, Accessed on: May. 26, 2020. [Online]. Available: https://vnexpress.net/vi-sao-dam-chay-rung-o-ha-tinh-nhieu-lan-bung-phat-lai-3946484.html
  5. The forest fire occurring in Ha Tinh province, Vietnam, Accessed on: May. 26, 2020. [Online]. Available: https://www.24h.com.vn/tin-tuc-trong-ngay/nghi-pham-dot-rac-gay-chay-rung-kinh-hoang-o-hatinh-doi-dien-muc-an-nao-c46a1063005.html
  6. T. H. Chen, C. L. Kao and S. M. Chang, "An Intelligent Real-Time Fire-Detection Method Based on Video Processing," in Proc. of IEEE 37th Annual 2003 International Carnahan Conference on Security Technology, Taipei, Taiwan, pp. 104-111, 2003.
  7. T. Chen, P. Wu and Y. Chiou, "An early Fire-detection Method Based on Image Processing," in Proc. of the IEEE International Conference on Image Processing (ICIP), Singapore, pp. 1707-1710, 2004.
  8. B. U. Toreyin, Y. Dedeoglu, U. Gudukbay and A. E. Cetin, "Computer vision-based method for real-time fire and flame detection," Pattern Recognition Letters, vol. 27, no. 1, pp. 49-58, 2006.
  9. B. U. Toreyin and A. E. Cetin, "Online Detection of Fire in Video," in Proc. of 2007 IEEE Conference on Computer Vision and Pattern Recognition, Minneapolis, MN, USA, pp. 1-5, 2007.
  10. T. Celik and H. Demirel, "Fire detection in video sequences using a generic color model," Fire Safety Journal, vol. 44, no. 2, pp. 147-158, 2009. https://doi.org/10.1016/j.firesaf.2008.05.005
  11. T. X. Truong and J. M. Kim, "Fire flame detection in video sequences using multi-stage pattern recognition techniques," Engineering Applications of Artificial Intelligence, vol. 25, no. 7, pp. 1365-1372, 2012. https://doi.org/10.1016/j.engappai.2012.05.007
  12. H. Silva and I. H. Leslie, "Geometrically based metrics featuring area and shape for multidimensional wildland fire models," Journal of Fire Sciences, vol. 31, no. 1, pp. 85-96, 2013. https://doi.org/10.1177/0734904112455161
  13. A. Simeoni, Z. C. Owens, E. W. Christiansen, et al, "A preliminary study of wildland fire pattern indicator reliability following an experimental fire," Journal of Fire Sciences, vol. 35, no. 5, pp. 359-378, 2017. https://doi.org/10.1177/0734904117720674
  14. J. D. DeHaan and D. J. Icove, Kirk's Fire Investigation, 7th ed, Pearson, 2011.
  15. W. B. Horng, J-W. Peng, and C-Y. Chen, "A new image based real-time flame detection method using color analysis," in Proc. of 2005 IEEE Networking, Sensing and Control, Tucson, AZ, USA, pp. 100-105, 2005.
  16. K. Briechle and U. D. Hanebeck, "Template matching using fast normalized cross correlation," in Proc. of SPIE 4387, Optical Pattern Recognition XII, Orlando, FL, United States, pp. 95 - 102, 2001.
  17. M. Mahmoud and H. Ren, "Forest Fire Detection Using a Rule-Based Image Processing Algorithm and Temporal Variation," Mathematical Problems in Engineering, vol. 2018, 8 pages, 2018.
  18. H. Cruz, M. Eckert, J. Meneses, J.-F. Martinez, "Efficient Forest Fire Detection Index for Application in Unmanned Aerial Systems (UASs)," Sensors, vol. 16, no. 6, 2016.
  19. C. E. Premal and S. S. Vinsley, "Image processing based forest fire detection using YCbCr colour model," in Proc. of 2014 International Conference on Circuits, Power and Computing Technologies (ICCPCT-2014), Nagercoil, India, pp. 1229-1237, 2014.
  20. T. Celik, H. Ozkaramanli, H. Demirel, "Fire and smoke detection without sensors: Image processing based approach," in Proc. of 2007 15th European Signal Processing Conference, Poznan, Poland, pp. 1794-1798, 2007.
  21. V. T. Pham, Q. B. Le, D. A. Nguyen, et al, "Multi-Sensor Data Fusion in A Real-Time Support System for On-Duty Firefighters," Sensors, vol. 19, no. 21, 2019.
  22. P. Barmpoutis, P. Papaioannou, K. Dimitropoulos, N. Grammalidis, "A Review on Early Forest Fire Detection Systems Using Optical Remote Sensing," Sensors, vol. 20, no. 22, 2020.
  23. Z. Jiao, Y. Zhang, J. Xin, et al, "A Deep Learning Based Forest Fire Detection Approach Using UAV and YOLOv3," in Proc. of 2019 1st International Conference on Industrial Artificial Intelligence (IAI), Shenyang, China, pp. 1-5, 2019.
  24. F. Saeed, A. Paul, P. Karthigaikumar, et al, "Convolutional neural network based early fire detection," Multimed Tools Appl, vol. 79, no. 13-14, pp. 9083-9099, 2020. https://doi.org/10.1007/s11042-019-07785-w
  25. H. Pan, D. Badawi, X. Zhang, et al, "Additive neural network for forest fire detection," SIViP, vol. 14, no. 4, pp. 675-682, 2020. https://doi.org/10.1007/s11760-019-01600-7
  26. Database, Accessed on: Mar. 20, 2020. [Online]. Available: https://www.youtube.com/watch?v=iHNKSp_qIDo
  27. Database, Accessed on: Mar. 20, 2020. [Online]. Available: https://www.youtube.com/watch?v=S0rjwY3X7EU
  28. Database, Accessed on: Mar. 20, 2020. [Online]. Available: https://www.youtube.com/watch?v=euj8PU3PRgE
  29. Database, Accessed on: Mar. 20, 2020. [Online]. Available: https://www.youtube.com/watch?v=aOMcPAqa9JI
  30. Database, Accessed on: Mar. 20, 2020. [Online]. Available: https://www.youtube.com/watch?v=mC_TP2Syk7s
  31. Database, Accessed on: Mar. 20, 2020. [Online]. Available: https://www.tomorrowsforests.co.uk/drone-uav.html
  32. Database, Accessed on: Oct. 21, 2021. [Online]. Available: https://www.youtube.com/watch?v=7wLhfpa01LY
  33. Database, Accessed on: Oct. 21, 2021. [Online]. Available: https://www.youtube.com/watch?v=r5Glpdhipvg
  34. Database, Accessed on: Oct. 21, 2021. [Online]. Available: https://www.youtube.com/watch?v=D3BWpoJ6ijs
  35. Database, Accessed on: Oct. 21, 2021. [Online]. Available: https://www.youtube.com/watch?v=SVLE2BsMWSE
  36. Database, Accessed on: Oct. 21, 2021. [Online]. Available: https://www.youtube.com/watch?v=C0OzayuO7rg