Estimation of Fractional Vegetation Cover in Sand Dunes Using Multi-spectral Images from Fixed-wing UAV

  • Choi, Seok Keun (Dept. of Civil Engineering, Chungbuk National University) ;
  • Lee, Soung Ki (Dept. of Civil Engineering, Chungbuk National University) ;
  • Jung, Sung Heuk (Dept. of Civil Engineering, Chungbuk National University) ;
  • Choi, Jae Wan (Dept. of Civil Engineering, Chungbuk National University) ;
  • Choi, Do Yoen (Research institute, Terrapix) ;
  • Chun, Sook Jin (Dadohaehaesang Marine National Park Western Office)
  • Received : 2016.07.29
  • Accepted : 2016.08.24
  • Published : 2016.08.31

Abstract

Since the use of a UAV (Unmanned Aerial Vehicle) is convenient for acquiring data over broad or inaccessible regions, UAVs are nowadays used to establish spatial information in various fields, such as the environment, ecosystems, forests, and military applications. In this study, a process for estimating FVC (Fractional Vegetation Cover) from multi-spectral UAV imagery is suggested to overcome the limitations of conventional methods; that is, we propose generating the FVC map from multi-spectral imaging. First, two classification results were obtained with RF (Random Forest), one using RGB images and one using NDVI (Normalized Difference Vegetation Index) together with RGB images. The result maps were then reclassified into vegetation and non-vegetation. Finally, an RF-based FVC map was generated by pixel calculation, and a GI (Gutman and Ignatov) model-based FVC map was generated indirectly using fixed parameters. The method that adds NDVI shows relatively higher accuracy than that using only RGB, and in particular, the GI model shows a lower RMSE (Root Mean Square Error), of 0.182, than RF. In this regard, the availability of the GI model, which uses only NDVI values, is higher than that of RF, whose accuracy varies according to the classification results. Our results showed that the GI model ensures the quality of the FVC if the NDVI is maintained at a uniform level. This can easily be achieved by using a UAV, which can provide vegetation data to improve the estimation of FVC.

1. Introduction

FVC (Fractional Vegetation Cover) is generally defined as the ratio of the vertical projection area of vegetation to the target area. Existing vegetation indices, such as NDVI (Normalized Difference Vegetation Index), SAVI (Soil-Adjusted Vegetation Index), and AFRI (Aerosol Free Vegetation Index), are useful indicators of vegetation condition and activity, but they do not directly show the vegetation cover ratio of a specific area.
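NDVI, used throughout this study, is computed per pixel from near-infrared and red reflectance as (NIR − Red) / (NIR + Red). A minimal sketch in Python; the reflectance values below are illustrative, not data from this study:

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)   # eps guards against division by zero

# Toy 2x2 reflectance arrays: vegetation reflects strongly in NIR,
# so the first pixels yield high NDVI and the last one stays near zero.
nir = np.array([[0.50, 0.45], [0.30, 0.10]])
red = np.array([[0.08, 0.10], [0.20, 0.09]])
print(np.round(ndvi(nir, red), 2))
```

NDVI is dimensionless and bounded in [−1, 1], which is what makes the thresholding and rescaling used later in the paper straightforward.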

FVC is an important parameter for measuring the size of the vegetated portion of the land surface; additionally, it is an important index for research on the aerosphere, hydrosphere, and biosphere. Moreover, FVC is extensively applied in fields such as agriculture, forestry, resource and environmental management, disaster risk monitoring, and drought monitoring (Gitelson et al., 2002; Purevdorj et al., 1998).

Accurate estimation of the FVC is required for research on land-surface processes, climate change, and numerical weather prediction (Zeng et al., 2000). Representatively, the FVC was applied in soil erosion models (RUSLE (Revised Universal Soil Loss Equation), SEMMA (Soil Erosion Model for Mountain Areas), and GeoWEPP (Geo-spatial interface for WEPP)) and atmospheric models (NOAH Land-Surface Model and NAM (North American Mesoscale) Eta model) (Choi et al., 2014; Gutman and Ignatov, 1998). In the past, FVC was estimated through ground-based methods. Conventional methods (ground-based methods) are usually time-consuming and impractical for large areas. In addition, these methods are unsuitable for real-time monitoring (Anderson and Gaston, 2013). Remote sensing information offers a unique way to obtain large-scale mapping of FVC. In particular, several studies indicate that space-borne sensors can be used to obtain spatially extensive information from landscapes on a global scale (Hu et al., 2007; Lamonaca et al., 2008; Pellikka et al., 2009; Propastin and Panferov, 2013).

Notwithstanding improvements in the spatial and temporal resolution of satellite-based data, high costs per scene and unfavorable revisit times remain significant obstacles for many remote sensing applications. In particular, the sand dunes in Korea are too small for conventional methods to be applied. Limitations associated with traditional aerial imagery platforms can be overcome by using a UAV (Unmanned Aerial Vehicle); in recent years, UAVs have developed into a new aerial platform for image acquisition with tremendous potential for mapping vegetation cover in detailed vegetation studies with environmental and agricultural objectives (Bryson et al., 2010; García-Ruiz et al., 2013; Herwitz et al., 2004; Laliberte et al., 2006; Torres-Sánchez et al., 2013).

UAVs provide high-spatial-resolution images and allow users to observe small, individual sites at low altitudes, which conventional methods do not (Xiang and Tian, 2011). Moreover, UAVs offer greater flexibility in scheduling image acquisition, regardless of cloud cover and with little preparation time. Other advantages of UAVs are their lower cost and greater flexibility of configuration compared with piloted aircraft, which allows the utilization and testing of low-cost sensors, such as conventional digital cameras. Some researchers have estimated green vegetation cover using on-ground imagery taken with commercial cameras (Guijarro et al., 2011; Meyer and Neto, 2008; Romeo et al., 2013). In addition to their low cost, another advantage of conventional digital cameras is their high resolution, which is needed when working in narrow rows of vegetation, such as weeds. While recent studies have tested the use of UAV-derived RGB images to estimate FVC, such images still provide limited information on the properties of the vegetation (Jannoura et al., 2015).

Image analysis techniques for quantifying vegetation cover are generally based on the use of VIs (Vegetation Indices), which are the product of arithmetic operations performed with spectral information from the radiation reflected by the vegetation at different wavelengths (Xiao and Moody, 2005). Information derived from VIs is usually less sensitive to illumination and other factors affecting reflectance (Gitelson et al., 2002). The underlying mechanisms of VIs are well understood, and they emphasize some features of vegetation cover and facilitate obtaining relevant information from digital imagery (Delegido et al., 2013).

Studies conducted by Bendig et al. (2015) and Guillen-Climent et al. (2012) used the NIR band in a majority of the adopted indices, because the near-infrared portion of the electromagnetic spectrum provides strong information on both the physiological status and the geometric properties of vegetation. Furthermore, recent studies have tested the applicability of UAV-derived multi-spectral imagery for producing biomass and vegetation maps (Bendig et al., 2015; Feng et al., 2015).

In this study, we tested whether multi-spectral imaging can be used to obtain estimates of FVC from a UAV. For this purpose, we used a small fixed-wing UAV equipped with a multi-spectral camera, which was tested over a sand dune. FVC values derived from UAV imagery were calibrated against ground estimates obtained from well-established FVC techniques.

 

2. Material and Methods

2.1 Study site

The study was carried out in April 2016 over the Pung Seong sand dune on U-I island, located between Jindo and Daeheuksando, Shinan County, Jeonnam Province, Korea (34°36′N, 125°49′E; Fig. 1). The target area shows high relief, with elevations ranging from a minimum of 28 m to a maximum of 48 m over a straight-line distance of 40 m. RGB images have been obtained on a regular basis since 2012, as the shape of the sand dune changed drastically over a short period. The representative land cover of this area is as follows: trees, sands, dry grasses, grasses, rocks, and artificial structures (Fig. 2).

Fig. 1. (a) The Pung Seong sand dune on U-I island, (b) Panorama of the Pung Seong sand dune

Fig. 2. Representative land cover: (a) sands, (b) trees, (c) dry grasses, (d) grasses, (e) artificial structures, (f) rocks

2.2 Multi-spectral image collection and pre-processing

Aerial images were collected with a SenseFly eBee, a commercially available fixed-wing UAV, equipped with a multiSPEC 4C multi-spectral camera (four 1.2 MP sensors) and a Canon IXUS 127 HS (Fig. 3). The calculated survey parameters for UAV photogrammetry are described in Table 1. The ground pixel resolution was set to 15 cm and 5 cm, corresponding to altitudes of about 150 m and 142 m, respectively. The 15 cm resolution of the multi-spectral images was selected so that the NDVI images could be used together with the DSM (Digital Surface Model) generated from the RGB images; different altitudes were unavoidable because the spatial resolution of the RGB images could not be matched under the conditions of the target area. The longitudinal and lateral image overlap was set to 90% and 80%, respectively.

Fig. 3. eBee

Table 1. Calculated survey parameters for UAV photogrammetry

The eBee flight plan was managed through SenseFly's eMotion software, and the flight was monitored through a laptop. The software requires initial parameters, such as the area of interest, the desired ground pixel resolution, and the side and longitudinal image overlap; it then automatically calculates the number of strips needed to cover the area of interest and the flight height.

The take-off/landing area was located about 0.5 km from the studied stands, in a clearing close to the sand dune. Absolute positioning was based on a direct geo-referencing approach using ground control points (Fig. 4). The coordinates of the ground control points were acquired using Network RTK (Real-Time Kinematic), and 8 fixed points, like those on the road shown in Fig. 4(b), were used to correct the results at each acquisition time. Radiometric calibration was applied using the two reflectance panels (Fig. 5).

Fig. 4. Ground control point

Fig. 5. Calibration panels

Images were then processed using the Pix4D software. The software processing is based on a conventional photogrammetric approach: an automated image matching algorithm identifies tie points in the images, which are used to retrieve the orientation parameters of the aerial triangulation (bundle-block adjustment). Color balancing between images with histogram matching was applied during ortho-mosaicking.

 

3. Fractional Vegetation Cover Processing and Analysis

3.1 Fractional vegetation cover processing

FVC is usually calculated at the maximum spatial resolution of the image. However, for application to a numerical model, the vegetation cover must generally be provided in units that match the model. In this study, the unit was set to 1 m for application to typical FVC-based models such as RUSLE, SEMMA, and GeoWEPP. Although a smaller calculation unit would yield more detailed model inputs, the other data used in the models (e.g., rainfall amounts, soil maps) could not support such resolution, and a smaller unit area would have required excessive processing time and cost; the unit area was therefore fixed at 1 m. Nevertheless, it remains ideal to acquire images at the highest possible spatial resolution, because accuracy drops when the flying altitude is increased and the effective calculation unit becomes larger.

Remote sensing retrieval methods can be divided into three groups: vegetation-index-based models, such as the NDVI-based GI (Gutman and Ignatov) model; regression models; and decision tree methods (Gessner et al., 2009; Gutman et al., 1998; Rogan et al., 2002; Xiao and Moody, 2005). Gutman and Ignatov (1998) developed a sub-pixel dichotomy model based on the model of Price (1993). They categorized the sub-pixel types of mosaic pixels as dense vegetation, non-dense vegetation, and mixed-density vegetation. The following general formula relates NDVI to the fractional vegetation cover (fc):

NDVI = fc × NDVIveg + (1 − fc) × NDVIsoil

which can be rewritten as:

fc = (NDVI − NDVIsoil) / (NDVIveg − NDVIsoil)

where NDVIveg: the NDVI of a pure green vegetation pixel of the site, and NDVIsoil: the NDVI of the bare soil of the site.

We used regression to estimate these NDVI values by assuming that green vegetation and bare soil have 100% and 0% vegetation cover, respectively. Gutman et al. (1998) estimated NDVIveg and NDVIsoil as 0.52 and 0.04, respectively, based on global AVHRR data. Choi et al. (2014) set the parameter values of the GI model based on Landsat 8 OLI, with green vegetation at 0.86 and bare soil at 0.14.
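The GI model thus reduces to a linear rescaling of NDVI between the soil and vegetation endpoints. A minimal sketch, using the Landsat 8 OLI parameter values of Choi et al. (2014); the sample NDVI values are illustrative only:

```python
import numpy as np

def gi_fvc(ndvi, ndvi_veg=0.86, ndvi_soil=0.14):
    """GI (Gutman and Ignatov) sub-pixel model:
    fc = (NDVI - NDVIsoil) / (NDVIveg - NDVIsoil).
    Defaults follow Choi et al. (2014) for Landsat 8 OLI."""
    fc = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return np.clip(fc, 0.0, 1.0)   # fc is a fraction, so bound it to [0, 1]

ndvi = np.array([0.10, 0.14, 0.50, 0.86, 0.90])
print(gi_fvc(ndvi))   # clipped fractions: 0, 0, 0.5, 1, 1
```

The clipping step handles pixels whose NDVI falls outside the [NDVIsoil, NDVIveg] interval, which is how values below the soil endpoint map to fc = 0 and values above the vegetation endpoint map to fc = 1.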

As computer technology has developed, machine learning methods have been increasingly applied to the estimation of FVC. These methods mainly extract the vegetation cover from pixels using the results of land cover classification. The general pixel-based formulation for approximating fc is:

fc = pixels(vg) / pixels(vg + nonvg)

where pixels(vg): the number of vegetation pixels, and pixels(vg + nonvg): the total number of pixels in the area.
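Combined with the 1 m calculation unit described above, this pixel-count formula amounts to block-averaging a binary vegetation mask. A minimal sketch, assuming 5 cm pixels so that one 1 m cell spans 20 × 20 pixels (the mask values are illustrative):

```python
import numpy as np

def fvc_from_mask(mask, block=20):
    """Aggregate a binary vegetation mask (1 = vegetation) to a coarser
    FVC grid: mean of each block x block cell = pixels(vg) / pixels(vg + nonvg)."""
    h, w = mask.shape
    h, w = h - h % block, w - w % block            # trim to a multiple of the block size
    m = mask[:h, :w].reshape(h // block, block, w // block, block)
    return m.mean(axis=(1, 3))                     # per-cell vegetation fraction

mask = np.zeros((40, 40), dtype=int)
mask[:20, :10] = 1          # vegetation covers half of the top-left 1 m cell
print(fvc_from_mask(mask))  # 2x2 FVC grid
```

Because the mean of a 0/1 mask over a cell is exactly the fraction of vegetation pixels, no separate counting step is needed.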

Pal (2005) concluded that the RF (Random Forest) classifier requires fewer user-defined parameters than SVM (Support Vector Machine), that these parameters are easier to define, and that it offers comparable classification accuracy and training time. Feng et al. (2015) conducted high-accuracy urban vegetation mapping using an RF classifier on RGB images. In this study, we examined the usefulness of multi-spectral imaging for FVC estimation; thus, we compared the conventional RGB-based imaging method with the multi-spectral imaging method.
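The per-pixel RF classification can be sketched as follows, here with scikit-learn and synthetic [R, G, B, NDVI] feature vectors standing in for training samples; the feature ranges and two-class setup are hypothetical illustrations, not the study's data or class list:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200

# Synthetic training pixels: vegetation is dark in red/blue, bright in green,
# with high NDVI; bare soil/sand is brighter in RGB with NDVI near zero.
veg = np.column_stack([rng.uniform(0.0, 0.3, n), rng.uniform(0.2, 0.6, n),
                       rng.uniform(0.0, 0.3, n), rng.uniform(0.5, 0.9, n)])
soil = np.column_stack([rng.uniform(0.4, 0.8, n), rng.uniform(0.3, 0.7, n),
                        rng.uniform(0.2, 0.6, n), rng.uniform(-0.1, 0.2, n)])
X = np.vstack([veg, soil])
y = np.array([1] * n + [0] * n)          # 1 = vegetation, 0 = non-vegetation

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([[0.1, 0.4, 0.1, 0.7]]))   # a clearly vegetation-like pixel
```

In the study itself the RF was trained on the six land cover classes present in the area and only afterwards reclassified to vegetation/non-vegetation; the two-class setup here simply keeps the sketch short.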

The steps involved in estimating FVC from UAV images are summarized in Fig. 6 and outlined below. First, both RGB and multi-spectral images were cropped to fit each targeted stand extent. Each cropped image was then processed to estimate FVC.

Fig. 6. Workflow

Then, FVC was estimated using the GI model and the RF method; the corresponding routines were coded by the authors in MATLAB. The GI model has been utilized in the NOAH Land-Surface Model and the NAM Eta model, and it shows higher accuracy than the regression model (Xiao and Moody, 2005). Recently, many studies have used RF for target prediction, and the method has shown high levels of accuracy (Larivière and Van den Poel, 2005; Lunetta and Lyon, 2004; Schwender et al., 2004).

Ground truth segmentation values were manually estimated by two individuals, and each pixel was classified as either vegetation or non-vegetation. The FVC percentage was calculated as the fraction of pixels classified as vegetation. These values were taken as ground truth despite the known limitations of the technique, such as user bias, age-related color perception differences, and other natural variation between users.

3.2 Accuracy assessment

The accuracy of each model was assessed by comparing the predicted FVC with the field-measured vegetation cover. We used cross-validation as an additional means of comparing the models; this technique provides a virtually unbiased estimator of the prediction error (Efron, 1983). A total of 23 samples were selected from the field measurements for model validation. The predicted and measured values of each sample were compared using the relative error:

relative error = Δχ / χ

where Δχ: the absolute error, and χ: the actual fc.

We also compared the models by computing the RMSE (Root Mean Square Error), as shown below:

RMSE = sqrt( (1/N) × Σᵢ (P̂ᵢ − Pᵢ)² )

where P̂ᵢ: the predicted value for sample i, Pᵢ: the field-observed value for sample i, and N: the number of observations. RMSE measures the overall accuracy over all samples.
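Both accuracy measures are straightforward to compute. A minimal sketch, with illustrative predicted and observed FVC values (not the study's 23 validation samples):

```python
import numpy as np

def rmse(pred, obs):
    """Root Mean Square Error over paired predictions and observations."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

def relative_error(pred, obs):
    """Relative error |pred - obs| / obs for a single sample."""
    return abs(pred - obs) / obs

# Illustrative paired FVC values.
pred = [0.20, 0.55, 0.80]
obs  = [0.25, 0.50, 0.70]
print(round(rmse(pred, obs), 3))
```

RMSE summarizes overall accuracy across all samples, while the relative error characterizes each sample individually.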

 

4. Results and Discussion

The primary aim of this study is to evaluate multi-spectral-imaging-based FVC estimation and to produce FVC maps, both at cm-resolution, derived from imagery acquired by a low-flying fixed-wing UAV. We compared the FVC estimation performance of case 1 (RF using RGB images), case 2 (RF using RGB and NDVI images), and case 3 (GI model).

The classification could simply have been set to two classes, vegetation and non-vegetation. However, the images were classified into the classes actually present in the study area in order to compare the conventional RGB-based method with the method that adds NDVI. The RF results derived from the UAV images are shown in Fig. 7; the classes were rocks, sands, trees, artificial structures, dry grasses, and grasses. The classified results were then reclassified into vegetation and non-vegetation for estimating FVC. Each classification showed features that reflect the underlying surface material. The classification results indicated that sandy areas and artificial structures have similar patterns and high extraction accuracy. On the other hand, case 2 tended to misclassify dry grasses as structures and rocks in areas with little or no vegetation, and case 1 tended to assign non-vegetation classes, such as structures, to shaded forest areas.

Fig. 7. Classification result of RF: (a) RGB image, (b) result by 6 classes (case 1), (c) result by 6 classes (case 2), (d) reclassification result by 2 classes (case 1), (e) reclassification result by 2 classes (case 2)

The misclassifications shown in Fig. 7 between similar classes within non-vegetated areas had little adverse effect on FVC, whereas misclassifications between vegetation and non-vegetation classes strongly degraded the FVC estimates.

The derived FVC results are shown in Fig. 8. The FVC maps from case 1, case 2, and case 3 were similar over sandy areas, especially in non-vegetated regions. The predictions of case 1 overestimated FVC for this region, and the resulting map showed noise caused by overestimated vegetation classes. Case 2 and case 3 showed a similar vegetation extent, but the FVC values of case 2 were higher than those of case 3. Finally, case 3 gave lower values than the other methods; its vegetation area was small, but it contained less noise.

Fig. 8. FVC map using RF and GI model: (a) RGB image, (b) case 1, (c) case 2, (d) case 3

In terms of the FVC results (Fig. 9), case 3 fit the ground truth well, with a low RMSE of 0.182. On the other hand, case 1 and case 2 showed lower accuracy than case 3, with RMSEs of 0.204 and 0.194, respectively. These results can be seen in more detail in Fig. 10. The FVC results of case 1 and case 2 show over- or under-estimation, but case 3 provided results that agree well with the ground truth.

Fig. 9. Comparison of methods for ground truth data

Fig. 10. Result of FVC between ground truth and the methods used: (a) Image, (b) case 1, (c) case 2, (d) case 3

In this study, we obtained accurate measurements of FVC from multi-spectral images acquired by a fixed-wing UAV. We attribute the results mainly to the high image resolution obtainable from UAV platforms; the FVC results were comparable to those obtained using ground-based methods. We also observed that the GI model, previously applied to medium-resolution imagery (Choi et al., 2014), can be applied to high-resolution imagery. In addition, the RF classification accurately detected not only vegetation but also non-vegetation classes. The results indicate that a much higher spatial resolution may be required to improve the performance of the method over sand dunes covering small areas; this can easily be achieved with UAV platforms by acquiring images at lower altitudes and/or using cameras with higher pixel resolution.

It appears from the results shown in Figs. 8 and 9 that the GI model is a robust method for FVC estimation in sand dunes; the results obtained through case 1 and case 2 have relatively low accuracy compared with case 3.

The images were taken in early spring. Case 1 showed more over-classification of grasses than the other methods, because RGB images cannot capture the vitality of vegetation. Although case 1 produced higher accuracy in some grass areas, this result should be regarded with caution: the performance of case 1 varies significantly across seasons due to the seasonal nature of vegetation.

A comparison with FVC estimates from multi-spectral images revealed that leaf inclination strongly influences the optical properties of vegetation, and thus the indirect estimation of FVC from NDVI. However, RGB imagery was able to reduce the influence of vegetation condition by taking the visible bands into account. In the current study, good estimates of FVC were obtained by adopting an NIR band, as inferred from a previous study conducted over a vegetated area; however, such a variety of methods had not previously been applied to the sand dune. Our results suggest that the optimal method for FVC is a multi-spectral imaging UAV, i.e., using the NIR band suited to FVC estimation in the sand dune. This was also confirmed by testing the RF and GI models against FVC derived from the UAV multi-spectral images. Based on the above considerations, we strongly recommend multi-spectral imaging as the preferred method for estimating FVC from a UAV. In particular, the GI model, which does not require RGB-based imaging, shows significantly higher accuracy than the RF method. This can easily be achieved by using a UAV, which can provide vegetation data to improve the estimation of FVC.

 

5. Conclusion

In this study, we tested whether multi-spectral imaging can be used to obtain estimates of FVC from a UAV. The statistical analysis showed that multi-spectral imaging is a suitable indicator of vegetation as well as base data for estimating FVC with the RF and GI models. In particular, a high-accuracy FVC map can be obtained with the GI model without the use of RGB-based imaging. In fact, we found that multi-spectral imaging showed a better ability to estimate FVC on sand dunes than RGB-based imaging. Also, most researchers have applied the GI model only in low-resolution environments because of the difficulty of determining the value of each parameter; however, the method proved highly effective on high-resolution images when using parameter values extracted from previous studies.

In sum, while the quality of FVC estimated with the RF method depends on the classification result, the GI model ensures a high quality of FVC if the NDVI is maintained at a uniform level. Thus, using a UAV with multi-spectral imaging to calculate vegetation cover is a more accurate way to estimate FVC on small sand dune research fields. It is a simple, timely alternative to cost-intensive and complex ground-based measurements.

References

  1. Anderson, K. and Gaston, K. J. (2013), Lightweight unmanned aerial vehicles will revolutionize spatial ecology, Frontiers in Ecology and the Environment, Vol. 11, No. 3, pp. 138-146. https://doi.org/10.1890/120150
  2. Bendig, J., Yu, K., Aasen, H., Bolten, A., Bennertz, S., Broscheit, J., and Bareth, G. (2015), Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley, International Journal of Applied Earth Observation and Geoinformation, Vol. 39, pp. 79-87. https://doi.org/10.1016/j.jag.2015.02.012
  3. Bryson, M., Reid, A., Ramos, F., and Sukkarieh, S. (2010), Airborne vision-based mapping and classification of large farmland environments, Journal of Field Robotics, Vol. 27, No. 5, pp. 632-655. https://doi.org/10.1002/rob.20343
  4. Choi, S., Lee, S., and Wang, B. (2014), Analysis of vegetation cover fraction on landsat OLI using NDVI, Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, Vol. 32, No. 1, pp. 9-17. (in Korean with English abstract) https://doi.org/10.7848/ksgpc.2014.32.1.9
  5. Delegido, J., Verrelst, J., Meza, C. M., Rivera, J. P., Alonso, L., and Moreno, J. (2013), A red-edge spectral index for remote sensing estimation of green LAI over agroecosystems, European Journal of Agronomy, Vol. 46, pp. 42-52. https://doi.org/10.1016/j.eja.2012.12.001
  6. Efron, B. (1983), Estimating the error rate of a prediction rule: improvement on cross-validation, Journal of the American Statistical Association, Vol. 78, No. 382, pp. 316-331. https://doi.org/10.1080/01621459.1983.10477973
  7. Feng, Q., Liu, J., and Gong, J. (2015), UAV remote sensing for urban vegetation mapping using random forest and texture analysis, Remote Sensing, Vol. 7, No. 1, pp. 1074-1094. https://doi.org/10.3390/rs70101074
  8. García-Ruiz, J. M., Nadal-Romero, E., Lana-Renault, N., and Beguería, S. (2013), Erosion in Mediterranean landscapes: changes and future challenges, Geomorphology, Vol. 198, pp. 20-36. https://doi.org/10.1016/j.geomorph.2013.05.023
  9. Gessner, U., Klein, D., Conrad, C., Schmidt, M., and Dech, S. (2009), Towards an automated estimation of vegetation cover fractions on multiple scales: Examples of Eastern and Southern Africa, In Proceedings of the 33rd International Symposium on Remote Sensing of Environment, International Center for Remote Sensing of Environment, 4-8 May, Stresa, Italy, pp. 1–4
  10. Gitelson, A. A., Kaufman, Y.J., Stark, R., and Rundquist, D. (2002), Novel algorithms for remote estimation of vegetation fraction, Remote sensing of Environment, Vol. 80, No. 1, pp. 76–87. https://doi.org/10.1016/S0034-4257(01)00289-9
  11. Guijarro, M., Pajares, G., Riomoros, I., Herrera, P. J., Burgos-Artizzu, X. P., and Ribeiro, A. (2011), Automatic segmentation of relevant textures in agricultural images, Computers and Electronics in Agriculture, Vol. 75, No. 1, pp. 75-83. https://doi.org/10.1016/j.compag.2010.09.013
  12. Guillen-Climent, M. L., Zarco-Tejada, P. J., Berni, J. A., North, P. R. J., and Villalobos, F. J. (2012), Mapping radiation interception in row-structured orchards using 3D simulation and high-resolution Airborne imagery acquired from a UAV, Precision Agriculture, Vol. 13, No. 4, pp. 473-500. https://doi.org/10.1007/s11119-012-9263-8
  13. Gutman, G. and Ignatov, A. (1998), The derivation of the green vegetation fraction from NOAA/AVHRR data for use in numerical weather prediction models, International Journal of Remote Sensing, Vol. 19, No. 8, pp. 1533-1543. https://doi.org/10.1080/014311698215333
  14. Herwitz, S. R., Johnson, L. F., Dunagan, S. E., Higgins, R. G., Sullivan, D. V., Zheng, J., and Slye, R. E. (2004), Imaging from an unmanned aerial vehicle: agricultural surveillance and decision support, Computers and Electronics in Agriculture, Vol. 44, No. 1, pp. 49-61. https://doi.org/10.1016/j.compag.2004.02.006
  15. Hu, Z. Q., He, F. Q., Yin, J. Z., Xia, L. U., Tang, S. L., Wang, L. L., and Li, X. J. (2007), Estimation of fractional vegetation cover based on digital camera survey data and a remote sensing model, Journal of China University of Mining and Technology, Vol. 17, No. 1, pp. 116-120. https://doi.org/10.1016/S1006-1266(07)60025-X
  16. Jannoura, R., Brinkmann, K., Uteau, D., Bruns, C., and Joergensen, R. G. (2015), Monitoring of crop biomass using true colour aerial photographs taken from a remote controlled hexacopter, Biosystems Engineering, Vol. 129, pp. 341-351. https://doi.org/10.1016/j.biosystemseng.2014.11.007
  17. Lamonaca, A., Corona, P., and Barbati, A. (2008), Exploring forest structural complexity by multi-scale segmentation of VHR imagery, Remote Sensing of Environment, Vol. 112, No. 6, pp. 2839-2849. https://doi.org/10.1016/j.rse.2008.01.017
  18. Laliberte, A. S., Rango, A., and Fredrickson, E. L. (2006), Separating green and senescent vegetation in very high resolution photography using an intensity-hue-saturation transformation and object based classification, In Proceedings of the American Society for Photogrammetry and Remote Sensing Annual Conference, ASPRS, 1-5 May, Reno, Nevada, pp. 1-5.
  19. Larivière, B. and Van den Poel, D. (2005), Predicting customer retention and profitability by using random forests and regression forests techniques, Expert Systems with Applications, Vol. 29, No. 2, pp. 472-484. https://doi.org/10.1016/j.eswa.2005.04.043
  20. Lunetta, R. S. and Lyon, J. G. (2004), Remote Sensing and GIS Accuracy Assessment, CRC press, Boca Raton, Florida.
  21. Meyer, G. E. and Neto, J. C. (2008), Verification of color vegetation indices for automated crop imaging applications, Computers and Electronics in Agriculture, Vol. 63, No. 2, pp. 282-293. https://doi.org/10.1016/j.compag.2008.03.009
  22. Pal, M. (2005), Random forest classifier for remote sensing classification, International Journal of Remote Sensing, Vol. 26, No. 1, pp. 217-222. https://doi.org/10.1080/01431160412331269698
  23. Pellikka, P. K., Lötjönen, M., Siljander, M., and Lens, L. (2009), Airborne remote sensing of spatiotemporal change (1955–2004) in indigenous and exotic forest cover in the Taita Hills, Kenya, International Journal of Applied Earth Observation and Geoinformation, Vol. 11, No. 4, pp. 221-232. https://doi.org/10.1016/j.jag.2009.02.002
  24. Purevdorj, T. S., Tateishi, R., Ishiyama, T., and Honda, Y. (1998), Relationships between percent vegetation cover and vegetation indices, International Journal of Remote Sensing, Vol. 19, No. 18, pp. 3519–3535. https://doi.org/10.1080/014311698213795
  25. Price, J. C. (1993), Estimating leaf area index from satellite data, IEEE Transactions on Geoscience and Remote Sensing, Vol. 31, No. 3, pp. 727-734. https://doi.org/10.1109/36.225538
  26. Propastin, P. and Panferov, O. (2013), Retrieval of remotely sensed LAI using Landsat ETM+ data and ground measurements of solar radiation and vegetation structure: Implication of leaf inclination angle, International Journal of Applied Earth Observation and Geoinformation, Vol. 25, pp. 38-46. https://doi.org/10.1016/j.jag.2013.02.006
  27. Romeo, J., Pajares, G., Montalvo, M., Guerrero, J. M., Guijarro, M., and De La Cruz, J. M. (2013), A new expert system for greenness identification in agricultural images, Expert Systems with Applications, Vol. 40, No. 6, pp. 2275-2286. https://doi.org/10.1016/j.eswa.2012.10.033
  28. Rogan, J., Franklin, J., and Roberts, D. A. (2002), A comparison of methods for monitoring multitemporal vegetation change using Thematic Mapper imagery, Remote Sensing of Environment, Vol. 80, No. 1, pp. 143-156. https://doi.org/10.1016/S0034-4257(01)00296-6
  29. Schwender, H., Zucknick, M., Ickstadt, K., and Bolt, H. M. (2004), A pilot study on the application of statistical classification procedures to molecular epidemiological data, Toxicology Letters, Vol. 151, No. 1, pp. 291-299. https://doi.org/10.1016/j.toxlet.2004.02.021
  30. Torres-Sánchez, J., López-Granados, F., De Castro, A. I., and Peña-Barragán, J. M. (2013), Configuration and specifications of an unmanned aerial vehicle (UAV) for early site specific weed management, PLoS One, Vol. 8, No. 3, e58210. https://doi.org/10.1371/journal.pone.0058210
  31. Xiang, H. and Tian, L. (2011), Method for automatic georeferencing aerial remote sensing (RS) images from an unmanned aerial vehicle (UAV) platform, Biosystems Engineering, Vol. 108, No. 2, pp. 104-113. https://doi.org/10.1016/j.biosystemseng.2010.11.003
  32. Xiao, J. and Moody, A. (2005), A comparison of methods for estimating fractional green vegetation cover within a desert-to-upland transition zone in central New Mexico, Remote Sensing of Environment, Vol. 98, No. 2, pp. 237-250. https://doi.org/10.1016/j.rse.2005.07.011
  33. Zeng, X., Dickinson, R. E., Walker, A., Shaikh, M., DeFries, R. S., and Qi, J. (2000), Derivation and evaluation of global 1-km fractional vegetation cover data for land modeling, Journal of Applied Meteorology, Vol. 39, No. 6, pp. 826-839. https://doi.org/10.1175/1520-0450(2000)039<0826:DAEOGK>2.0.CO;2
