http://dx.doi.org/10.9708/jksci/2012.17.11.011

Parameter Estimation in Debris Flow Deposition Model Using Pseudo Sample Neural Network  

Heo, Gyeongyong (Dept. of Electronic Engineering, Dong-Eui University)
Lee, Chang-Woo (Division of Forest Management, Korea Forest Research Institute)
Park, Choong-Shik (Dept. of Smart Information Technology, Youngdong University)
Abstract
The debris flow deposition model predicts the areas affected by a debris flow, and a random walk model (RWM) was used to build it. Although the model proved effective in predicting affected areas, it has several free parameters that are determined experimentally. Several well-known parameter estimation methods exist; however, they cannot be applied directly to the debris flow problem because of the small size of the training data. In this paper, a modified neural network, called a pseudo sample neural network (PSNN), is proposed to overcome the sample-size problem. In the training phase, PSNN uses pseudo samples generated from the existing samples. The pseudo samples smooth the solution space and reduce the probability of falling into a local optimum. As a result, PSNN can estimate parameters more robustly than traditional neural networks do. These claims are demonstrated through experiments on artificial and real data sets.
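The abstract's core idea, training on pseudo samples generated from a small set of existing samples, can be sketched as follows. The paper's exact generation scheme is not given in the abstract, so this is an illustrative sketch only: each pseudo sample is assumed to be an existing sample perturbed by Gaussian noise scaled to each feature's spread, with the target value copied unchanged. The function name and parameters are hypothetical.

```python
import numpy as np

def make_pseudo_samples(X, y, n_per_sample=10, noise_scale=0.05, rng=None):
    """Augment a small training set with pseudo samples.

    Each pseudo sample is an existing sample plus Gaussian noise;
    its target is copied unchanged. (Illustrative sketch only --
    not the paper's exact generation scheme.)
    """
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    # Scale noise to each feature's spread so features are perturbed
    # comparably; the epsilon guards against constant features.
    spread = X.std(axis=0) + 1e-12
    X_rep = np.repeat(X, n_per_sample, axis=0)
    y_rep = np.repeat(y, n_per_sample, axis=0)
    noise = rng.normal(0.0, noise_scale, size=X_rep.shape) * spread
    return np.vstack([X, X_rep + noise]), np.concatenate([y, y_rep])

# Usage: 5 original samples become 5 + 5*10 = 55 training samples.
X = np.array([[0.1, 1.0], [0.2, 0.9], [0.4, 0.7], [0.7, 0.3], [0.9, 0.1]])
y = np.array([0.0, 0.1, 0.3, 0.7, 0.9])
Xa, ya = make_pseudo_samples(X, y, n_per_sample=10, noise_scale=0.05, rng=0)
print(Xa.shape, ya.shape)  # (55, 2) (55,)
```

Training a network on the augmented set `(Xa, ya)` instead of `(X, y)` is what smooths the error surface: nearby perturbed inputs with the same target penalize sharp, sample-specific minima.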
Keywords
Parameter Estimation; Pseudo Sample; Neural Network; Debris Flow Deposition Model
  • Reference
1 L. Grady, "Random Walks for Image Segmentation," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28, No. 11, pp. 1768-1783, Nov. 2006.
2 Chang-Woo Lee, Choongshik Woo, and Ho-Joong Youn, "Analysis of Debris Flow Hazard Zone by the Optimal Parameters Extraction of Random Walk Model - Case on Debris Flow Area of Bonghwa County in Gyeongbuk Province," Journal of Korean Forest Society, Vol. 100, No. 4, pp. 664-671, Apr. 2011.
3 R.P.W. Duin, "Small sample size generalization," Proceedings of the 9th Scandinavian Conference on Image Analysis, pp. 957-964, Oct. 1995.
4 S. Haykin, "Neural Networks: A Comprehensive Foundation," 2nd ed., Prentice Hall, 1998.
5 C.M. Bishop, "Pattern Recognition and Machine Learning," Springer, 2007.
6 R. Polikar, L. Udpa, S.S. Udpa, and V. Honavar, "Learn++: An Incremental Learning Algorithm for Supervised Neural Networks," IEEE Transactions on Systems, Man, and Cybernetics - Part C: Applications and Reviews, Vol. 31, No. 4, pp. 497-508, Aug. 2001.
7 D. Foley, "Considerations of sample and feature size," IEEE Transactions on Information Theory, Vol. 18, No. 5, pp. 618-628, Oct. 1972.
8 S. Uchimura, Y. Hamamoto, and S. Tomita, "Effects of the sample size in artificial neural network classifier design," Proceedings of the IEEE International Conference on Neural Networks, pp. 2126-2129, Dec. 1995.
9 T.G. Niel, T.R. McVicar, and B. Datt, "On the relationship between training sample size and data dimensionality: Monte Carlo analysis of broadband multi-temporal classification," Remote Sensing of Environment, Vol. 98, No. 4, pp. 468-480, Oct. 2005.
10 R. Durrett, "Probability: Theory and Examples," 4th ed., Cambridge University Press, 2010.