http://dx.doi.org/10.3745/KIPSTD.2005.12D.6.829

Splitting Rules using Intervals for Object Classification in Image Databases  

Cho, June-Suh (School of Business, Hankuk University of Foreign Studies)
Choi, Joon-Soo (School of Computer Science, College of Natural Sciences, Kookmin University)
Abstract
Assigning a splitting criterion that classifies objects correctly is the main issue in all decision trees. This paper describes new splitting rules for classification that find an optimal split point. Unlike current splitting rules, which are obtained by searching over all threshold values, the proposed splitting rules are based on the probabilities of pre-assigned intervals. The methodology allows the user to control the accuracy of the tree by adjusting the number of intervals. In addition, we applied the proposed splitting rules to a set of image data retrieved by parameterized feature extraction in order to recognize image objects.
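As an illustration of the idea described in the abstract, the following Python sketch evaluates candidate split points only at the boundaries of a fixed number of pre-assigned intervals, rather than at every observed threshold value. The equal-width binning, the Gini impurity measure, and the function name interval_split are assumptions made for illustration only; they are not taken from the paper itself.

import numpy as np

def interval_split(x, y, n_intervals=10):
    # Candidate thresholds are the interior boundaries of n_intervals
    # equal-width bins over the observed range of x, so at most
    # n_intervals - 1 thresholds are examined instead of every value.
    edges = np.linspace(x.min(), x.max(), n_intervals + 1)[1:-1]
    best_t, best_impurity = None, np.inf
    for t in edges:
        left, right = y[x <= t], y[x > t]
        if left.size == 0 or right.size == 0:
            continue
        # Weighted Gini impurity of the two children; y must hold
        # non-negative integer class labels for np.bincount to apply.
        impurity = sum(
            part.size / y.size * (1.0 - np.sum((np.bincount(part) / part.size) ** 2))
            for part in (left, right)
        )
        if impurity < best_impurity:
            best_t, best_impurity = t, impurity
    return best_t, best_impurity

# Example: with 4 intervals over x in [0.1, 0.9], only the boundaries
# 0.3, 0.5, and 0.7 are tried; 0.5 separates the two classes perfectly.
x = np.array([0.1, 0.2, 0.3, 0.6, 0.7, 0.9])
y = np.array([0, 0, 0, 1, 1, 1])
threshold, gini = interval_split(x, y, n_intervals=4)

Increasing n_intervals gives finer candidate thresholds at higher search cost, which mirrors the abstract's claim that the user can tune the accuracy of the tree by adjusting the number of intervals.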
Keywords
Classification; Probability; Interval; Object; Image;