http://dx.doi.org/10.5391/JKIIS.2003.13.3.328

An Optimizing Hyperrectangle Method for Nearest Hyperrectangle Learning

Lee, Hyeong-Il (Kimpo College)
Publication Information
Journal of the Korean Institute of Intelligent Systems / v.13, no.3, 2003, pp. 328-333
Abstract
NGE (Nested Generalized Exemplars), proposed by Salzberg, improves on the storage requirements and classification rate of Memory-Based Reasoning. It constructs hyperrectangles during training and uses them to perform classification. Although it performs well in many domains, NGE has two major drawbacks: a hyperrectangle is extended so as to cover erroneous data, and the way the feature weight vector is maintained is problematic. We propose the OH (Optimizing Hyperrectangle) algorithm, which uses feature weight vectors and an ED (Exemplar Densimeter) to optimize the resulting hyperrectangles. The proposed algorithm, like EACH, requires only about 40% of the memory space needed by a k-NN classifier, while showing classification performance superior to EACH. By reducing the number of stored patterns, it also achieves excellent classification results compared with both k-NN and EACH.
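To make the mechanism concrete, below is a minimal sketch of the distance rule used by NGE-style nearest-hyperrectangle classifiers: the distance from a query point to a hyperrectangle is zero along any feature whose value falls inside the rectangle's interval, and otherwise is the feature-weighted gap to the nearest face. This illustrates the general technique only; the function and field names are illustrative, and it does not implement the paper's OH algorithm or its ED (Exemplar Densimeter).

```python
import numpy as np

def rect_distance(x, lower, upper, weights):
    """Weighted distance from point x to the axis-aligned
    hyperrectangle [lower, upper]: zero along features where x
    lies inside the interval, otherwise the gap to the nearest face."""
    below = np.maximum(lower - x, 0.0)   # shortfall under the lower face
    above = np.maximum(x - upper, 0.0)   # overshoot past the upper face
    gap = below + above                  # at most one of the two is nonzero
    return float(np.sqrt(np.sum((weights * gap) ** 2)))

def classify(x, rects):
    """Assign x the label of the nearest hyperrectangle."""
    nearest = min(rects, key=lambda r: rect_distance(
        x, r["lower"], r["upper"], r["weights"]))
    return nearest["label"]

# Toy example: two axis-aligned rectangles with unit feature weights.
rects = [
    {"lower": np.array([0.0, 0.0]), "upper": np.array([1.0, 1.0]),
     "weights": np.ones(2), "label": "A"},
    {"lower": np.array([2.0, 2.0]), "upper": np.array([3.0, 3.0]),
     "weights": np.ones(2), "label": "B"},
]
print(classify(np.array([0.5, 0.5]), rects))  # inside the first rectangle -> A
```

A point inside a rectangle is at distance zero from it, which is what lets a single stored hyperrectangle stand in for many training exemplars and yields the memory savings the abstract reports.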
Keywords
Machine Learning; Agent Systems; Information Retrieval
  • Reference
1 T. Kohonen, "Learning Vector Quantization for Pattern Recognition", Technical Report TKK-F-A601, Helsinki University of Technology, Department of Technical Physics, Espoo, Finland, 1986.
2 D. Aha, "A Study of Instance-Based Algorithms for Supervised Learning Tasks: Mathematical, Empirical, and Psychological Evaluations", Ph. D. Thesis, Information and Computer Science Dept., University of California, Irvine, 1990.
3 S. Cost and S. Salzberg, "A Weighted Nearest Neighbor Algorithm for Learning with Symbolic Features", Machine Learning, Vol. 10, No. 1, pp. 57-78, 1993.
4 최영희, 장수인, 유재수, 오재철, "An Efficient Method for Generating Large Itemsets for Mining Quantitative Association Rules", Journal of the Korea Information Processing Society, Vol. 6, No. 10, pp. 2597-2607, 1999.
5 이형일, 정태선, 윤충화, 강경식, "A New Memory-Based Reasoning Algorithm Using Recursive Partition Averaging", Journal of the Korea Information Processing Society, Vol. 6, No. 7, pp. 1849-1857, 1999.
6 D. Wettschereck, "Weighted k-NN versus Majority k-NN: A Recommendation", German National Research Center for Information Technology, 1995.
7 S. Salzberg, "A Nearest Hyperrectangle Learning Method", Machine Learning, Vol. 6, pp. 251-276, 1991.
8 이형일, 정태선, 윤충화, "A New Weighting Method in the EACH System", Proceedings of the '98 Spring Conference of the Korea Information Science Society (B), Vol. 25, No. 1, pp. 288-290, 1998.
9 D. Wettschereck and T. Dietterich, "An Experimental Comparison of the Nearest-Neighbor and Nearest-Hyperrectangle Algorithms", Machine Learning, Vol. 19, No. 1, pp. 1-25, 1995.
10 D. Wettschereck, et al., "A Review and Empirical Evaluation of Feature Weighting Methods for a Class of Lazy Learning Algorithms", Artificial Intelligence Review Journal, 1996.
11 D. Wettschereck and T. Dietterich, "Locally Adaptive Nearest Neighbor Algorithms", Advances in Neural Information Processing Systems 6, pp. 184-191, Morgan Kaufmann, San Mateo, CA. 1994.
12 D. Aha, "Instance-Based Learning Algorithms", Machine Learning, Vol. 6, No. 1, pp. 37-66, 1991.
13 S. Salzberg, "On Comparing Classifiers: Pitfalls to Avoid and a Recommended Approach", Data Mining and Knowledge Discovery, Vol. 1, pp. 1-11, 1997.
14 T. Dietterich, "A Study of Distance-Based Machine Learning Algorithms", Ph. D. Thesis, Computer Science Dept., Oregon State University, 1995.