http://dx.doi.org/10.5391/JKIIS.2010.20.1.101

Design of Lazy Classifier based on Fuzzy k-Nearest Neighbors and Reconstruction Error  

Roh, Seok-Beom (Division of Electronics and Control Engineering, Wonkwang University)
Ahn, Tae-Chon (Division of Electronics and Control Engineering, Wonkwang University)
Publication Information
Journal of the Korean Institute of Intelligent Systems / v.20, no.1, 2010, pp. 101-108
Abstract
In this paper, we propose a new lazy classifier that combines a fuzzy k-nearest neighbors approach with feature selection based on reconstruction error, the performance index of locally linear reconstruction. When a new query point is given, the fuzzy k-nearest neighbors approach defines the local area in which the local classifier operates and assigns weighting values to the data patterns within that area. After the local area is defined and the weights are assigned, feature selection is carried out to reduce the dimensionality of the feature space. Once features have been selected in terms of the reconstruction error, the local classifier, a polynomial, is estimated by weighted least squares. In addition, the experimental study includes a comparative analysis against several well-known methods such as standard neural networks, support vector machines, linear discriminant analysis, and C4.5 decision trees.
Keywords
Lazy Learning; Local Learning; Lazy Classifier; Locally Linear Reconstruction; Feature Selection; Fuzzy k-Nearest Neighbors;
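The procedure summarized in the abstract (fuzzy k-NN localization, reconstruction-error-driven feature selection, and a weighted least-squares local polynomial) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the fuzzifier `m`, the greedy forward selection, the degree-1 (linear) polynomial, and the 0.5 decision threshold are all assumptions made for the sketch.

```python
import numpy as np

def fuzzy_knn_weights(query, X, k=5, m=2.0):
    """Pick the k nearest patterns and assign fuzzy membership weights
    via inverse-distance fuzzification with fuzzifier m > 1."""
    d = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / np.maximum(d[idx], 1e-12) ** (2.0 / (m - 1.0))  # guard d == 0
    return idx, w / w.sum()

def reconstruction_error(query, neighbors, feats):
    """Error of locally linear reconstruction of the query from its
    neighbors, restricted to a candidate feature subset."""
    A = neighbors[:, feats].T                      # (|feats|, k)
    b = query[feats]
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)   # reconstruction weights
    return np.linalg.norm(A @ coef - b)

def lazy_classify(query, X, y, k=7, n_feats=2):
    """Lazy classification of a single query point (binary labels 0/1)."""
    idx, w = fuzzy_knn_weights(query, X, k=k)
    Xl, yl = X[idx], y[idx]
    # Greedy forward selection of features by reconstruction error
    # (an assumed search strategy for this sketch).
    feats = []
    while len(feats) < n_feats:
        best = min((f for f in range(X.shape[1]) if f not in feats),
                   key=lambda f: reconstruction_error(query, Xl, feats + [f]))
        feats.append(best)
    # Weighted least-squares fit of a linear (degree-1) local classifier.
    Phi = np.hstack([np.ones((k, 1)), Xl[:, feats]])
    W = np.diag(w)
    coef = np.linalg.pinv(Phi.T @ W @ Phi) @ Phi.T @ W @ yl
    score = np.array([1.0, *query[feats]]) @ coef
    return int(score > 0.5)
```

Because all fitting happens per query, nothing is trained in advance; this is what makes the classifier "lazy" in the sense used by the paper.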
Citations & Related Records
  • Reference
1 L. Holmstrom, P. Koistinen, J. Laaksonen, E. Oja, "Neural and statistical classifiers - taxonomy and two case studies," IEEE Trans. Neural Networks 8 (1) (1997) 5-17.
2 W. Lam, C. K. Keung, C. X. Ling, "Learning good prototypes for classification using filtering and abstraction of instances," Pattern Recognition 35 (2002) 1491-1506.
3 S. T. Roweis, L. K. Saul, "Nonlinear dimensionality reduction by locally linear embedding," Science 290 (5500) (2000) 2323-2326.
4 P. S. Kang, S. Z. Cho, "Locally linear reconstruction for instance-based learning," Pattern Recognition 41 (2008) 3507-3518.
5 M. R. Gupta, E. K. Garcia, E. Chin, "Adaptive local linear regression with application to printer color management," IEEE Trans. Image Processing 17 (6) (2008) 936-945.
6 H. Leung, Y. Huang, C. Cao, "Locally weighted regression for desulphurization intelligent decision system modeling," Simulation Modelling Practice and Theory 12 (2004) 413-423.
7 C. G. Atkeson et al., "Locally weighted learning," Artificial Intelligence Review (Special Issue on Lazy Learning) 11 (1-5) (1997) 11-73.
8 A. L. Blum, P. Langley, "Selection of relevant features and examples in machine learning," Artif. Intell. (1997) 245-271.
9 H. Liu, H. Motoda, Feature Selection for Knowledge Discovery and Data Mining, Kluwer Academic Publishers, Dordrecht, 1998.
10 I. Guyon, A. Elisseeff, "An introduction to variable and feature selection," J. Mach. Learn. Res. (2003) 1157-1182.
11 B. J. Park, W. Pedrycz, S. K. Oh, "Polynomial-based radial basis function neural networks (P-RBFNNs) and their application to pattern classification," Applied Intelligence (available online: http://www.springerlink.com/content/m452347279jj2943/).
12 J. Schürmann, Pattern Classification: A Unified View of Statistical and Neural Approaches, Wiley Interscience, New York, 1996.
13 A. K. Jain, R. P. W. Duin, J. Mao, "Statistical pattern recognition: a review," IEEE Trans. Pattern Anal. Mach. Intell. 22 (1) (2000) 4-37.
14 C. L. Liu, H. Sako, "Class-specific feature polynomial classifier for pattern classification and its application to handwritten numeral recognition," Pattern Recognition 39 (2006) 669-681.
15 U. Kressel, J. Schürmann, "Pattern classification techniques based on function approximation," in Handbook of Character Recognition and Document Image Analysis, World Scientific, Singapore, 1997, pp. 49-78.