Design of Lazy Classifier based on Fuzzy k-Nearest Neighbors and Reconstruction Error

  • Seok-Beom Roh (School of Electronics and Control Engineering, Wonkwang University)
  • Tae-Chon Ahn (School of Electronics and Control Engineering, Wonkwang University)
  • Received : 2009.10.22
  • Accepted : 2010.01.15
  • Published : 2010.02.25

Abstract

In this paper, we propose a new lazy classifier that combines a fuzzy k-nearest neighbors approach with feature selection based on reconstruction error, the performance index of locally linear reconstruction. When a new query point is given, the fuzzy k-nearest neighbors approach defines the local area in which the local classifier is valid and assigns weighting values to the data patterns that fall within this area. Once the local area and the weighting values are determined, feature selection is carried out to reduce the dimensionality of the feature space. When features have been selected in terms of reconstruction error, the local classifier, a polynomial model, is built using weighted least squares estimation. The experimental study also includes a comparative analysis against several widely used methods such as standard neural networks, support vector machines, linear discriminant analysis, and C4.5 decision trees.
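
Only the abstract is available here, so the following is a minimal sketch of the lazy classification step it describes, assuming a fuzzy k-NN membership of the form 1/d^(2/(m-1)) and a first-order polynomial fitted by weighted least squares; the neighborhood size k, the fuzzifier m, and the one-hot label matrix Y are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def fuzzy_knn_weights(X, query, k=15, m=2.0, eps=1e-12):
    """Select the k nearest neighbors of `query` and assign fuzzy membership weights."""
    d = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(d)[:k]                        # local area around the query
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + eps)  # fuzzy k-NN style weighting (assumed form)
    return idx, w / w.sum()                        # normalized weighting values

def local_polynomial_classifier(X, Y, query, idx, w):
    """Fit a first-order polynomial on the local area by weighted least squares
    and return the predicted class index for the query point."""
    Phi = np.hstack([np.ones((len(idx), 1)), X[idx]])   # design matrix [1, x]
    W = np.diag(w)
    # Weighted least squares estimate of the polynomial coefficients
    A = np.linalg.pinv(Phi.T @ W @ Phi) @ Phi.T @ W @ Y[idx]
    scores = np.hstack([1.0, query]) @ A                 # one score per class
    return int(np.argmax(scores))
```

With a training matrix X_train (n x d), a one-hot label matrix Y_onehot (n x c), and a query x_query, a prediction would look like local_polynomial_classifier(X_train, Y_onehot, x_query, *fuzzy_knn_weights(X_train, x_query)).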

In this paper, we propose a lazy classifier that uses fuzzy k-NN together with feature selection based on reconstruction error, the evaluation index of locally linear reconstruction. When a new input is given, fuzzy k-NN defines the local region in which the local classifier is valid and assigns weighting values to the data patterns contained in that region. After the local region and the weighting values have been determined, feature selection is performed to reduce the dimensionality of the feature space. Once several features with good performance in terms of reconstruction error have been selected, a polynomial classifier is determined by weighted least squares estimation. The experimental results compare the proposed method with existing classifiers such as standard neural networks, support vector machines, linear discriminant analysis, and C4.5 trees.
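
The reconstruction error mentioned above is the criterion of locally linear reconstruction (references 7 and 8), in which a query is rebuilt as a weighted combination of its neighbors with weights constrained to sum to one. The sketch below scores candidate feature subsets by that residual; the greedy forward search is an illustrative assumption, since the paper's actual search or ranking strategy is not given here.

```python
import numpy as np

def reconstruction_error(X_local, query, reg=1e-3):
    """Residual of reconstructing `query` from the local patterns X_local (k x d)
    with reconstruction weights constrained to sum to one."""
    Z = X_local - query                         # neighbors shifted to the query
    G = Z @ Z.T                                 # local Gram matrix (k x k)
    G = G + reg * np.trace(G) * np.eye(len(Z))  # regularization for numerical stability
    w = np.linalg.solve(G, np.ones(len(Z)))
    w = w / w.sum()                             # enforce the sum-to-one constraint
    return np.linalg.norm(query - w @ X_local)  # locally linear reconstruction error

def greedy_feature_selection(X_local, query, n_features):
    """Pick features one at a time so that the reconstruction error of the query
    within the local area stays as small as possible (illustrative search only)."""
    selected, remaining = [], list(range(X_local.shape[1]))
    while len(selected) < n_features:
        best = min(remaining, key=lambda f: reconstruction_error(
            X_local[:, selected + [f]], query[selected + [f]]))
        selected.append(best)
        remaining.remove(best)
    return selected
```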

References

  1. A. K. Jain, R. P. W. Duin, J. Mao, Statistical pattern recognition: a review, IEEE Trans. Pattern Anal. Mach. Intell. 22 (1) (2000) 4-37. https://doi.org/10.1109/34.824819
  2. C. L. Liu, H. Sako, Class-specific feature polynomial classifier for pattern classification and its application to handwritten numeral recognition, Pattern Recognition 39 (2006) 669-681. https://doi.org/10.1016/j.patcog.2005.04.021
  3. U. Kressel, J. Schürmann, Pattern classification techniques based on function approximation, in: Handbook of Character Recognition and Document Image Analysis, World Scientific, Singapore, 1997, pp. 49-78.
  4. J. Schürmann, Pattern Classification: A Unified View of Statistical and Neural Approaches, Wiley Interscience, New York, 1996.
  5. L. Holmstrom, P. Koistinen, J. Laaksonen, E. Oja, Neural and statistical classifiers-taxonomy and two case studies, IEEE Trans. Neural Networks 8 (1) (1997) 5-17. https://doi.org/10.1109/72.554187
  6. W. Lam, C. K. Keung, C. X. Ling, Learning good prototypes for classification using filtering and abstraction of instances, Pattern Recognition 35 (2002) 1491-1506 https://doi.org/10.1016/S0031-3203(01)00131-5
  7. S.T. Roweis, L.K. Saul, Nonlinear dimensionality reduction by locally linear embedding, Science 290 (5500) (2000) 2323-2326. https://doi.org/10.1126/science.290.5500.2323
  8. P. S. Kang, S. Z. Cho, Locally linear reconstruction for instance-based learning, Pattern Recognition 41 (2008) 3507-3518. https://doi.org/10.1016/j.patcog.2008.04.009
  9. M. R. Gupta, E. K. Garcia, E. Chin, Adaptive local linear regression with application to printer color management, IEEE Trans. Image Processing 17 (6) (2008) 936-945. https://doi.org/10.1109/TIP.2008.922429
  10. H. Leung, Y. Huang, C. Cao, Locally weighted regression for desulphurization intelligent decision system modeling, Simulation Modelling Practice and Theory 12 (2004) 413-423.
  11. C. G. Atkeson et al., Locally weighted learning, Artificial Intelligence Review (Special Issue on Lazy Learning) 11 (1-5) (1997) 11-73.
  12. A. L. Blum, P. Langley, Selection of relevant features and examples in machine learning, Artif. Intell. 97 (1997) 245-271.
  13. H. Liu, H. Motoda, Feature Selection for Knowledge Discovery and Data Mining, Kluwer Academic Publishers, Dordrecht, 1998.
  14. I. Guyon, A. Elisseeff, An introduction to variable and feature selection, J. Mach. Learn. Res. 3 (2003) 1157-1182.
  15. B. J. Park, W. Pedrycz, S. K. Oh, Polynomial-based radial basis function neural networks (P-RBFNNs) and their application to pattern classification, Applied Intelligence (online first). https://doi.org/10.1007/s10489-008-0133-z
