Optimizing Feature Extraction for Multiclass Problems Based on Classification Error

  • Choi, Eui-Sun (Dept. Electrical and Computer Eng., Yonsei University) ;
  • Lee, Chul-Hee (Dept. Electrical and Computer Eng., Yonsei University)
  • Published: 2000.03.25

Abstract (Korean)

In this paper, we propose a method for optimizing feature extraction for multiclass data. The proposed method is based on classification error: it searches the feature space for a set of feature vectors that minimizes the classification error of a Gaussian maximum-likelihood (ML) classifier. Starting from an arbitrary initial feature vector, the method applies a steepest descent algorithm to update the vector in the direction in which the classification error decreases. We propose two variants, sequential search and global search. In the sequential search, each additional feature vector is chosen to minimize the classification error together with the feature vectors already obtained. In the global search, a new initial set of feature vectors is formed each time an additional feature vector is sought, without the constraint of retaining the previously obtained vectors. Experimental results show that both proposed methods outperform conventional feature extraction methods.

In this paper, we propose an optimizing feature extraction method for multiclass problems assuming normal distributions. Initially, we start with an arbitrary feature vector. Assuming that the feature vector is used for classification, we compute the classification error. Then we move the feature vector slightly in the direction in which the classification error decreases most rapidly; this can be done by taking the gradient. We propose two search methods: sequential search and global search. In the sequential search, an additional feature vector is selected so that it provides the best accuracy along with the already chosen feature vectors. In the global search, we are not constrained to use the already chosen feature vectors. Experimental results show that the proposed algorithm provides favorable performance.
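The search described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the synthetic data set, the central-difference gradient, the equal-prior 1-D Gaussian ML error computed by grid integration, and all function names here are assumptions made for the sketch; the paper itself works from the gradient of the classification error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 3-class, 2-D Gaussian data (a stand-in for the paper's data sets).
means = [np.array([0.0, 0.0]), np.array([3.0, 1.0]), np.array([1.0, 4.0])]
X = np.vstack([rng.normal(m, 1.0, size=(200, 2)) for m in means])
y = np.repeat([0, 1, 2], 200)

def gaussian_ml_error(w, X, y, grid=4000):
    """Error of a 1-D Gaussian ML classifier on the projected feature z = X @ w,
    computed by integrating the fitted class densities over the ML decision
    regions (equal class priors assumed)."""
    z = X @ w
    classes = np.unique(y)
    mu = np.array([z[y == c].mean() for c in classes])
    sd = np.array([z[y == c].std() for c in classes])
    t = np.linspace(z.min() - 3 * sd.max(), z.max() + 3 * sd.max(), grid)
    dens = np.exp(-0.5 * ((t[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    assigned = np.argmax(dens, axis=1)  # ML decision at each grid point
    dt = t[1] - t[0]
    # P(correct) = mass of each class's density inside its own decision region.
    p_correct = np.mean([dens[assigned == c, c].sum() * dt
                         for c in range(len(classes))])
    return 1.0 - p_correct

def steepest_descent(w0, step=0.1, iters=100, eps=1e-2):
    """Update the feature vector in the direction of steepest error decrease.
    The gradient is approximated by central differences here; the paper's
    analytic gradient is not reproduced."""
    w = w0 / np.linalg.norm(w0)
    for _ in range(iters):
        g = np.array([(gaussian_ml_error(w + eps * e, X, y)
                       - gaussian_ml_error(w - eps * e, X, y)) / (2 * eps)
                      for e in np.eye(w.size)])
        w = w - step * g
        w /= np.linalg.norm(w)  # the error is scale-invariant, so keep ||w|| = 1
    return w

w0 = rng.normal(size=2)     # arbitrary initial feature vector
w = steepest_descent(w0)    # feature vector after descent
```

The sequential search of the paper would repeat this for each additional feature vector while keeping the previously found vectors fixed; the global search would instead restart with a fresh initial set of the larger dimensionality.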

Keywords
