• Title/Summary/Keyword: Nearest Prototype classifier


Design of Nearest Prototype Classifier by using Differential Evolutionary Algorithm (차분진화 알고리즘을 이용한 Nearest Prototype Classifier 설계)

  • Roh, Seok-Beom; Ahn, Tae-Chon
    • Journal of the Korean Institute of Intelligent Systems, v.21 no.4, pp.487-492, 2011
  • In this paper, we propose a new design methodology to improve the classification performance of the Nearest Prototype Classifier, one of the simplest classification algorithms. The differential evolutionary algorithm is used to optimize the position vectors of the prototypes, and the optimized positions improve classification performance. We also propose a new method for determining the class labels of the prototypes obtained by the differential evolutionary algorithm. In addition, the experiments include a comparative analysis against several commonly used methods.
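A minimal sketch of the idea in this abstract, assuming SciPy's `differential_evolution` as the optimizer and a majority-vote rule for labeling prototypes (the paper proposes its own labeling method); the dataset, prototype count, and DE settings here are illustrative choices, not the authors' configuration.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
n_prototypes, n_features = 6, X.shape[1]          # prototype count is an assumed setting

def classification_error(flat):
    """DE fitness: training misclassification rate of the nearest-prototype rule."""
    protos = flat.reshape(n_prototypes, n_features)
    nearest = np.argmin(np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2), axis=1)
    # Label each prototype by majority vote of the training points assigned to it
    # (an assumption of this sketch, not the labeling rule proposed in the paper).
    labels = np.zeros(n_prototypes, dtype=int)
    for p in range(n_prototypes):
        members = y[nearest == p]
        labels[p] = np.bincount(members).argmax() if members.size else 0
    return np.mean(labels[nearest] != y)

# One (min, max) bound per prototype coordinate, taken from the data range.
bounds = [(X[:, j].min(), X[:, j].max()) for j in range(n_features)] * n_prototypes
result = differential_evolution(classification_error, bounds,
                                popsize=10, maxiter=30, seed=0, polish=False)
print("optimized training error:", classification_error(result.x))
```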

Nearest-neighbor Rule based Prototype Selection Method and Performance Evaluation using Bias-Variance Analysis (최근접 이웃 규칙 기반 프로토타입 선택과 편의-분산을 이용한 성능 평가)

  • Shim, Se-Yong; Hwang, Doo-Sung
    • Journal of the Institute of Electronics and Information Engineers, v.52 no.10, pp.73-81, 2015
  • The paper proposes a prototype selection method and evaluates the generalization performance of prototype-based classification learning against standard algorithms. The proposed prototype classifier defines multidimensional spheres with variable radii within each class area and generates a small training set. The nearest-neighbor classifier then uses this reduced training set to predict the class of test data. By decomposing the mean expected error into bias and variance, we compare the generalization errors of the k-nearest-neighbor classifier, the Bayesian classifier, prototype selection with a fixed radius, and the proposed prototype selection method. In experiments, the bias-variance trends of the proposed prototype classifier are similar to those of the nearest-neighbor classifier trained on all data, while the prototype selection rate stays under 27.0% on average.
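As a rough illustration of the bias-variance evaluation this abstract refers to, the sketch below estimates bias and variance of a 1-NN classifier under the 0/1-loss decomposition (main prediction = modal prediction over resampled training sets); the classifier, bootstrap resampling scheme, and dataset are stand-ins rather than the authors' protocol.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
n_rounds = 50
preds = np.empty((n_rounds, len(y_te)), dtype=int)

for r in range(n_rounds):
    # Bootstrap-resample the training set to simulate varying training data.
    idx = rng.integers(0, len(y_tr), size=len(y_tr))
    clf = KNeighborsClassifier(n_neighbors=1).fit(X_tr[idx], y_tr[idx])
    preds[r] = clf.predict(X_te)

# Main prediction: the most frequent predicted class per test point.
main = np.array([np.bincount(col).argmax() for col in preds.T])
bias = np.mean(main != y_te)                  # systematic error of the main prediction
variance = np.mean(preds != main[None, :])    # spread of predictions around the main prediction
print(f"bias={bias:.3f}  variance={variance:.3f}")
```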

Prototype based Classification by Generating Multidimensional Spheres per Class Area (클래스 영역의 다차원 구 생성에 의한 프로토타입 기반 분류)

  • Shim, Seyong; Hwang, Doosung
    • Journal of the Korea Society of Computer and Information, v.20 no.2, pp.21-28, 2015
  • In this paper, we propose prototype-based classification learning using the nearest-neighbor rule. The nearest-neighbor rule is applied to segment the class area of the training data into spheres that contain only points of the same class. Prototypes are the centers of these spheres, and each radius is set to the midpoint between the distance to the farthest same-class point and the distance to the nearest point of another class. We then transform prototype selection into a set-cover problem in order to determine the smallest set of prototypes that covers all the training data. The proposed selection method is a greedy algorithm applied to the training data of each class; its complexity is low and it lends itself to parallel implementation. The prototype-based classifier takes the selected prototypes and predicts the class of test data by the nearest-neighbor rule. In experiments, the generalization performance of our prototype classifier is superior to that of the nearest-neighbor classifier, the Bayes classifier, and another prototype classifier.
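The sphere construction and greedy set cover described in this abstract can be sketched as follows; the exact radius rule (midpoint between the farthest covered same-class point and the nearest other-class point), the per-class greedy loop, and the dataset are this sketch's assumptions about the details, not a verified reimplementation of the paper.

```python
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)   # pairwise distances

centers, proto_labels = [], []
for c in np.unique(y):
    same = np.where(y == c)[0]
    other = np.where(y != c)[0]

    # Candidate sphere around each same-class point: radius is the midpoint between
    # the farthest same-class point closer than the nearest enemy and that enemy.
    cand_radius, covers = np.empty(len(same)), []
    for k, i in enumerate(same):
        d_enemy = D[i, other].min()                     # nearest other-class point
        friends_in = same[D[i, same] < d_enemy]         # same-class points inside
        d_friend = D[i, friends_in].max() if friends_in.size else 0.0
        cand_radius[k] = 0.5 * (d_friend + d_enemy)
        covers.append(set(same[D[i, same] <= cand_radius[k]]))

    # Greedy set cover: keep adding the sphere that covers the most uncovered points.
    uncovered = set(same)
    while uncovered:
        best = max(range(len(same)), key=lambda k: len(covers[k] & uncovered))
        uncovered -= covers[best]
        centers.append(X[same[best]])
        proto_labels.append(c)

centers, proto_labels = np.array(centers), np.array(proto_labels)
print("prototypes kept:", len(centers), "of", len(X))

# Nearest-prototype prediction on the training data itself (illustration only).
pred = proto_labels[np.argmin(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2), axis=1)]
print("training accuracy:", np.mean(pred == y))
```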

Prototype-based Classifier with Feature Selection and Its Design with Particle Swarm Optimization: Analysis and Comparative Studies

  • Park, Byoung-Jun; Oh, Sung-Kwun
    • Journal of Electrical Engineering and Technology, v.7 no.2, pp.245-254, 2012
  • In this study, we introduce a prototype-based classifier with feature selection that builds on the biologically inspired optimization technique of Particle Swarm Optimization (PSO). The design comprises two main phases. In the first phase, PSO selects P% of patterns to be treated as prototypes of c classes. During the second phase, the PSO is instrumental in the formation of a core set of features that constitutes a collection of the most meaningful and highly discriminative coordinates of the original feature space. The proposed scheme of feature selection is developed in the wrapper mode, with performance evaluated with the aid of the nearest prototype classifier. The study offers a complete algorithmic framework and demonstrates the effectiveness (quality of solution) and efficiency (computing cost) of the approach when applied to a collection of selected data sets. We also include a comparative study involving genetic algorithms (GAs). Numerical experiments show that a suitable selection of prototypes and a substantial reduction of the feature space can be accomplished, and the classifier formed in this manner exhibits low classification error. In addition, the advantage of the PSO is quantified in detail through a number of experiments on machine learning datasets.
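A compact sketch of the wrapper-mode search this abstract outlines: a plain global-best PSO jointly scores features and candidate prototypes, and the fitness is the hold-out error of the resulting nearest-prototype classifier. The swarm parameters, the thresholding of particle positions into selections, and the dataset are assumptions for illustration, not the configuration used in the paper.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=1)
n_feat, n_pat = X_tr.shape[1], X_tr.shape[0]
proto_share = 0.05                               # P%: keep ~5% of patterns as prototypes
n_protos = max(2, int(proto_share * n_pat))
dim = n_feat + n_pat                             # one score per feature + one per pattern

def fitness(position):
    """Wrapper fitness: validation error of the nearest-prototype classifier."""
    feat_mask = position[:n_feat] > 0.5                      # selected features
    if not feat_mask.any():
        return 1.0
    proto_idx = np.argsort(position[n_feat:])[-n_protos:]    # top-scored patterns
    P, pl = X_tr[proto_idx][:, feat_mask], y_tr[proto_idx]
    d = np.linalg.norm(X_va[:, feat_mask][:, None, :] - P[None, :, :], axis=2)
    return np.mean(pl[np.argmin(d, axis=1)] != y_va)

# Plain global-best PSO with standard inertia/cognitive/social weights.
rng = np.random.default_rng(0)
n_particles, n_iters, w, c1, c2 = 20, 40, 0.7, 1.5, 1.5
pos = rng.uniform(0, 1, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.uniform(size=(2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best validation error:", pbest_f.min())
print("features kept:", int((gbest[:n_feat] > 0.5).sum()), "of", n_feat)
```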

Face Recognition using Eigenface (고유얼굴에 의한 얼굴인식)

  • 박중조; 김경민
    • Journal of the Institute of Convergence Signal Processing, v.2 no.2, pp.1-6, 2001
  • The eigenface method is useful in face recognition due to its insensitivity to large variations in facial expression and facial details. However, its low recognition rate calls for further research. In this paper, we present an efficient method for improving the recognition rate of face recognition using eigenface features. For this, we perform a comparative study of three classifiers: i) a single-prototype (SP) classifier, ii) a nearest-neighbor (NN) classifier, and iii) a standard feedforward neural network (FNN) classifier. By evaluating and analyzing the performance of these three classifiers, we show that the distribution of eigenface features of face images is not compact and that the choice of classifier and of training samples is important for obtaining a higher recognition rate. Our experiments with the ORL face database show that the 1-NN classifier outperforms the SP and FNN classifiers. We achieved a recognition rate of 91.0% by selecting training samples properly and using the 1-NN classifier.
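A short sketch of the eigenface pipeline compared in this abstract: PCA ("eigenface") features on the ORL/Olivetti faces, followed by a single-prototype (class-mean) classifier next to a 1-NN classifier. The train/test split, number of components, and the use of scikit-learn's `NearestCentroid` as the SP stand-in are assumptions; the FNN variant is omitted.

```python
import numpy as np
from sklearn.datasets import fetch_olivetti_faces   # ORL faces; downloads once
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier, NearestCentroid

faces = fetch_olivetti_faces(shuffle=True, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, test_size=0.3, stratify=faces.target, random_state=0)

# Eigenface features: project onto the leading principal components.
pca = PCA(n_components=50, whiten=True, random_state=0).fit(X_tr)
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

# Single-prototype classifier: one class-mean prototype per subject.
sp = NearestCentroid().fit(Z_tr, y_tr)
# 1-NN classifier over all projected training faces.
nn = KNeighborsClassifier(n_neighbors=1).fit(Z_tr, y_tr)

print("single-prototype accuracy:", sp.score(Z_te, y_te))
print("1-NN accuracy:           ", nn.score(Z_te, y_te))
```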
