• Title/Summary/Keyword: a pool of classifiers

Search Results: 6

Construction of Multiple Classifier Systems based on a Classifiers Pool (인식기 풀 기반의 다수 인식기 시스템 구축방법)

  • Kang, Hee-Joong
• Journal of KIISE:Software and Applications, v.29 no.8, pp.595-603, 2002
  • Only a few studies have been conducted on how to select multiple classifiers from a pool of available classifiers so as to achieve good classification performance. Thus, the classifier selection problem, namely which classifiers to select and how many, still remains an important research issue. In this paper, provided that the number of selected classifiers is constrained in advance, a variety of selection criteria are proposed and applied to the construction of multiple classifier systems, and these selection criteria are then evaluated by the performance of the constructed multiple classifier systems. All possible sets of classifiers are examined by the selection criteria, and some of these sets are selected as candidate multiple classifier systems. The candidate multiple classifier systems were evaluated in experiments recognizing unconstrained handwritten numerals obtained from both Concordia University and the UCI machine learning repository. Among the selection criteria, the candidates produced by the information-theoretic selection criteria based on conditional entropy showed more promising results than those produced by the other selection criteria.
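A conditional-entropy criterion of the kind the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes each classifier's outputs on a validation set are available as label sequences, estimates H(Y | C_1..C_k) empirically, and exhaustively scores every size-k subset (the function names are hypothetical).

```python
from itertools import combinations
from collections import Counter
import math

def entropy(samples):
    # Shannon entropy (in bits) of a list of hashable observations
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def conditional_entropy(y_true, outputs):
    # H(Y | C_1..C_k) = H(Y, C_1..C_k) - H(C_1..C_k), estimated from counts
    joint = list(zip(y_true, *outputs))
    cond = list(zip(*outputs))
    return entropy(joint) - entropy(cond)

def select_classifiers(y_true, pool, k):
    # score every size-k subset and keep the one that leaves the
    # least remaining uncertainty about the true label
    return min(combinations(range(len(pool)), k),
               key=lambda idx: conditional_entropy(y_true, [pool[i] for i in idx]))
```

For example, given a pool containing one perfect classifier, the criterion selects it, since its outputs leave zero uncertainty about the true labels.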

Selecting Classifiers using Mutual Information between Classifiers (인식기 간의 상호정보를 이용한 인식기 선택)

  • Kang, Hee-Joong
• Journal of KIISE:Computing Practices and Letters, v.14 no.3, pp.326-330, 2008
  • The study of combining multiple classifiers in the field of pattern recognition has mainly focused on how to combine multiple classifiers, but it has recently turned to the question of how to select multiple classifiers from a classifier pool. In fact, the performance of a multiple classifier system depends on the selected classifiers as well as on the combination method. Therefore, it is necessary to select a classifier set showing good performance, and an approach based on information theory has been tried for selecting such a set. In this paper, a candidate classifier set is built by selecting classifiers on the basis of the mutual information between classifiers, and the candidate set is compared in experiments with classifier sets chosen by other selection methods.
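Mutual information between a pair of classifiers can be estimated directly from their output sequences, and a low-redundancy subset can then be chosen. The sketch below is one plausible reading of the abstract, not the paper's exact method: it minimizes the summed pairwise mutual information over all size-k subsets (function names are hypothetical).

```python
from itertools import combinations
from collections import Counter
import math

def entropy(samples):
    # Shannon entropy (in bits) of a list of hashable observations
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def mutual_information(a, b):
    # I(A; B) = H(A) + H(B) - H(A, B), from two classifiers' output sequences
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))

def select_diverse(pool, k):
    # pick the size-k subset whose summed pairwise mutual information is
    # smallest, i.e. whose members share the least redundant information
    def redundancy(idx):
        return sum(mutual_information(pool[i], pool[j])
                   for i, j in combinations(idx, 2))
    return min(combinations(range(len(pool)), k), key=redundancy)
```

Two identical classifiers have maximal mutual information, so the criterion avoids selecting both of them together.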

Coarse-to-fine Classifier Ensemble Selection using Clustering and Genetic Algorithms (군집화와 유전 알고리즘을 이용한 거친-섬세한 분류기 앙상블 선택)

  • Kim, Young-Won;Oh, Il-Seok
• Journal of KIISE:Software and Applications, v.34 no.9, pp.857-868, 2007
  • A good classifier ensemble should have high complementarity among its classifiers in order to produce a high recognition rate, and it should be small in order to be efficient. This paper proposes a classifier ensemble selection algorithm with coarse-to-fine stages. For the algorithm to be successful, the original classifier pool should be sufficiently diverse. This paper produces a large classifier pool by combining several different classification algorithms and many feature subsets. The aim of the coarse selection is to reduce the size of the classifier pool with little sacrifice of recognition performance. The fine selection then finds a near-optimal ensemble using genetic algorithms. A hybrid genetic algorithm with improved searching capability is also proposed. The experiments use worldwide handwritten numeral databases. The results show that the proposed algorithm is superior to conventional ones.
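The coarse stage, grouping similar classifiers and keeping one representative per group, can be sketched with a simple greedy clustering on output disagreement. This is an assumption-laden stand-in for the paper's clustering step (the threshold value and function names are hypothetical, and the fine GA stage is omitted).

```python
def disagreement(a, b):
    # fraction of validation samples on which two classifiers differ
    return sum(x != y for x, y in zip(a, b)) / len(a)

def coarse_select(pool, threshold=0.25):
    # greedy clustering sketch: a classifier is absorbed by the first
    # existing cluster whose representative it largely agrees with;
    # otherwise it founds a new cluster; only representatives survive
    reps = []
    for i, out in enumerate(pool):
        if all(disagreement(out, pool[r]) > threshold for r in reps):
            reps.append(i)
    return reps
```

Near-duplicate classifiers collapse into one cluster, shrinking the pool handed to the fine (genetic) stage with little loss of diversity.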

Optimal Classifier Ensemble Design for Vehicle Detection Using GAVaPS (자동차 검출을 위한 GAVaPS를 이용한 최적 분류기 앙상블 설계)

  • Lee, Hee-Sung;Lee, Jae-Hung;Kim, Eun-Tai
• Journal of Institute of Control, Robotics and Systems, v.16 no.1, pp.96-100, 2010
  • This paper proposes a novel genetic design of an optimal classifier ensemble for vehicle detection using a Genetic Algorithm with Varying Population Size (GAVaPS). Recently, many classifiers have been used in classifier ensembles to deal with tremendous amounts of data. However, the problem has an exponentially large search space that grows with the size of the classifier pool. To solve this problem, we employ GAVaPS, which outperforms the simple genetic algorithm (SGA). Experiments are performed to demonstrate the efficiency of the proposed method.
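GAVaPS differs from a simple GA in that there is no explicit selection step: each individual receives a fitness-dependent lifetime, and the population size varies as individuals are born and die of old age. The following is a minimal sketch of that mechanism on a generic bit-string fitness, not the paper's vehicle-detection setup; the lifetime formula, rates, and function names are all assumptions.

```python
import random

def gavaps(fitness, length, steps=200, init_size=20, repro_ratio=0.3, seed=0):
    # Minimal GAVaPS sketch: reproduction uses randomly chosen parents
    # (no selection pressure); pressure comes instead from lifetimes,
    # so fitter individuals survive more generations.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(init_size)]
    ages = [0] * init_size

    def lifetime(f, fmin, fmax):
        # proportional lifetime allocation: fitter individuals live longer
        if fmax == fmin:
            return 5
        return 1 + round(9 * (f - fmin) / (fmax - fmin))

    best, best_fit = None, float("-inf")
    for _ in range(steps):
        if len(pop) < 2:
            break
        fits = [fitness(ind) for ind in pop]
        fmin, fmax = min(fits), max(fits)
        for ind, f in zip(pop, fits):
            if f > best_fit:
                best, best_fit = ind[:], f
        # reproduction: random parents, one-point crossover, single-bit mutation
        children = []
        for _ in range(max(1, int(len(pop) * repro_ratio))):
            p1, p2 = rng.sample(pop, 2)
            cut = rng.randrange(1, length)
            child = p1[:cut] + p2[cut:]
            child[rng.randrange(length)] ^= 1
            children.append(child)
        # aging and death: individuals older than their lifetime are removed
        survivors, surv_ages = [], []
        for ind, f, a in zip(pop, fits, ages):
            if a + 1 <= lifetime(f, fmin, fmax):
                survivors.append(ind)
                surv_ages.append(a + 1)
        pop = survivors + children
        ages = surv_ages + [0] * len(children)
    return best, best_fit
```

For ensemble design, the bit string would encode which pool members join the ensemble and the fitness would be validation accuracy; here a simple bit-count fitness stands in.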

A Co-training Method based on Classification Using Unlabeled Data (비분류표시 데이타를 이용하는 분류 기반 Co-training 방법)

  • 윤혜성;이상호;박승수;용환승;김주한
• Journal of KIISE:Software and Applications, v.31 no.8, pp.991-998, 2004
  • In many practical learning problems, including the bioinformatics area, there is a small amount of labeled data along with a large pool of unlabeled data. Labeled examples are fairly expensive to obtain because they require human effort. In contrast, unlabeled examples can be gathered inexpensively without an expert. A common method for using unlabeled data in data classification and analysis is co-training. This method uses a small set of labeled examples to learn a classifier in each of two views. Each classifier is then applied to all unlabeled examples, and co-training detects the examples on which each classifier makes the most confident predictions. After some iterations, new classifiers are learned from the enlarged training data and the number of labeled examples is increased. In this paper, we propose a new co-training strategy using unlabeled data, and we evaluate our method with two classifiers and two experimental datasets: WebKB and BIND XML data. Our experiments show that the proposed co-training technique effectively improves classification accuracy when the number of labeled examples is very small.
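The co-training loop described in the abstract can be sketched as follows. This is a toy illustration under strong assumptions, not the paper's strategy: each view is a single numeric feature, the per-view classifier is a nearest-class-mean rule with the distance margin as confidence, and all names are hypothetical.

```python
class MeanClassifier:
    # toy per-view classifier: nearest class mean on a single feature
    def fit(self, xs, ys):
        self.means = {c: sum(x for x, y in zip(xs, ys) if y == c) /
                         sum(1 for y in ys if y == c)
                      for c in set(ys)}

    def predict_conf(self, x):
        # returns (label, confidence); confidence is the margin between
        # the nearest and second-nearest class means
        dists = sorted((abs(x - m), c) for c, m in self.means.items())
        margin = (dists[1][0] - dists[0][0]) if len(dists) > 1 else 1.0
        return dists[0][1], margin

def co_train(labeled, unlabeled, rounds=5, per_round=2):
    # labeled: list of ((x_view1, x_view2), y); unlabeled: list of (x1, x2)
    labeled, unlabeled = list(labeled), list(unlabeled)
    for _ in range(rounds):
        if not unlabeled:
            break
        for view in (0, 1):
            # train this view's classifier on all labels gathered so far
            clf = MeanClassifier()
            clf.fit([x[view] for x, _ in labeled], [y for _, y in labeled])
            # label the unlabeled examples it is most confident about
            scored = [(clf.predict_conf(u[view]), u) for u in unlabeled]
            scored.sort(key=lambda t: -t[0][1])
            for (label, _), u in scored[:per_round]:
                labeled.append((u, label))
                unlabeled.remove(u)
    return labeled
```

Each view's classifier teaches the other by donating its most confident labels, which is the core mechanism the abstract relies on.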

Diversity based Ensemble Genetic Programming for Improving Classification Performance (분류 성능 향상을 위한 다양성 기반 앙상블 유전자 프로그래밍)

  • Hong Jin-Hyuk;Cho Sung-Bae
• Journal of KIISE:Software and Applications, v.32 no.12, pp.1229-1237, 2005
  • Combining multiple classifiers has been actively exploited to improve classification performance. A pool of accurate and diverse base classifiers is required to obtain a good ensemble classifier. Conventionally, ensemble learning techniques such as bagging and boosting have been used and the diversity of base classifiers on the training set has been estimated, but there are some limitations in classifying gene expression profiles since only a few training samples are available. This paper proposes an ensemble technique that analyzes the diversity of classification rules obtained by genetic programming. Genetic programming generates interpretable rules, and a sample is classified by combining the most diverse set of rules. We have applied the proposed method to cancer classification with gene expression profiles. Experiments on a lymphoma cancer dataset, a prostate cancer dataset, and an ovarian cancer dataset have illustrated the usefulness of the proposed method. Higher classification accuracy has been obtained with the proposed method than without considering diversity. It has also been confirmed that diversity increases classification performance.
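Selecting the most diverse rule set and combining it by voting can be sketched as below. This is a simplified stand-in for the paper's method (the genetic programming step that produces the rules is omitted): diversity is measured as mean pairwise disagreement of the rules' predictions, the size-k subset maximizing it is found exhaustively, and a sample is classified by majority vote; all names are hypothetical.

```python
from itertools import combinations
from collections import Counter

def disagreement(a, b):
    # fraction of training samples on which two rules' predictions differ
    return sum(x != y for x, y in zip(a, b)) / len(a)

def most_diverse(rule_outputs, k):
    # exhaustive search (k >= 2) for the size-k rule subset with the
    # highest mean pairwise disagreement
    def diversity(idx):
        pairs = list(combinations(idx, 2))
        return sum(disagreement(rule_outputs[i], rule_outputs[j])
                   for i, j in pairs) / len(pairs)
    return max(combinations(range(len(rule_outputs)), k), key=diversity)

def majority_vote(rule_outputs, selected, sample):
    # combine the selected rules' predictions for one sample
    votes = Counter(rule_outputs[i][sample] for i in selected)
    return votes.most_common(1)[0][0]
```

Duplicate rules contribute no diversity, so the search naturally favors complementary rules, which is the property the abstract credits for the accuracy gain.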