• Title/Summary/Keyword: Combining classifier


An Approach to Combining Classifier with MIMO Fuzzy Model

  • Kim, Do-Wan;Park, Jin-Bae;Lee, Yeon-Woo;Joo, Young-Hoon
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2003.05a
    • /
    • pp.182-185
    • /
    • 2003
  • This paper presents a new design algorithm for combining a fuzzy classifier with a Bayesian classifier. Only a few attempts have so far been made to provide an effective design algorithm that combines the advantages and removes the disadvantages of the two classifiers. Specifically, the suggested algorithm consists of three steps: combining, fuzzy-set-based pruning, and fuzzy-set tuning. In the combining step, a multi-input multi-output (MIMO) fuzzy model is used to combine the two classifiers. In the fuzzy-set-based pruning step, an analysis method for the fuzzy sets and a recursive pruning method are proposed to effectively reduce the complexity of the fuzzy-Bayesian classifier and the risk of overfitting. In the fuzzy-set tuning step, the premise parameters are adjusted for the misclassified feature vectors using a gradient descent algorithm. Finally, a computer simulation is provided to show the feasibility and validity of the proposed algorithm. (See the illustrative sketch after this entry.)

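A minimal Python sketch of the combining idea above, assuming (as a simplification of the MIMO fuzzy model) a single Gaussian fuzzy membership that gates between the two classifiers' class-probability outputs, with its premise center tuned by gradient descent on misclassified samples. The function names and toy data are invented for illustration; this is not the authors' implementation.

```python
import numpy as np

def gate(x, c, s):
    """Fuzzy membership in [0, 1]: how much to trust classifier A at point x."""
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * s ** 2))

def fuse(x, p_a, p_b, c, s):
    """Convex combination of the two classifiers' class-probability vectors."""
    w = gate(x, c, s)
    return w * p_a + (1.0 - w) * p_b

def tune_premise(xs, ps_a, ps_b, ys, c, s, lr=0.05, epochs=50):
    """Gradient descent on the premise center c, driven only by misclassified
    samples, for a squared-error loss on the fused class probabilities."""
    for _ in range(epochs):
        for x, p_a, p_b, y in zip(xs, ps_a, ps_b, ys):
            w = gate(x, c, s)
            p = w * p_a + (1.0 - w) * p_b
            if np.argmax(p) == y:
                continue                              # adjust only on misclassifications
            target = np.eye(len(p))[y]
            dL_dw = np.dot(p - target, p_a - p_b)     # dL/dw for L = 0.5*||p - target||^2
            dw_dc = w * (x - c) / (s ** 2)            # gradient of the Gaussian premise
            c = c - lr * dL_dw * dw_dc
    return c

# Toy usage: classifier A is reliable on the right half-plane, B is weakly informative.
rng = np.random.default_rng(0)
xs = rng.normal(size=(30, 2))
ys = (xs[:, 0] > 0).astype(int)
ps_a = np.where(xs[:, [0]] > 0, [[0.1, 0.9]], [[0.6, 0.4]])
ps_b = np.tile([0.45, 0.55], (30, 1))
c = tune_premise(xs, ps_a, ps_b, ys, c=np.zeros(2), s=1.0)
print("tuned premise center:", np.round(c, 2))
```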

A New Approach to the Design of Combining Classifier Based on Immune Algorithm

  • Kim, Moon-Hwan;Jeong, Keun-Ho;Joo, Young-Hoon;Park, Jin-Bae
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 2003.10a
    • /
    • pp.1272-1277
    • /
    • 2003
  • This paper presents a method for constructing a combining classifier from fuzzy and neural network classifiers using classifier fusion and selection algorithms. The input space of the combining classifier is divided by the extended hyperbox regions proposed in this paper to guarantee the non-overlapping data property. To fuse the fuzzy classifier and the neural network classifier, we propose a fusion parameter for the overlapped data. In addition, an adaptive learning algorithm is proposed to maximize classifier performance. Finally, simulation examples are given to illustrate the effectiveness of the method. (See the illustrative sketch after this entry.)

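A minimal sketch of region-based fusion in the spirit of the abstract above, assuming each classifier "owns" a hyperbox region of the input space and the two outputs are fused only where the boxes overlap. The hyperbox bounds and the fusion parameter beta are assumptions for this example; the immune-algorithm construction itself is not shown.

```python
import numpy as np

def in_box(x, box):
    """True if x lies inside the axis-aligned hyperbox (lo, hi)."""
    lo, hi = box
    return bool(np.all(x >= lo) and np.all(x <= hi))

def combined_predict(x, p_fuzzy, p_nn, box_fuzzy, box_nn, beta=0.5):
    a, b = in_box(x, box_fuzzy), in_box(x, box_nn)
    if a and not b:
        return int(np.argmax(p_fuzzy))                    # fuzzy classifier owns this region
    if b and not a:
        return int(np.argmax(p_nn))                       # neural network owns this region
    return int(np.argmax(beta * p_fuzzy + (1 - beta) * p_nn))   # overlapped: fuse

# Toy usage in 2-D: the sample lies in the overlap of both boxes
box_fuzzy = (np.array([-2.0, -2.0]), np.array([1.0, 1.0]))
box_nn    = (np.array([0.0, 0.0]),  np.array([3.0, 3.0]))
x = np.array([0.5, 0.5])
print(combined_predict(x, np.array([0.3, 0.7]), np.array([0.8, 0.2]), box_fuzzy, box_nn))
```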

Performance Improvement of Classifier by Combining Disjunctive Normal Form features

  • Min, Hyeon-Gyu;Kang, Dong-Joong
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.10 no.4
    • /
    • pp.50-64
    • /
    • 2018
  • This paper describes a visual object detection approach utilizing ensemble-based machine learning. Object detection methods employing 1D features have the benefit of fast calculation speed; however, for real images with complex backgrounds, detection accuracy and performance are degraded. In this paper, we propose an ensemble learning algorithm that combines a 1D feature classifier and a 2D DNF (Disjunctive Normal Form) classifier to improve object detection performance in a single input image. To improve computing efficiency and accuracy, we also propose a feature selection method that reduces computing time, together with an ensemble algorithm combining the 1D features and the 2D DNF features. In the verification experiments, we selected the Haar-like feature as the 1D image descriptor and demonstrated the performance of the algorithm on several datasets, such as face and vehicle images. (See the illustrative sketch after this entry.)
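
A minimal sketch of the kind of score fusion described above, assuming the 1D classifier yields a continuous score, the DNF classifier is an OR-of-ANDs over binarized 2D feature responses, and the two are combined by a weighted sum. The clause structure and the fusion weight alpha are assumptions made for this example.

```python
import numpy as np

def dnf_predict(bits, clauses):
    """DNF classifier: fires if ANY clause has ALL of its listed bits set."""
    return any(all(bits[i] for i in clause) for clause in clauses)

def fused_score(score_1d, bits, clauses, alpha=0.6):
    """Weighted fusion of the continuous 1-D score and the binary DNF decision."""
    return alpha * score_1d + (1.0 - alpha) * float(dnf_predict(bits, clauses))

# Toy usage: 8 binarized 2-D responses, two DNF clauses, one 1-D detector score.
clauses = [(0, 3), (2, 5, 7)]                 # (b0 AND b3) OR (b2 AND b5 AND b7)
bits = np.array([1, 0, 1, 1, 0, 1, 0, 1])
print(fused_score(score_1d=0.42, bits=bits, clauses=clauses))   # 0.652
```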

Selecting Classifiers using Mutual Information between Classifiers (인식기 간의 상호정보를 이용한 인식기 선택)

  • Kang, Hee-Joong
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.14 no.3
    • /
    • pp.326-330
    • /
    • 2008
  • The study of combining multiple classifiers in the field of pattern recognition has mainly focused on how to combine classifiers, but recently it has gradually shifted toward how to select multiple classifiers from a classifier pool. In practice, the performance of a multiple classifier system depends on the selected classifiers as well as on the combination method. It is therefore necessary to select a classifier set showing good performance, and an approach based on information theory has been tried for this selection. In this paper, a candidate classifier set is built by selecting classifiers on the basis of the mutual information between classifiers, and this candidate set is compared in experiments with classifier sets chosen by other selection methods. (See the illustrative sketch after this entry.)
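
A minimal sketch of mutual-information-based selection in the spirit of the abstract above, assuming each candidate classifier is represented by its predicted labels on a common validation set and that members sharing little mutual information with the already-selected ones are preferred (i.e. more diverse). The greedy procedure and the toy data are assumptions, not the paper's exact method.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def select_diverse(outputs, k):
    """outputs: dict name -> 1-D array of predicted labels on the same samples."""
    names = list(outputs)
    chosen = [names[0]]                       # seed with an arbitrary classifier
    while len(chosen) < k:
        best, best_mi = None, np.inf
        for cand in names:
            if cand in chosen:
                continue
            # average MI with already-selected classifiers; lower = more diverse
            mi = np.mean([mutual_info_score(outputs[cand], outputs[c]) for c in chosen])
            if mi < best_mi:
                best, best_mi = cand, mi
        chosen.append(best)
    return chosen

# Toy usage: three classifiers' predictions on 8 validation samples
rng = np.random.default_rng(1)
outputs = {
    "svm":  np.array([0, 1, 1, 0, 1, 0, 0, 1]),
    "tree": np.array([0, 1, 1, 0, 1, 0, 0, 1]),   # clone of svm -> high MI
    "knn":  rng.integers(0, 2, size=8),           # independent -> low MI
}
print(select_diverse(outputs, k=2))               # likely ['svm', 'knn']
```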

Handwritten Numeral Recognition Using Karhunen-Loeve Transform Based Subspace Classifier and Combined Multiple Novelty Classifiers (Karhunen-Loeve 변환 기반의 부분공간 인식기와 결합된 다중 노벨티 인식기를 이용한 필기체 숫자 인식)

  • 임길택;진성일
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.35C no.6
    • /
    • pp.88-98
    • /
    • 1998
  • The subspace classifier is a popular pattern recognition method based on the Karhunen-Loeve transform. It describes a high-dimensional pattern using a reduced-dimensional subspace. Because of the information loss induced by dimensionality reduction, however, a subspace classifier sometimes shows unsatisfactory recognition performance for patterns whose principal components are very similar to each other. In this paper, we propose multiple novelty neural network classifiers constructed on novelty vectors, in order to exploit the minor components that are usually ignored, and present a method for improving recognition performance by combining them with the subspace classifier. We develop the proposed classifier on a handwritten numeral database and analyze its properties. The proposed classifier shows better recognition performance than other classifiers, although it requires more weight links. (See the illustrative sketch after this entry.)

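A minimal sketch of a Karhunen-Loeve (PCA) subspace classifier and of the "novelty" residual that, per the abstract, the extra classifiers operate on. The subspace dimension, class data, and function names are assumptions for this example; the novelty neural networks themselves are not shown.

```python
import numpy as np

def fit_subspaces(X, y, n_dim=2):
    """One orthonormal basis per class from the top principal directions."""
    bases = {}
    for c in np.unique(y):
        Xc = X[y == c]
        Xc = Xc - Xc.mean(axis=0)
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        bases[c] = vt[:n_dim]                     # rows are orthonormal directions
    return bases

def classify(x, bases):
    """Pick the class whose subspace captures the most energy of x."""
    scores = {c: np.linalg.norm(B @ x) for c, B in bases.items()}
    return max(scores, key=scores.get)

def novelty(x, B):
    """Residual of x outside the class subspace: the minor-component information."""
    return x - B.T @ (B @ x)

# Toy usage: two 4-D classes with different dominant directions
rng = np.random.default_rng(0)
X0 = rng.normal(size=(50, 4)) * [3, 1, 0.1, 0.1]
X1 = rng.normal(size=(50, 4)) * [0.1, 0.1, 3, 1]
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)
bases = fit_subspaces(X, y)
x = np.array([2.0, 0.5, 0.1, 0.0])
print(classify(x, bases), np.round(novelty(x, bases[0]), 3))
```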

Ensemble Classifier with Negatively Correlated Features for Cancer Classification (암 분류를 위한 음의 상관관계 특징을 이용한 앙상블 분류기)

  • 원홍희;조성배
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.12
    • /
    • pp.1124-1134
    • /
    • 2003
  • The development of microarray technology has supplied a large volume of data to many fields. In particular, it has been applied to the prediction and diagnosis of cancer and is expected to help predict and diagnose cancer accurately. Efficient analysis of DNA microarray data is essential because the amount of data is usually very large. Since accurate classification of cancer is a very important issue for its treatment, it is desirable to make a decision by combining the results of various expert classifiers rather than depending on a single classifier. In general, combining classifiers gives high performance and high confidence. Despite the many advantages of ensemble classifiers, an ensemble of mutually error-correlated classifiers is limited in performance. In this paper, we propose an ensemble of neural network classifiers learned from negatively correlated features to classify cancer precisely, and we systematically evaluate its performance on three benchmark datasets. Experimental results show that the ensemble classifier with negatively correlated features produces the best recognition rate on the three benchmark datasets. (See the illustrative sketch after this entry.)
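
A minimal sketch, under assumptions, of an ensemble built from feature subsets whose members are mutually negatively or weakly correlated, one base classifier per subset, combined by majority vote. LogisticRegression stands in for the paper's neural networks, and the greedy subset construction, subset size, and synthetic data are all assumptions rather than the paper's pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def negatively_correlated_subset(X, seed, size):
    """Start from one seed feature; greedily add the feature most negatively
    correlated (on average) with those already in the subset."""
    corr = np.corrcoef(X, rowvar=False)
    subset = [seed]
    while len(subset) < size:
        avg = corr[:, subset].mean(axis=1)
        avg[subset] = np.inf                      # do not re-pick chosen features
        subset.append(int(np.argmin(avg)))        # most negative average correlation
    return subset

def train_ensemble(X, y, n_members=3, size=4):
    members = []
    for seed in range(n_members):
        idx = negatively_correlated_subset(X, seed, size)
        members.append((idx, LogisticRegression(max_iter=1000).fit(X[:, idx], y)))
    return members

def vote(members, X):
    preds = np.array([clf.predict(X[:, idx]) for idx, clf in members])
    return np.round(preds.mean(axis=0)).astype(int)   # majority vote for 0/1 labels

# Toy usage on synthetic "expression" data
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))
y = (X[:, 0] - X[:, 5] + 0.3 * rng.normal(size=60) > 0).astype(int)
members = train_ensemble(X, y)
print(vote(members, X[:5]), y[:5])
```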

The Design of a Classifier Combining GA-based Feature Weighting Algorithm and Modified KNN Rule (GA를 이용한 특징 가중치 알고리즘과 Modified KNN규칙을 결합한 Classifier 설계)

  • Lee, Hee-Sung;Kim, Eun-Tai;Park, Mig-Non
    • Proceedings of the KIEE Conference
    • /
    • 2004.11c
    • /
    • pp.162-164
    • /
    • 2004
  • This paper proposes a new classification system that combines an adaptive feature weighting algorithm based on the genetic algorithm (GA) with a modified KNN rule. The GA is employed to choose the feature weights and their middle values for high system performance. The modified KNN rule is proposed to estimate the class of a test pattern in the adaptively weighted feature space. Experiments with the unconstrained handwritten digit database of Concordia University in Canada are conducted to show the performance of the proposed method. (See the illustrative sketch after this entry.)

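A minimal sketch of a KNN rule that uses per-feature weights in its distance, plus the fitness function a GA could optimize over those weights. A naive random search stands in here for the genetic algorithm; k, the weight range, and the synthetic data are assumptions, not the paper's design.

```python
import numpy as np

def weighted_knn_predict(x, X_train, y_train, w, k=3):
    d = np.sqrt(((X_train - x) ** 2 * w).sum(axis=1))    # weighted Euclidean distance
    nearest = y_train[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

def fitness(w, X_tr, y_tr, X_val, y_val):
    preds = [weighted_knn_predict(x, X_tr, y_tr, w) for x in X_val]
    return np.mean(np.array(preds) == y_val)              # validation accuracy

# Stand-in for the GA loop: sample candidate weight vectors, keep the fittest.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
y = (X[:, 1] > 0).astype(int)                              # only feature 1 is informative
X_tr, y_tr, X_val, y_val = X[:60], y[:60], X[60:], y[60:]
best_w, best_fit = None, -1.0
for _ in range(200):
    w = rng.uniform(0, 1, size=5)
    f = fitness(w, X_tr, y_tr, X_val, y_val)
    if f > best_fit:
        best_w, best_fit = w, f
print(best_fit, np.round(best_w, 2))
```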

A Meta-learning Approach for Building Multi-classifier Systems in a GA-based Inductive Learning Environment (유전 알고리즘 기반 귀납적 학습 환경에서 다중 분류기 시스템의 구축을 위한 메타 학습법)

  • Kim, Yeong-Joon;Hong, Chul-Eui
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.19 no.1
    • /
    • pp.35-40
    • /
    • 2015
  • This paper proposes a meta-learning approach for building multi-classifier systems in a GA-based inductive learning environment. In our meta-learning approach, a classifier consists of a general classifier and a meta-classifier. The meta-classifier is obtained by applying a learning algorithm to the classification results of its general classifier. The role of the meta-classifier is to evaluate the classification result of its general classifier and to decide whether it should participate in the final decision-making process. The classification system draws a decision by combining the classification results that are evaluated as correct by the meta-classifiers. We present empirical results that evaluate the effect of our meta-learning approach on the performance of multi-classifier systems. (See the illustrative sketch after this entry.)
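
A minimal sketch of the general/meta pairing described above: each member couples a general classifier with a meta-classifier trained to predict whether the general classifier is correct on a given input, and only members whose meta-classifier says "trustworthy" vote. The sklearn models stand in for the paper's GA-induced rules, and the toy data and fallback rule are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

def train_member(X_tr, y_tr, X_meta, y_meta):
    """Pair a (deliberately weak) general classifier with a meta-classifier that
    predicts whether the general classifier is correct on a given input."""
    general = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_tr, y_tr)
    correct = (general.predict(X_meta) == y_meta).astype(int)   # meta target
    meta = LogisticRegression(max_iter=1000).fit(X_meta, correct)
    return general, meta

def combined_predict(members, x):
    """Only members whose meta-classifier says 'trustworthy here' cast a vote."""
    x = x.reshape(1, -1)
    votes = [g.predict(x)[0] for g, m in members if m.predict(x)[0] == 1]
    if not votes:                                  # fall back to every member
        votes = [g.predict(x)[0] for g, _ in members]
    return np.bincount(np.asarray(votes, dtype=int)).argmax()

# Toy usage: an XOR-like target that a shallow tree cannot fit perfectly
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4))
y = (X[:, 0] * X[:, 2] > 0).astype(int)
members = [train_member(X[:40], y[:40], X[80:], y[80:]),
           train_member(X[40:80], y[40:80], X[80:], y[80:])]
print(combined_predict(members, X[0]), y[0])
```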

Coarse-to-fine Classifier Ensemble Selection using Clustering and Genetic Algorithms (군집화와 유전 알고리즘을 이용한 거친-섬세한 분류기 앙상블 선택)

  • Kim, Young-Won;Oh, Il-Seok
    • Journal of KIISE:Software and Applications
    • /
    • v.34 no.9
    • /
    • pp.857-868
    • /
    • 2007
  • A good classifier ensemble should have high complementarity among its classifiers in order to produce a high recognition rate, and a small size in order to be efficient. This paper proposes a classifier ensemble selection algorithm with coarse-to-fine stages. For the algorithm to be successful, the original classifier pool should be sufficiently diverse; this paper therefore produces a large classifier pool by combining several different classification algorithms with many feature subsets. The aim of the coarse selection is to reduce the size of the classifier pool with little sacrifice of recognition performance, and the fine selection then finds a near-optimal ensemble using genetic algorithms. A hybrid genetic algorithm with improved search capability is also proposed. The experiments use worldwide handwritten numeral databases, and the results show that the proposed algorithm is superior to conventional ones. (See the illustrative sketch after this entry.)
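
A minimal sketch of the coarse stage only (not the paper's hybrid GA): cluster the classifiers in the pool by the similarity of their validation predictions and keep the most accurate member of each cluster, shrinking the pool before a fine (GA) search. The use of KMeans, the cluster count, and the toy pool are assumptions for this sketch.

```python
import numpy as np
from sklearn.cluster import KMeans

def coarse_select(pred_matrix, y_val, n_clusters):
    """pred_matrix: (n_classifiers, n_val) predicted labels of every pool member."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(pred_matrix)
    survivors = []
    for c in range(n_clusters):
        idx = np.where(km.labels_ == c)[0]
        acc = (pred_matrix[idx] == y_val).mean(axis=1)      # accuracy of each member
        survivors.append(int(idx[np.argmax(acc)]))          # keep best member per cluster
    return sorted(survivors)

# Toy usage: a pool of 12 classifiers' predictions over 30 validation samples
rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=30)
pool = np.array([np.where(rng.random(30) < 0.8, y_val, 1 - y_val) for _ in range(12)])
print(coarse_select(pool, y_val, n_clusters=4))   # indices handed on to the GA stage
```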

A Genetic Algorithm-based Classifier Ensemble Optimization for Activity Recognition in Smart Homes

  • Fatima, Iram;Fahim, Muhammad;Lee, Young-Koo;Lee, Sungyoung
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.11
    • /
    • pp.2853-2873
    • /
    • 2013
  • Over the last few years, one of the most common purposes of smart homes has been to provide human-centric services in the domain of u-healthcare by analyzing inhabitants' daily living. Currently, a major challenge in activity recognition is the reliability of each classifier's predictions, which varies with the characteristics of the smart home. Smart homes vary in the performed activities, deployed sensors, environment settings, and inhabitants' characteristics, so no single classifier always performs better than all others in every possible situation. This observation motivates combining multiple classifiers to take advantage of their complementary performance for high accuracy. Therefore, in this paper, a method for activity recognition is proposed that optimizes the output of multiple classifiers with a Genetic Algorithm (GA). The proposed method combines the measurement-level outputs of different classifiers for each activity class to make up the ensemble. For the evaluation of the proposed method, experiments are performed on three real datasets from the CASAS smart home project. The results show that the method systematically outperforms single classifiers and traditional multiclass models, with the F-measures of recognized activities improving from 0.82 to 0.90 compared to existing methods. (See the illustrative sketch after this entry.)
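
A minimal sketch of GA-weighted measurement-level fusion in the spirit of the abstract above: evolve one weight per classifier with a tiny genetic algorithm so that the weighted sum of the classifiers' class-probability outputs maximizes the macro F-measure on validation data. The population size, mutation scheme, and synthetic probability outputs are all assumptions, not the paper's system.

```python
import numpy as np
from sklearn.metrics import f1_score

def fuse(prob_stack, w):
    """prob_stack: (n_classifiers, n_samples, n_classes); weighted measurement-level sum."""
    return np.tensordot(w, prob_stack, axes=1).argmax(axis=1)

def fitness(w, prob_stack, y):
    return f1_score(y, fuse(prob_stack, w), average="macro")

def evolve(prob_stack, y, pop=20, gens=30, rng=np.random.default_rng(0)):
    n_clf = prob_stack.shape[0]
    P = rng.random((pop, n_clf))                              # initial population of weights
    for _ in range(gens):
        fit = np.array([fitness(w, prob_stack, y) for w in P])
        parents = P[np.argsort(fit)[-pop // 2:]]              # keep the fitter half
        children = []
        while len(children) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(n_clf) < 0.5, a, b)   # uniform crossover
            child += rng.normal(0, 0.1, n_clf) * (rng.random(n_clf) < 0.2)  # sparse mutation
            children.append(np.clip(child, 0, None))
        P = np.vstack([parents, children])
    fit = np.array([fitness(w, prob_stack, y) for w in P])
    return P[np.argmax(fit)]

# Toy usage: 3 classifiers of different quality, 100 samples, 4 activity classes
rng = np.random.default_rng(1)
y = rng.integers(0, 4, size=100)
probs = np.stack([np.eye(4)[y] * q + rng.random((100, 4)) * (1 - q) for q in (0.7, 0.5, 0.2)])
w = evolve(probs, y)
print(np.round(w, 2), round(fitness(w, probs, y), 3))
```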