• Title/Summary/Keyword: Classifier Selection

Detection for JPEG steganography based on evolutionary feature selection and classifier ensemble selection

  • Ma, Xiaofeng;Zhang, Yi;Song, Xiangfeng;Fan, Chao
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.11 no.11
    • /
    • pp.5592-5609
    • /
    • 2017
  • JPEG steganography detection is an active research topic in the field of information hiding due to the wide use of JPEG images in social networks, image-sharing websites, and Internet communication. In this paper, a new steganalysis method for content-adaptive JPEG steganography is proposed by integrating evolutionary feature selection and classifier ensemble selection. First, the overall framework of the proposed steganalysis method is presented and its characteristics are analyzed. Second, a feature selection method based on a genetic algorithm is given and its implementation is described in detail. Third, a classifier ensemble selection method based on Pareto evolutionary optimization is proposed. The experimental results indicate that the proposed steganalysis method achieves competitive detection performance compared with state-of-the-art steganalysis methods when used to detect the latest content-adaptive JPEG steganography algorithms.
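
The evolutionary (genetic-algorithm) feature selection stage described in this abstract can be sketched roughly as follows. The classifier inside the fitness function (a linear SVM), the GA operators, and all hyper-parameters are illustrative assumptions, not the authors' configuration; scikit-learn and NumPy are assumed to be available.

```python
# Minimal GA feature-selection sketch: binary chromosomes encode feature
# subsets, fitness is cross-validated accuracy of a stand-in classifier.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    # Cross-validated accuracy on the selected feature subset.
    if mask.sum() == 0:
        return 0.0
    clf = LinearSVC(dual=False)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.02):
    n_feat = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_feat))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        # Keep the fitter half as parents, then crossover and mutate.
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)          # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n_feat) < p_mut    # bit-flip mutation
            children.append(child)
        pop = np.array(children)
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()].astype(bool)       # best feature subset found
```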

Construction of Multiple Classifier Systems based on a Classifiers Pool (인식기 풀 기반의 다수 인식기 시스템 구축방법)

  • Kang, Hee-Joong
    • Journal of KIISE:Software and Applications
    • /
    • v.29 no.8
    • /
    • pp.595-603
    • /
    • 2002
  • Only a few studies have been conducted on how to select multiple classifiers from a pool of available classifiers so as to achieve good classification performance. Thus, the selection problem of which classifiers to select, and how many, remains an important research issue. In this paper, provided that the number of selected classifiers is constrained in advance, a variety of selection criteria are proposed and applied to the construction of multiple classifier systems, and these selection criteria are then evaluated by the performance of the constructed multiple classifier systems. All possible sets of classifiers are examined by the selection criteria, and some of these sets are selected as candidate multiple classifier systems. The candidates were evaluated in experiments recognizing unconstrained handwritten numerals obtained from Concordia University and the UCI machine learning repository. Among the selection criteria, the candidates chosen by the information-theoretic selection criteria based on conditional entropy showed more promising results than those chosen by the other criteria.
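
A conditional-entropy selection criterion of the kind mentioned in this abstract might look roughly like the sketch below: for every candidate subset of a fixed size, estimate H(Y | classifier outputs) on a validation set and keep the subset that minimizes it. The plug-in entropy estimator and exhaustive enumeration are assumptions for illustration only.

```python
# Pick the classifier subset whose joint outputs leave the least uncertainty
# about the true class label (lowest conditional entropy).
import numpy as np
from itertools import combinations

def conditional_entropy(y_true, outputs):
    # outputs: (n_samples, k) predicted labels from k classifiers;
    # class labels are assumed to be non-negative integers.
    joint = {}
    for yi, row in zip(y_true, map(tuple, outputs)):
        joint.setdefault(row, []).append(yi)
    n = len(y_true)
    h = 0.0
    for ys in joint.values():
        p_row = len(ys) / n
        counts = np.bincount(ys)
        p = counts[counts > 0] / len(ys)
        h += p_row * (-(p * np.log2(p)).sum())
    return h

def select_classifiers(y_true, pool_outputs, subset_size):
    # pool_outputs: (n_samples, n_classifiers) predictions of the whole pool.
    best, best_h = None, np.inf
    for idx in combinations(range(pool_outputs.shape[1]), subset_size):
        h = conditional_entropy(y_true, pool_outputs[:, idx])
        if h < best_h:
            best, best_h = idx, h
    return best
```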

Coarse-to-fine Classifier Ensemble Selection using Clustering and Genetic Algorithms (군집화와 유전 알고리즘을 이용한 거친-섬세한 분류기 앙상블 선택)

  • Kim, Young-Won;Oh, Il-Seok
    • Journal of KIISE:Software and Applications
    • /
    • v.34 no.9
    • /
    • pp.857-868
    • /
    • 2007
  • A good classifier ensemble should have high complementarity among its classifiers in order to produce a high recognition rate, and it should be small in order to be efficient. This paper proposes a classifier ensemble selection algorithm with coarse-to-fine stages. For the algorithm to be successful, the original classifier pool should be sufficiently diverse, so this paper produces a large classifier pool by combining several different classification algorithms with many feature subsets. The aim of the coarse selection is to reduce the size of the classifier pool with little sacrifice of recognition performance, and the fine selection finds a near-optimal ensemble using genetic algorithms. A hybrid genetic algorithm with improved search capability is also proposed. The experiments use worldwide handwritten numeral databases, and the results show that the proposed algorithm is superior to conventional ones.
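
The coarse stage could be sketched as below: cluster the classifier pool by how similarly its members behave on a validation set and keep the most accurate member of each cluster, leaving the reduced pool for a GA fine stage (e.g. like the GA sketch after the first entry above). The clustering algorithm, the 1 − agreement distance, and the cluster count are illustrative assumptions.

```python
# Coarse selection: group behaviourally similar classifiers, keep one per group.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def coarse_select(outputs, y_true, n_clusters=10):
    # outputs: (n_samples, n_classifiers) predicted labels of the full pool.
    n_clf = outputs.shape[1]
    agree = np.array([[np.mean(outputs[:, i] == outputs[:, j])
                       for j in range(n_clf)] for i in range(n_clf)])
    dist = 1.0 - agree                         # disagreement as distance
    labels = AgglomerativeClustering(
        n_clusters=n_clusters, metric="precomputed", linkage="average"
    ).fit_predict(dist)                        # "metric" is "affinity" in older scikit-learn
    acc = (outputs == y_true[:, None]).mean(axis=0)
    keep = [np.where(labels == c)[0][acc[labels == c].argmax()]
            for c in range(n_clusters)]        # best classifier of each cluster
    return sorted(keep)
```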

Optimal k-Nearest Neighborhood Classifier Using Genetic Algorithm (유전알고리즘을 이용한 최적 k-최근접이웃 분류기)

  • Park, Chong-Sun;Huh, Kyun
    • Communications for Statistical Applications and Methods
    • /
    • v.17 no.1
    • /
    • pp.17-27
    • /
    • 2010
  • Feature selection and feature weighting are useful techniques for improving the classification accuracy of the k-Nearest Neighbor (k-NN) classifier. Their main purpose is to reduce the number of features by eliminating irrelevant and redundant ones while maintaining or enhancing classification accuracy. In this paper, a novel hybrid approach based on a genetic algorithm is proposed for simultaneous feature selection, feature weighting, and choice of k in the k-NN classifier. The results indicate that the proposed algorithm is comparable with, and superior to, existing classifiers with or without feature selection and feature weighting capability.
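
One plausible way to encode this joint optimization is to let a real-valued chromosome carry one weight per feature plus a gene for k, and to score it by cross-validated k-NN accuracy on the weighted features, as in the sketch below. The weight threshold used for elimination, the k range, and the CV setup are assumptions, not the paper's exact design.

```python
# Chromosome decoding and fitness for GA-driven joint feature weighting,
# feature selection, and choice of k in a k-NN classifier.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def decode(chromosome, k_max=25):
    # chromosome: real vector in [0, 1]; last gene encodes k, the rest weights.
    *weights, k_gene = chromosome
    weights = np.asarray(weights)
    k = 1 + int(k_gene * (k_max - 1))
    return weights, k

def fitness(chromosome, X, y):
    weights, k = decode(chromosome)
    mask = weights > 0.1                  # tiny weights act as feature elimination
    if mask.sum() == 0:
        return 0.0
    Xw = X[:, mask] * weights[mask]       # scale features before the distance metric
    clf = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(clf, Xw, y, cv=5).mean()
```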

Selecting Classifiers using Mutual Information between Classifiers (인식기 간의 상호정보를 이용한 인식기 선택)

  • Kang, Hee-Joong
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.14 no.3
    • /
    • pp.326-330
    • /
    • 2008
  • Research on combining multiple classifiers in the field of pattern recognition has mainly focused on how to combine classifiers, but it has recently turned to the question of how to select multiple classifiers from a classifier pool. In fact, the performance of a multiple classifier system depends on the selected classifiers as well as on the combination method. Therefore, it is necessary to select a classifier set showing good performance, and information-theoretic approaches have been tried for this selection. In this paper, a candidate classifier set is constructed by selecting classifiers on the basis of the mutual information between them, and the candidate set is compared in experiments with classifier sets chosen by other selection methods.
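
A greedy variant of selection by mutual information between classifier outputs could look like the sketch below: start from the most accurate classifier and repeatedly add the candidate whose average mutual information with the already selected ones is lowest. The greedy strategy itself is an illustrative assumption; the paper compares several selection methods.

```python
# Greedy low-redundancy classifier selection using pairwise mutual information.
import numpy as np
from sklearn.metrics import mutual_info_score

def select_by_mutual_info(outputs, y_true, target_size):
    # outputs: (n_samples, n_classifiers) predicted labels of the pool.
    acc = (outputs == y_true[:, None]).mean(axis=0)
    selected = [int(acc.argmax())]                 # seed with the most accurate one
    while len(selected) < target_size:
        candidates = [i for i in range(outputs.shape[1]) if i not in selected]
        mi = [np.mean([mutual_info_score(outputs[:, i], outputs[:, j])
                       for j in selected]) for i in candidates]
        selected.append(candidates[int(np.argmin(mi))])   # least redundant candidate
    return sorted(selected)
```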

Prototype-based Classifier with Feature Selection and Its Design with Particle Swarm Optimization: Analysis and Comparative Studies

  • Park, Byoung-Jun;Oh, Sung-Kwun
    • Journal of Electrical Engineering and Technology
    • /
    • v.7 no.2
    • /
    • pp.245-254
    • /
    • 2012
  • In this study, we introduce a prototype-based classifier with feature selection that relies on the biologically inspired optimization technique of Particle Swarm Optimization (PSO). The design comprises two main phases. In the first phase, PSO selects P % of the patterns to be treated as prototypes of the c classes. During the second phase, PSO is instrumental in forming a core set of features that constitute a collection of the most meaningful and highly discriminative coordinates of the original feature space. The proposed scheme of feature selection is developed in the wrapper mode, with performance evaluated with the aid of the nearest prototype classifier. The study offers a complete algorithmic framework and demonstrates the effectiveness (quality of solution) and efficiency (computing cost) of the approach when applied to a collection of selected data sets. We also include a comparative study involving genetic algorithms (GAs). Numerical experiments show that a suitable selection of prototypes and a substantial reduction of the feature space can be accomplished, and the classifier formed in this manner is characterized by low classification error. In addition, the advantage of PSO is quantified in detail by running a number of experiments on machine learning datasets.
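
The wrapper evaluation inside such a PSO could be sketched as follows: a particle position is thresholded into a prototype mask and a feature mask, and the resulting nearest-prototype classifier is scored on held-out data. The 0.5 thresholds and the use of 1-NN over the selected prototypes are assumptions for illustration.

```python
# Fitness of one PSO particle in a wrapper for joint prototype and feature selection.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def particle_fitness(position, X_train, y_train, X_val, y_val):
    n_patterns = X_train.shape[0]
    proto_mask = position[:n_patterns] > 0.5    # which patterns become prototypes
    feat_mask = position[n_patterns:] > 0.5     # which features are kept
    if proto_mask.sum() == 0 or feat_mask.sum() == 0:
        return 0.0
    # Nearest-prototype classifier = 1-NN restricted to the selected prototypes.
    clf = KNeighborsClassifier(n_neighbors=1)
    clf.fit(X_train[proto_mask][:, feat_mask], y_train[proto_mask])
    return clf.score(X_val[:, feat_mask], y_val)
```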

Hybrid Genetic Algorithm for Classifier Ensemble Selection (분류기 앙상블 선택을 위한 혼합 유전 알고리즘)

  • Kim, Young-Won;Oh, Il-Seok
    • The KIPS Transactions:PartB
    • /
    • v.14B no.5
    • /
    • pp.369-376
    • /
    • 2007
  • This paper proposes a hybrid genetic algorithm (HGA) for classifier ensemble selection. The HGA adds a local search operation to the genetic algorithm to improve fine-tuning within local regions of the search space. Both the hybrid and a simple genetic algorithm (SGA) are applied to the classifier ensemble selection problem in order to show the superiority of the HGA, and two local search operations (SSO: Sequential Search Operation, CSO: Combinational Search Operation) are proposed for the HGA. Experimental results show that the HGA has better search capability than the SGA, and that CSO, which considers the correlation among classifiers, is better than SSO.
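
A sequential local-search operator in the spirit of SSO might be sketched as below: a hybrid GA applies it to a promising chromosome, trying one bit flip at a time and keeping any flip that improves ensemble fitness. The fitness function is left abstract here; a CSO-style operator would instead explore small combinations of flips. This is a sketch under those assumptions, not the paper's exact operator.

```python
# Sequential bit-flip local search applied to a binary ensemble-selection chromosome.
import numpy as np

def sequential_local_search(chromosome, fitness, rng=None):
    rng = rng or np.random.default_rng()
    best = chromosome.copy()
    best_fit = fitness(best)
    for i in rng.permutation(len(best)):   # visit classifier positions in random order
        trial = best.copy()
        trial[i] ^= 1                      # flip inclusion of classifier i
        f = fitness(trial)
        if f > best_fit:                   # keep the flip only if it helps
            best, best_fit = trial, f
    return best, best_fit
```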

Design of Lazy Classifier based on Fuzzy k-Nearest Neighbors and Reconstruction Error (퍼지 k-Nearest Neighbors 와 Reconstruction Error 기반 Lazy Classifier 설계)

  • Roh, Seok-Beom;Ahn, Tae-Chon
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.20 no.1
    • /
    • pp.101-108
    • /
    • 2010
  • In this paper, we propose a new lazy classifier that combines a fuzzy k-nearest neighbors approach with feature selection based on reconstruction error, the performance index for locally linear reconstruction. When a new query point is given, the fuzzy k-nearest neighbors approach defines the local area where the local classifier applies and assigns weighting values to the data patterns within that area. After defining the local area and assigning the weights, feature selection is carried out to reduce the dimensionality of the feature space. Once features are selected in terms of the reconstruction error, the local classifier, a polynomial model, is developed using weighted least squares estimation. The experimental study includes a comparative analysis against several commonly encountered methods such as standard neural networks, support vector machines, linear discriminant analysis, and C4.5 trees.
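
The local modelling step could be sketched as below: fuzzy k-NN style memberships weight the neighbours of a query point, and a first-order polynomial is fitted by weighted least squares over the (already selected) features. The fuzzifier m = 2 and the first-order model are assumptions, and the reconstruction-error feature selection itself is omitted for brevity.

```python
# Fuzzy-weighted local polynomial classifier fitted by weighted least squares.
import numpy as np

def fuzzy_knn_weights(x_query, X_local, m=2.0, eps=1e-12):
    d = np.linalg.norm(X_local - x_query, axis=1) + eps
    w = d ** (-2.0 / (m - 1.0))            # fuzzy k-NN style membership from distance
    return w / w.sum()

def local_wls_predict(x_query, X_local, y_local):
    # y_local: numeric class coding of the neighbours (e.g. +1 / -1).
    w = fuzzy_knn_weights(x_query, X_local)
    A = np.hstack([np.ones((len(X_local), 1)), X_local])   # first-order polynomial terms
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(A * sw[:, None], y_local * sw, rcond=None)
    x_aug = np.concatenate([[1.0], x_query])
    return x_aug @ beta                    # continuous output, thresholded for the class
```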

A Study on the Application of Digital Signal Processing for Pattern Recognition of Microdefects (미소결함의 형상인식을 위한 디지털 신호처리 적용에 관한 연구)

  • 홍석주
    • Journal of the Korean Society of Manufacturing Technology Engineers
    • /
    • v.9 no.1
    • /
    • pp.119-127
    • /
    • 2000
  • In this study, classification of artificial and natural flaws in welded parts is performed using pattern recognition technology. For this purpose, a signal pattern recognition package including user-defined functions was developed, and the overall procedure, including digital signal processing, feature extraction, feature selection, and classifier selection, is treated as a whole. In particular, statistical classifiers such as the linear discriminant function and the empirical Bayesian classifier are employed and discussed. Pattern recognition is also applied to the classification of natural flaws, i.e., the multi-class problem of crack, lack of penetration, lack of fusion, porosity, and slag inclusion, as well as the planar versus volumetric flaw classification problem. According to the results, a recognition rate above 83% can be achieved, although it varies somewhat with the domain from which the features are extracted and with the classifier.

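
The statistical classification step on features extracted from the processed signals could be sketched roughly as follows; the linear discriminant classifier, the hold-out split, and scikit-learn itself are illustrative assumptions, and the signal processing, feature extraction, and empirical Bayesian classifier are not reproduced here.

```python
# Linear discriminant classification of flaw types from extracted signal features.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

def classify_flaws(features, flaw_labels):
    # features: (n_signals, n_features); flaw_labels: crack, porosity, slag inclusion, ...
    X_tr, X_te, y_tr, y_te = train_test_split(
        features, flaw_labels, test_size=0.3, random_state=0)
    clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
    return clf.score(X_te, y_te)           # recognition rate on the held-out signals
```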

A Multiple Classifier System based on Dynamic Classifier Selection having Local Property (지역적 특성을 갖는 동적 선택 방법에 기반한 다중 인식기 시스템)

  • 송혜정;김백섭
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.3_4
    • /
    • pp.339-346
    • /
    • 2003
  • This paper proposes a multiple classifier system composed of a massive number of micro classifiers, each trained on a local set of training patterns: the k nearest neighboring training patterns of a given training pattern form the local region for training one micro classifier, and each training pattern is associated with one or more micro classifiers. Two types of micro classifiers are adopted in this paper: an SVM with a linear kernel and an SVM with an RBF kernel. Classification is done by selecting the best micro classifier among those in the vicinity of an incoming test pattern; to measure the goodness of each micro classifier, the weighted sum of correctly classified training patterns in the vicinity of the test pattern is used. Experiments on the Elena database show that the proposed method gives better classification accuracy than conventional classifiers such as SVM and k-NN and than a conventional classifier combination/selection scheme.
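
A rough sketch of this dynamic selection scheme follows: each micro SVM is trained on the k nearest neighbours of one training pattern, and at test time the micro classifier with the highest distance-weighted accuracy on the training patterns surrounding the query is selected. The linear kernel, the value of k, and the Gaussian distance weighting are assumptions; one-class neighbourhoods are simply skipped for robustness.

```python
# Dynamic selection over a pool of locally trained micro SVMs.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def build_micro_classifiers(X, y, k=20):
    nn = NearestNeighbors(n_neighbors=k).fit(X)
    idx = nn.kneighbors(X, return_distance=False)
    pool = [SVC(kernel="linear").fit(X[i], y[i])
            for i in idx if len(np.unique(y[i])) > 1]   # skip one-class neighbourhoods
    return pool, nn

def predict_dynamic(x_query, X, y, pool, nn, k_eval=20, gamma=1.0):
    dist, idx = nn.kneighbors(x_query.reshape(1, -1), n_neighbors=k_eval)
    dist, idx = dist[0], idx[0]
    w = np.exp(-gamma * dist ** 2)          # nearby training patterns count more
    scores = [(w * (clf.predict(X[idx]) == y[idx])).sum() for clf in pool]
    best = pool[int(np.argmax(scores))]     # best micro classifier around the query
    return best.predict(x_query.reshape(1, -1))[0]
```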