• Title/Summary/Keyword: Feature selection optimization

Search results: 94

Feature Selection Method by Information Theory and Particle Swarm Optimization (상호정보량과 Binary Particle Swarm Optimization을 이용한 속성선택 기법)

  • Cho, Jae-Hoon;Lee, Dae-Jong;Song, Chang-Kyu;Chun, Myung-Geun
    • Journal of the Korean Institute of Intelligent Systems, v.19 no.2, pp.191-196, 2009
  • In this paper, we propose a feature selection method using Binary Particle Swarm Optimization (BPSO) and mutual information. The proposed method consists of a candidate feature selection part, which selects a candidate feature subset by mutual information, and an optimal feature selection part, which chooses the optimal feature subset from the candidates by BPSO. In the candidate feature selection part, we compute the mutual information of each feature and select a candidate feature subset according to the mutual-information ranking. In the optimal feature selection part, the optimal feature subset is found by BPSO within the candidate feature subset. The BPSO process uses a multi-objective fitness function that optimizes both classifier accuracy and the size of the selected feature subset. A DNA expression dataset is used to estimate the performance of the proposed method. Experimental results show that this method achieves better performance on pattern recognition problems than conventional methods.
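A minimal sketch of the two-stage scheme described in this abstract, assuming scikit-learn and NumPy: features are first ranked by mutual information, then a binary PSO searches the candidate subset with a fitness that trades off accuracy against subset size. The k-NN wrapper, swarm parameters, and penalty weight are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def candidate_subset(X, y, k=50):
    """Stage 1: keep the k features with the highest mutual information."""
    mi = mutual_info_classif(X, y, random_state=0)
    return np.argsort(mi)[::-1][:k]

def fitness(mask, X, y, alpha=0.01):
    """Multi-objective fitness: accuracy minus a penalty on subset size."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask == 1], y, cv=5).mean()
    return acc - alpha * mask.mean()

def bpso(X, y, n_particles=20, n_iter=30, seed=0):
    """Stage 2: binary PSO over the candidate features (sigmoid velocity rule)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.integers(0, 2, size=(n_particles, d))
    vel = rng.normal(0.0, 1.0, size=(n_particles, d))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, X, y) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, d))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = (rng.random((n_particles, d)) < 1.0 / (1.0 + np.exp(-vel))).astype(int)
        fit = np.array([fitness(p, X, y) for p in pos])
        improved = fit > pbest_fit
        pbest[improved] = pos[improved]
        pbest_fit[improved] = fit[improved]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest

# Usage: cand = candidate_subset(X, y); mask = bpso(X[:, cand], y)
```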

A Study on Feature Selection in Face Image Using Principal Component Analysis and Particle Swarm Optimization Algorithm (PCA와 입자 군집 최적화 알고리즘을 이용한 얼굴이미지에서 특징선택에 관한 연구)

  • Kim, Woong-Ki;Oh, Sung-Kwun;Kim, Hyun-Ki
    • The Transactions of The Korean Institute of Electrical Engineers, v.58 no.12, pp.2511-2519, 2009
  • In this paper, we introduce a system design for face recognition based on feature selection with Principal Component Analysis (PCA) and the Particle Swarm Optimization (PSO) algorithm. The overall design consists of three modules: a preprocessing module, a feature extraction module, and a recognition module. First, histogram equalization enhances image quality by stretching the contrast according to a normalization function derived from the histogram of the 2D face image. Second, PCA extracts the feature vectors used for face recognition from the eigenvalues and eigenvectors of the covariance matrix. Finally, feature selection among the extracted feature vectors is carried out by means of PSO. Optimized polynomial-based Radial Basis Function Neural Networks are used to evaluate face recognition performance. The study shows that the proposed system design is effective for face recognition.
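The pipeline of this entry (histogram equalization, PCA-based eigenfeature extraction, swarm-driven selection of components) can be sketched roughly as below. The polynomial-based RBF neural network of the paper is replaced here by an RBF-kernel SVM stand-in, and the selection step would reuse a binary-PSO routine like the one sketched above; all parameter values are assumptions.

```python
import numpy as np
from skimage import exposure
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def preprocess(images):
    """Histogram-equalize each 2-D face image and flatten it to a vector."""
    return np.stack([exposure.equalize_hist(img).ravel() for img in images])

def pca_features(X, n_components=50):
    """Extract eigenface coefficients from the covariance structure of X."""
    pca = PCA(n_components=n_components, whiten=True)
    return pca.fit_transform(X), pca

def evaluate_mask(mask, Z, y):
    """Wrapper evaluation of a binary component mask (stand-in classifier: RBF SVM)."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), Z[:, mask == 1], y, cv=5).mean()

# Z, pca = pca_features(preprocess(face_images)); then run a binary-PSO search over Z
```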

Prototype-based Classifier with Feature Selection and Its Design with Particle Swarm Optimization: Analysis and Comparative Studies

  • Park, Byoung-Jun;Oh, Sung-Kwun
    • Journal of Electrical Engineering and Technology, v.7 no.2, pp.245-254, 2012
  • In this study, we introduce a prototype-based classifier with feature selection that dwells upon the usage of a biologically inspired optimization technique of Particle Swarm Optimization (PSO). The design comprises two main phases. In the first phase, PSO selects P % of patterns to be treated as prototypes of c classes. During the second phase, the PSO is instrumental in the formation of a core set of features that constitute a collection of the most meaningful and highly discriminative coordinates of the original feature space. The proposed scheme of feature selection is developed in the wrapper mode with the performance evaluated with the aid of the nearest prototype classifier. The study offers a complete algorithmic framework and demonstrates the effectiveness (quality of solution) and efficiency (computing cost) of the approach when applied to a collection of selected data sets. We also include a comparative study which involves the usage of genetic algorithms (GAs). Numerical experiments show that a suitable selection of prototypes and a substantial reduction of the feature space could be accomplished and the classifier formed in this manner becomes characterized by low classification error. In addition, the advantage of the PSO is quantified in detail by running a number of experiments using Machine Learning datasets.
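A hedged sketch of the wrapper objective implied by this abstract: a nearest-prototype classifier evaluated on a chosen set of prototype patterns and a binary feature mask. The PSO (or GA) layer that searches over prototype indices and feature masks is not reproduced; the helper below is an illustrative evaluation step only.

```python
import numpy as np

def nearest_prototype_error(X, y, proto_idx, feat_mask):
    """Classification error of a nearest-prototype rule, measured in the reduced
    feature space defined by feat_mask, with prototypes taken at proto_idx."""
    cols = np.flatnonzero(feat_mask)
    P = X[np.ix_(proto_idx, cols)]                              # prototype patterns
    Q = X[:, cols]                                              # all patterns
    d = np.linalg.norm(Q[:, None, :] - P[None, :, :], axis=2)   # pairwise distances
    pred = y[np.asarray(proto_idx)][d.argmin(axis=1)]           # label of nearest prototype
    return float((pred != y).mean())

# A swarm (or GA) would minimize this error over (proto_idx, feat_mask),
# optionally adding a penalty on the number of selected features.
```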

Sparse and low-rank feature selection for multi-label learning

  • Lim, Hyunki
    • Journal of the Korea Society of Computer and Information, v.26 no.7, pp.1-7, 2021
  • In this paper, we propose a feature selection technique for multi-label classification. Many existing feature selection techniques select features by computing a measure of the relation between features and labels, such as mutual information. However, since the mutual information measure requires a joint probability, it is difficult to estimate that joint probability over the full feature set in practice. Consequently, only a few features can be evaluated at a time and only local optimization is possible. To move away from this local-optimization problem, we propose a feature selection technique that constructs a low-rank space over the entire given feature space and selects features with sparsity. To this end, we design a regression-based objective function using the nuclear norm and propose a gradient descent algorithm to solve the resulting optimization problem. In multi-label classification experiments on four datasets with three multi-label performance measures, the proposed method showed better performance than existing feature selection techniques. In addition, the experimental results show that performance is insensitive to changes in the parameter values of the proposed objective function.
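As a rough reconstruction of the kind of objective described above (a regression of the label matrix on the feature matrix regularized by the nuclear norm, solved by gradient descent), the sketch below uses proximal gradient steps with singular-value thresholding and scores features by the row norms of the learned weight matrix. The exact objective, step size, and regularization weight are assumptions, not the paper's formulation.

```python
import numpy as np

def svt(W, tau):
    """Proximal operator of the nuclear norm: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def low_rank_feature_scores(X, Y, lam=0.1, n_iter=200):
    """Approximately solve  min_W ||XW - Y||_F^2 + lam * ||W||_*
    and score each feature by the row norm of W."""
    n, d = X.shape
    step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)   # 1 / Lipschitz constant of the gradient
    W = np.zeros((d, Y.shape[1]))
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ W - Y)               # gradient of the squared loss
        W = svt(W - step * grad, step * lam)         # proximal (nuclear-norm) step
    return np.linalg.norm(W, axis=1)                 # larger row norm => more relevant feature

# Usage: scores = low_rank_feature_scores(X, Y_multilabel); keep the top-k features.
```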

Use of Artificial Bee Swarm Optimization (ABSO) for Feature Selection in System Diagnosis for Coronary Heart Disease

  • Wiharto;Yaumi A. Z. A. Fajri;Esti Suryani;Sigit Setyawan
    • Journal of information and communication convergence engineering, v.21 no.2, pp.130-138, 2023
  • The selection of the correct examination variables for diagnosing heart disease provides many benefits, including faster diagnosis and lower examination cost. Examination variables can be selected by referring to data from previous examination results, so that future examinations can be restricted to the selected variables. This paper proposes a model for selecting examination variables using the Artificial Bee Swarm Optimization (ABSO) method, taking both accuracy and inspection cost into account. The proposed feature selection model was evaluated using accuracy, area under the curve (AUC), the number of variables, and inspection cost as performance parameters. The test results show that the proposed model produces 24 examination variables and achieves 95.16% accuracy and 97.61% AUC. These results indicate a significant decrease in the number of examination variables and in examination cost, while maintaining performance in the excellent category.
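The ABSO search itself is not reproduced here, but the kind of fitness it would optimize, per this abstract, couples diagnostic accuracy with the total cost of the selected examination variables. The sketch below is purely illustrative: the cost vector, weighting, and random-forest classifier are assumptions rather than details taken from the paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier

def diagnosis_fitness(mask, X, y, feature_costs, w_acc=0.9, w_cost=0.1):
    """Trade off cross-validated accuracy against normalized examination cost."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(RandomForestClassifier(random_state=0),
                          X[:, mask == 1], y, cv=5).mean()
    cost = feature_costs[mask == 1].sum() / feature_costs.sum()
    return w_acc * acc - w_cost * cost

# Any binary swarm or evolutionary loop (here, the bee-swarm moves of ABSO)
# can drive this score by perturbing the masks toward higher fitness.
```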

Ant Colony Optimization for Feature Selection in Pattern Recognition (패턴 인식에서 특징 선택을 위한 개미 군락 최적화)

  • Oh, Il-Seok;Lee, Jin-Seon
    • The Journal of the Korea Contents Association, v.10 no.5, pp.1-9, 2010
  • This paper proposes a novel scheme called selective evaluation to improve the convergence of ant colony optimization (ACO) for feature selection. The scheme cuts down the computational load by excluding the evaluation of unnecessary or less promising candidate solutions. The scheme is realizable in ACO thanks to the pheromone trail, which provides the information needed to identify such solutions. With the aim of checking the applicability of algorithms according to problem size, we analyze the timing requirements of three popular feature selection algorithms: the greedy algorithm, the genetic algorithm, and ant colony optimization. For a rigorous timing analysis, we adopt the concept of atomic operations. Experimental results show that ACO with selective evaluation is promising in both timing requirements and recognition performance.
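A compact sketch of pheromone-guided subset construction with the selective-evaluation idea described above: candidate subsets whose pheromone looks unpromising are skipped rather than evaluated. The evaporation rate, threshold rule, and k-NN wrapper are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def aco_feature_selection(X, y, n_ants=20, n_iter=30, subset_size=10, rho=0.1, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    tau = np.ones(d)                          # pheromone on each feature
    best_subset, best_acc = None, -np.inf
    for _ in range(n_iter):
        threshold = tau.mean()                # weak-pheromone cutoff for selective evaluation
        for _ in range(n_ants):
            subset = rng.choice(d, size=subset_size, replace=False, p=tau / tau.sum())
            if best_subset is not None and tau[subset].mean() < threshold:
                continue                      # selective evaluation: skip unpromising candidate
            acc = cross_val_score(KNeighborsClassifier(), X[:, subset], y, cv=3).mean()
            if acc > best_acc:
                best_subset, best_acc = subset, acc
        tau *= (1.0 - rho)                    # evaporation
        if best_subset is not None:
            tau[best_subset] += best_acc      # reinforce the best subset found so far
    return best_subset, best_acc
```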

Zero-Stress Member Selection for Sizing Optimization of Truss Structures (트러스 구조물 사이즈 최적화를 위한 무응력 부재의 선택)

  • Lee, Seunghye;Lee, Jonghyun;Lee, Kihak;Lee, Jaehong
    • Journal of Korean Association for Spatial Structures, v.21 no.1, pp.61-70, 2021
  • This paper describes a novel zero-stress member selection method for sizing optimization of truss structures. When a sizing optimization method with static constraints is applied, the member stresses respond sensitively to changes in the design variables. However, because some truss members are unaffected by specific loading cases, these elements experience zero-stress states. Such zero-stress members add to the computational cost and time of the sizing optimization process. Feature selection approaches can therefore be used to eliminate the zero-stress members from the set of design variables prior to optimization. Several numerical truss examples are tested using the proposed method.
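The pre-screening step can be sketched in a few lines: members whose stress stays numerically zero under every load case are dropped from the sizing variables before the optimizer runs. The stress matrix is assumed to come from a prior linear truss analysis, and the tolerance is an illustrative choice.

```python
import numpy as np

def active_members(stresses, tol=1e-9):
    """stresses: array of shape (n_load_cases, n_members).
    Returns indices of members that carry stress in at least one load case."""
    carries_load = np.any(np.abs(stresses) > tol, axis=0)
    return np.flatnonzero(carries_load)

# The sizing optimizer then works only on areas[active_members(stresses)],
# shrinking the design space; the excluded zero-stress members can, for
# example, simply be held at a minimum allowable cross-section.
```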

Improved Feature Selection Techniques for Image Retrieval based on Metaheuristic Optimization

  • Johari, Punit Kumar;Gupta, Rajendra Kumar
    • International Journal of Computer Science & Network Security, v.21 no.1, pp.40-48, 2021
  • Retrieving the images relevant to the user's perception from a huge database is a challenging task for a Content-Based Image Retrieval (CBIR) system. Images are represented by a combination of low-level features derived from their visual content, which together form a feature vector. To reduce the search time over a large database during retrieval, a novel image retrieval technique based on feature dimensionality reduction is proposed, exploiting metaheuristic optimization techniques based on the Genetic Algorithm (GA), Extended Binary Cuckoo Search (EBCS), and the Whale Optimization Algorithm (WOA). Each image in the database is indexed by a feature vector comprising a fuzzified color histogram descriptor for color and a median binary pattern, derived in the HSI color space, for texture. The results are compared in terms of precision, recall, F-measure, accuracy, and error rate against benchmark classification algorithms (linear discriminant analysis, CatBoost, Extra Trees, Random Forest, Naive Bayes, light gradient boosting, extreme gradient boosting, k-NN, and Ridge) to validate the efficiency of the proposed approach. Finally, the techniques are ranked using TOPSIS to choose the best feature selection technique with respect to the different model parameters.
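The final TOPSIS ranking step mentioned above can be sketched as follows: given a decision matrix in which rows are feature-selection techniques and columns are criteria such as precision, recall, F-measure, accuracy, and error rate, each technique is scored by its closeness to the ideal solution. Equal criterion weights and the benefit/cost flags in the example are assumptions.

```python
import numpy as np

def topsis(M, weights=None, benefit=None):
    """M: (n_alternatives, n_criteria) decision matrix. Returns closeness scores in [0, 1]."""
    n, m = M.shape
    w = np.ones(m) / m if weights is None else np.asarray(weights, float)
    is_benefit = np.ones(m, bool) if benefit is None else np.asarray(benefit, bool)
    V = M / np.linalg.norm(M, axis=0) * w              # vector-normalize, then weight
    ideal = np.where(is_benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(is_benefit, V.min(axis=0), V.max(axis=0))
    d_best = np.linalg.norm(V - ideal, axis=1)
    d_worst = np.linalg.norm(V - anti, axis=1)
    return d_worst / (d_best + d_worst)                # higher = closer to the ideal

# Example: scores = topsis(results, benefit=[True, True, True, True, False])
# where the False flag treats error rate as a cost criterion.
```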

Compiler Analysis Framework Using SVM-Based Genetic Algorithm : Feature and Model Selection Sensitivity (SVM 기반 유전 알고리즘을 이용한 컴파일러 분석 프레임워크 : 특징 및 모델 선택 민감성)

  • Hwang, Cheol-Hun;Shin, Gun-Yoon;Kim, Dong-Wook;Han, Myung-Mook
    • Journal of the Korea Institute of Information Security & Cryptology, v.30 no.4, pp.537-544, 2020
  • As malware technology develops, evasion techniques such as mutation and obfuscation continue to advance. Within malware detection, detecting unknown malware is important, and Malware Authorship Attribution, which detects unknown malicious code by identifying its author from previously distributed malware, is being studied. In this paper, we extract the compiler information that affects binary-based author identification and investigate how feature selection, probabilistic and non-probabilistic models, and optimization affect classification efficiency across studies. In the experiments, feature selection through information gain combined with the support vector machine, a non-probabilistic model, showed high efficiency. Among the optimization studies, high classification accuracy was obtained through feature selection and model optimization in the proposed framework, yielding a 48% reduction in features and 53% faster execution. This study confirms the sensitivity of classification efficiency to the choice of feature selection, model, and optimization methods.
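A hedged sketch of the filter-plus-wrapper combination this entry reports as most effective: rank features by an information-gain-style score (approximated here with scikit-learn's mutual information estimator), keep the top fraction, and evaluate with an SVM. The GA layer of the framework, which would tune the kept fraction and the SVM hyperparameters, is not reproduced.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def info_gain_svm_score(X, y, keep_ratio=0.5, C=1.0, gamma="scale"):
    """Keep the top keep_ratio features by an information-gain-style ranking,
    then score the reduced set with a cross-validated SVM."""
    k = max(1, int(keep_ratio * X.shape[1]))
    top = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1][:k]
    return cross_val_score(SVC(C=C, gamma=gamma), X[:, top], y, cv=5).mean()

# A GA would treat (keep_ratio, C, gamma) as the chromosome and use this score
# as the fitness, which is the feature/model sensitivity the paper examines.
```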

Genetic Algorithm Based Feature Selection Method Development for Pattern Recognition (패턴 인식문제를 위한 유전자 알고리즘 기반 특징 선택 방법 개발)

  • Park Chang-Hyun;Kim Ho-Duck;Yang Hyun-Chang;Sim Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems, v.16 no.4, pp.466-471, 2006
  • An important problem in pattern recognition is extracting or selecting the feature set, which is part of the pre-processing stage. Principal component analysis has usually been used to extract feature sets, and SFS (Sequential Forward Selection) and SBS (Sequential Backward Selection) have been used as feature selection methods. This paper applies the genetic algorithm, a popular method for nonlinear optimization problems, to the feature selection problem. We call this approach Genetic Algorithm Feature Selection (GAFS) and compare its performance with that of the other methods.
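The SFS/SBS baselines this entry compares against can be sketched with scikit-learn's SequentialFeatureSelector; the GA-based selector (GAFS) itself would evolve binary feature masks with classification accuracy as the fitness, much like the BPSO sketch given earlier. The k-NN estimator and subset size below are assumptions.

```python
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

def sequential_baseline(X, y, n_features=10, direction="forward"):
    """direction='forward' gives SFS, 'backward' gives SBS."""
    selector = SequentialFeatureSelector(KNeighborsClassifier(),
                                         n_features_to_select=n_features,
                                         direction=direction, cv=5)
    selector.fit(X, y)
    return selector.get_support(indices=True)   # indices of the selected features
```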