http://dx.doi.org/10.7232/IEIF.2011.24.2.097

Feature Selecting and Classifying Integrated Neural Network Algorithm for Multi-variate Classification  

Yoon, Hyun-Soo (School of Industrial Management Engineering, Korea University)
Baek, Jun-Geol (School of Industrial Management Engineering, Korea University)
Publication Information
IE interfaces / v.24, no.2, 2011, pp. 97-104
Abstract
Research on multi-variate classification has generally proceeded through two separate procedures: feature selection and classification. Feature selection techniques choose the important features, and classifiers are then applied to improve classification performance on the selected features. Because the two procedures are usually studied independently, the interaction between them is rarely considered, which can degrade overall performance. In this paper, classification performance is improved by integrating the two procedures. The proposed model builds on KBANN (Knowledge-Based Artificial Neural Network), which uses prior knowledge as training information for learning a neural network (NN). Each NN learns the characteristics of the feature selection and classification techniques from the training sets. The integrated NN is then retrained so that the features are adjusted appropriately and classification performance is enhanced. The proposed technique is called ALBNN (Algorithm Learning-Based Neural Network). Experimental results show improved performance on various classification problems.
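The abstract describes ALBNN only at a high level. As a rough illustration of the general idea, namely jointly training a feature-selection step and a classifier inside a single network seeded with prior knowledge, the following is a minimal, hypothetical NumPy sketch. The soft feature-gating layer, the prior_importance seeding, and all hyperparameters are assumptions for illustration only; this is not the authors' ALBNN or KBANN implementation.

# Minimal sketch of the integration idea behind ALBNN (illustrative assumption,
# not the authors' method): a per-feature gating layer stands in for the
# feature-selection step and is trained jointly with a small classifier,
# with prior knowledge (a rough feature ranking) used to seed the gates,
# in the spirit of KBANN's knowledge-based initialization.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class GatedNet:
    def __init__(self, n_features, n_hidden, prior_importance=None):
        # One multiplicative gate per input feature; prior knowledge seeds the gates.
        if prior_importance is None:
            prior_importance = np.ones(n_features)
        self.g = np.asarray(prior_importance, dtype=float).copy()
        self.W1 = rng.normal(scale=0.5, size=(n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=n_hidden)
        self.b2 = 0.0

    def forward(self, X):
        self.Xg = X * self.g                      # soft feature-selection layer
        self.H = sigmoid(self.Xg @ self.W1 + self.b1)
        self.y = sigmoid(self.H @ self.W2 + self.b2)
        return self.y

    def backward(self, X, t, lr=0.5):
        n = len(t)
        # Gradients of the cross-entropy loss with a sigmoid output unit.
        d_out = (self.y - t) / n
        dW2 = self.H.T @ d_out
        db2 = d_out.sum()
        d_hidden = np.outer(d_out, self.W2) * self.H * (1.0 - self.H)
        dW1 = self.Xg.T @ d_hidden
        db1 = d_hidden.sum(axis=0)
        d_gate = ((d_hidden @ self.W1.T) * X).sum(axis=0)
        # Joint update: the gates (feature selection) and the classifier weights
        # are adjusted together, so each step adapts to the other.
        self.g -= lr * d_gate
        self.W1 -= lr * dW1; self.b1 -= lr * db1
        self.W2 -= lr * dW2; self.b2 -= lr * db2

# Toy usage: only the first two of five features carry class information.
X = rng.normal(size=(200, 5))
t = (X[:, 0] + X[:, 1] > 0).astype(float)
prior = np.array([1.0, 1.0, 0.1, 0.1, 0.1])      # hypothetical prior ranking
net = GatedNet(n_features=5, n_hidden=8, prior_importance=prior)
for _ in range(2000):
    net.forward(X)
    net.backward(X, t)
print("learned gates:", np.round(net.g, 2))
print("train accuracy:", ((net.forward(X) > 0.5) == t).mean())

In this toy run the gates of the two informative features should stay large while the gates of the noise features shrink, which mirrors the paper's point that selecting features and training the classifier jointly lets each procedure adapt to the other.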
Keywords
classification; feature selection; data mining; neural network; KBANN; multi-variate analysis;
Citations & Related Records
연도 인용수 순위
  • Reference
1 Gori, M. and Tesi, A. (1992), On the problem of local minima in backpropagation, IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(1), 76-86.
2 Hagan, M. T. and Menhaj, M. B. (1994), Training feedforward networks with the Marquardt algorithm, IEEE Transactions on Neural Networks, 5(6), 989-993.
3 Kabir, M. M., Islam, M. M., and Murase, K. (2010), A new wrapper feature selection approach using neural network, Neurocomputing (article in press).
4 Marquardt, D. W. (1963), An algorithm for least squares estimation of nonlinear parameters, Journal of the Society for Industrial and Applied Mathematics, 11(2), 431-441.
5 Park, M. S. and Choi, J. Y. (2009), Theoretical analysis on feature extraction capability of class-augmented PCA, Pattern Recognition, 42, 2353-2362.
6 Pudil, P., Novovicova, J., and Kittler, J. (1994), Floating search methods in feature selection, Pattern Recognition Letters, 15(11), 1119-1125.
7 Saeys, Y., Inza, I., and Larrañaga, P. (2007), A review of feature selection techniques in bioinformatics, Bioinformatics, 23(19), 2507-2517.
8 Sarkar, I. N., Planet, P. J., Bael, T. E., Stanley, S. E., Siddall, M., and DeSalle, R. (2002), Characteristic attributes in cancer microarrays, Journal of Biomedical Informatics, 35(2), 111-122.
9 Turk, M. and Pentland, A. (1991), Eigenfaces for recognition, Journal of Cognitive Neuroscience, 3, 71-86.
10 Towell, G. G. and Shavlik, J. W. (1994), Knowledge-based artificial neural networks, Artificial Intelligence, 70(1), 119-165.
11 Chauvin, Y. and Rumelhart, D. E. (1995), Backpropagation: Theory, Architectures, and Applications, Lawrence Erlbaum Associates, New Jersey, USA.
12 Burges, C. J. C. (1998), A tutorial on support vector machines for pattern recognition, Data Mining and Knowledge Discovery, 2, 121-167.
13 Cortes, C. and Vapnik, V. (1995), Support-vector networks, Machine Learning, 20, 273-297.