http://dx.doi.org/10.3745/KIPSTB.2010.17B.2.139

Reinforcement Post-Processing and Feedback Algorithm for Optimal Combination in Bottom-Up Hierarchical Classification  

Choi, Yun-Jeong (Dept. of Information and Communication, Seoil University)
Park, Seung-Soo (Dept. of Computer Science and Engineering, Ewha Womans University)
Abstract
This paper presents a reinforcement post-processing method and a feedback algorithm that improve the category-assignment step in classification. We focus in particular on complex documents, which are generally considered hard to classify. The basic factors in a traditional classification system are the training methodology, the classification model, and the features of the documents. Documents that contain shared features and multiple meanings must be mined and analyzed more deeply than regularly formatted data. To address such documents, our previous studies proposed a method for expanding the classification scheme using automatically detected decision boundaries. The factor we focus on here is the assignment step, in which a document is simply assigned to the top-ranked category. In this paper, we propose a post-processing method and a feedback algorithm that analyze the relevance of the ranked category list. In experiments, we applied the post-processing method and a one-time feedback algorithm to complex documents. The experimental results show that accuracy and flexibility can be improved without changing the classification algorithm itself.
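As an illustration of the assignment idea described above, the sketch below post-processes a ranked category list and applies a single feedback pass only when the top candidates are too close to call. This is a minimal sketch under our own assumptions: the names (post_process, rescore, margin) are hypothetical, and the abstract does not specify the paper's actual relevance-analysis or feedback rules.

```python
from typing import Callable, List, Tuple

# A ranked list of (category, relevance score), sorted in descending score order.
Ranked = List[Tuple[str, float]]


def post_process(ranked: Ranked,
                 rescore: Callable[[List[str]], Ranked],
                 margin: float = 0.1) -> str:
    """Assign a category from a ranked list.

    Rather than always taking the top-ranked category, inspect the gap
    between the leading candidates. If the gap is small (an ambiguous,
    'complex' document), run one feedback pass that re-scores the document
    against only the ambiguous candidates.
    """
    top_cat, top_score = ranked[0]
    # Unambiguous case: the top category clearly dominates, keep it.
    if len(ranked) == 1 or top_score - ranked[1][1] >= margin:
        return top_cat

    # One-time feedback: keep every candidate within `margin` of the top
    # score and let the (hypothetical) re-scoring step decide among them.
    candidates = [c for c, s in ranked if top_score - s < margin]
    refined = rescore(candidates)
    return max(refined, key=lambda cs: cs[1])[0]


if __name__ == "__main__":
    # Toy re-scorer standing in for the feedback step; real scores would
    # come from the underlying classifiers.
    def toy_rescore(cands: List[str]) -> Ranked:
        weights = {"economy": 0.7, "politics": 0.65, "sports": 0.2}
        return [(c, weights.get(c, 0.0)) for c in cands]

    ranked_list = [("politics", 0.52), ("economy", 0.50), ("sports", 0.20)]
    print(post_process(ranked_list, toy_rescore))  # -> "economy"
```

In this toy run the top two categories differ by only 0.02, so the feedback pass re-scores them and reverses the initial top-ranked assignment; the base classifier itself is never modified, which matches the claim in the abstract.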
Keywords
Feature Extraction; Meta-Classification; Hierarchical Classification; Reinforcement Learning; Post-Processing; Feedback;