http://dx.doi.org/10.3745/KTSDE.2013.2.1.043

Learning Algorithm for Multiple Distribution Data using Haar-like Feature and Decision Tree  

Kwak, Ju-Hyun (Department of Computer Engineering, Konkuk University)
Woen, Il-Young (Department of Cyber Hacking Security, Hoseo Seoul Technical College)
Lee, Chang-Hoon (Department of Computer Engineering, Konkuk University)
Publication Information
KIPS Transactions on Software and Data Engineering / v.2, no.1, 2013, pp. 43-48
Abstract
Adaboost is widely used as a boosting algorithm over Haar-like features in face detection. It performs very effectively on single-distribution data, but when detecting frontal and profile face images at the same time, Adaboost shows its limitations on multiple-distribution data because it uses a linear combination of basic classifiers. This paper proposes HDCT, a modified decision tree algorithm for Haar-like features. We then tested the performance of HDCT against Adaboost on multiple-distribution image recognition.
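The baseline the abstract describes can be illustrated concretely. Below is a minimal sketch, not the paper's code, of a Haar-like two-rectangle feature computed from an integral image, and of an Adaboost-style strong classifier formed as the linear combination of threshold stumps that the abstract identifies as the limiting structure. All function names and the stump tuple layout are assumptions for illustration.

```python
import numpy as np

def integral_image(img):
    """Cumulative sum over rows then columns, so any rectangle
    sum can be read off in at most four lookups (Viola-Jones)."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] recovered from the integral image ii."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        total -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

def haar_two_rect(ii, r, c, h, w):
    """Two-rectangle Haar-like feature: left half minus right half
    of an h-by-w window anchored at (r, c)."""
    left = rect_sum(ii, r, c, r + h, c + w // 2)
    right = rect_sum(ii, r, c + w // 2, r + h, c + w)
    return left - right

def adaboost_predict(ii, stumps):
    """Adaboost strong classifier: the sign of a weighted linear
    combination of weak threshold classifiers over Haar features.
    Each stump is (alpha, polarity, theta, geometry); a single
    linear combination like this fits one distribution at a time,
    which is the limitation the paper addresses with a tree."""
    score = sum(
        alpha * (1 if polarity * haar_two_rect(ii, *geom)
                 < polarity * theta else -1)
        for alpha, polarity, theta, geom in stumps
    )
    return 1 if score >= 0 else -1
```

A decision tree over the same features would instead route a window through a sequence of single-feature tests, letting different branches specialize on different distributions (e.g. frontal vs. profile faces) rather than pooling all weak classifiers into one sum.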
Keywords
Adaboost; Haar-like; Decision Tree; Pattern Recognition;