
Welfare Interface using Multiple Facial Features Tracking  

Ju, Jin-Sun (Dept. of Advanced Technology Fusion, School of Internet and Multimedia Eng., Konkuk University)
Shin, Yun-Hee (Dept. of Advanced Technology Fusion, School of Internet and Multimedia Eng., Konkuk University)
Kim, Eun-Yi (School of Internet and Multimedia Eng., Konkuk University)
Abstract
We propose a welfare interface using multiple facial feature tracking, which can efficiently implement various mouse operations. The proposed system consists of five modules: face detection, eye detection, mouth detection, facial feature tracking, and mouse control. The facial region is first obtained using a skin-color model and connected-component (CC) analysis. Thereafter, the eye regions are localized using a neural network (NN)-based texture classifier that discriminates the facial region into eye and non-eye classes, and the mouth region is localized using an edge detector. Once the eye and mouth regions are localized, they are continuously and accurately tracked by the mean-shift algorithm and template matching, respectively. Based on the tracking results, mouse operations such as movement and clicks are implemented. To assess the validity of the proposed system, it was applied to an interface system for a web browser and tested on a group of 25 users. The results show that our system has an accuracy of 99% and processes more than 21 frames/sec on a PC for 320×240 input images; as such, it provides user-friendly and convenient access to a computer with real-time operation.
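The face-detection step in the abstract combines a skin-color model with connected-component analysis. As a minimal sketch of that idea, the toy code below classifies pixels with an illustrative RGB skin rule and then takes the largest 4-connected component of skin pixels as the face region. The thresholds in `is_skin` and all function names are assumptions for illustration, not the authors' actual model.

```python
# Sketch: skin-color pixel classification + connected-component analysis,
# keeping the largest skin-colored region as the candidate face.
from collections import deque

def is_skin(rgb):
    """Toy skin-color rule (assumption): reddish, moderately bright pixels."""
    r, g, b = rgb
    return r > 95 and g > 40 and b > 20 and r > g and r > b

def largest_skin_component(img):
    """Return the set of (row, col) pixels forming the largest
    4-connected component of skin-colored pixels in `img`
    (a list of rows of (r, g, b) tuples)."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    best = set()
    for y in range(h):
        for x in range(w):
            if seen[y][x] or not is_skin(img[y][x]):
                continue
            # BFS flood fill collects one connected component.
            comp, queue = set(), deque([(y, x)])
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                comp.add((cy, cx))
                for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                    if 0 <= ny < h and 0 <= nx < w \
                            and not seen[ny][nx] and is_skin(img[ny][nx]):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(comp) > len(best):
                best = comp
    return best
```

In the full system, the bounding box of this component would be passed on to the eye and mouth localizers; a production version would typically work in a chromaticity space rather than raw RGB and use a learned skin model.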
Keywords
Intelligent interface; welfare interface; facial feature detection; facial feature recognition