
Hand Gesture Interface Using Mobile Camera Devices  

Lee, Chan-Su (Dept. of Electronic Engineering, Yeungnam University)
Chun, Sung-Yong (Dept. of Electronic Engineering, Yeungnam University)
Sohn, Myoung-Gyu (Future Industry Convergence Technology Research Division, DGIST)
Lee, Sang-Heon (Future Industry Convergence Technology Research Division, DGIST)
Abstract
This paper presents a hand-motion tracking method for a hand gesture interface using the camera in mobile devices such as smartphones and PDAs. When the camera moves according to the user's hand gesture, global optical flows are generated; robust hand-movement estimation is therefore possible by selecting the dominant optical flow based on a histogram analysis of motion directions. A continuous hand gesture is segmented into unit gestures by estimating the motion state from the motion phase, which is determined by the velocity and acceleration of the estimated hand motion. Feature vectors are extracted during the movement states, and each hand gesture is recognized at its end state. A support vector machine (SVM), a k-nearest neighbor classifier, and a normal Bayes classifier are used for classification; the SVM achieves an 82% recognition rate for 14 hand gestures.
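The dominant-motion estimation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes per-feature optical-flow vectors (e.g. from a pyramidal Lucas-Kanade tracker) are already available as (dx, dy) pairs, builds a histogram over motion directions, and averages only the vectors in the most populated bin to suppress outlier flows. The function name, bin count, and input format are illustrative assumptions.

```python
import math

def dominant_motion(flows, n_bins=8):
    """Estimate the dominant global motion from optical-flow vectors.

    flows: list of (dx, dy) displacement vectors, one per tracked feature
           (assumed input format; e.g. from a Lucas-Kanade tracker).
    Quantizes each vector's direction into one of n_bins histogram bins,
    selects the most populated bin, and averages only the vectors that
    fall in that bin, so outlier flows do not corrupt the estimate.
    """
    bins = [[] for _ in range(n_bins)]
    bin_width = 2 * math.pi / n_bins
    for dx, dy in flows:
        angle = math.atan2(dy, dx) % (2 * math.pi)  # direction in [0, 2*pi)
        bins[int(angle / bin_width) % n_bins].append((dx, dy))
    dominant = max(bins, key=len)  # most populated direction bin
    if not dominant:
        return (0.0, 0.0)  # no flow vectors at all
    n = len(dominant)
    return (sum(v[0] for v in dominant) / n,
            sum(v[1] for v in dominant) / n)

# Five rightward flows plus two outliers: the rightward bin dominates,
# so the outliers are excluded from the averaged estimate.
flows = [(1.0, 0.0)] * 5 + [(0.0, 1.0), (-1.0, 0.1)]
print(dominant_motion(flows))  # -> (1.0, 0.0)
```

In the paper's pipeline, an estimate like this would feed the motion-phase analysis (velocity and acceleration of the tracked motion) that segments the continuous gesture into unit gestures.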
Keywords
Hand gesture recognition; motion estimation; gesture interface; optical flow; mobile device
Citations & Related Records
  • Reference
1 J. Lim, S. Kim, C. Lee, G. S. Lee, H. J. Yang, E. M. Lee, "Cursive Script Recognition in Wine Label Images Using Over-Segmentation and Character Combination Filtering," Proc. of the KIISE Fall Congress 2009, vol.36, no.2(A), pp.222-223, 2009. (in Korean)
2 S. Mitra, T. Acharya, "Gesture Recognition: A survey," IEEE Trans. Systems, Man and Cybernetics-Part C, vol.37, no.3, pp.311-324, May 2007.
3 M. Rohs, "Real-World Interaction with Camera-Phones," Ubiquitous Computing Systems, LNCS, vol.3598, pp.74-89, 2005.
4 S. Winkler, K. Rangaswamy, Z.Y. Zhou, "Intuitive user interface for mobile devices based on visual motion detection," Proc. SPIE, Multimedia on Mobile Device, vol.6507, pp.65070V, 2007.
5 J. Hwang, G. J. Kim, N. Kim, "Camera based relative motion tracking for hand-held virtual reality," In Proc. NICOGRAPH International, 2006.
6 M. Barnard, J. Hannuksela, P. Sangi, J. Heikkila, "A vision based motion interface for mobile phones," In Proc. of International Conference on Computer Vision Systems, 2007.
7 A. Haro, K. Mori, T. Capin, S. Wilkinson, "Mobile camera-based user interaction," LNCS 3766, Computer Vision in Human-Computer Interaction, pp.79-89, 2005.
8 M. S. Ko, K. H. Lee, C. W. Kim, J. H. Ahn, I. J. Kim, "An Implementation of User Interface Using Vision-based Gesture Recognition," Proc. of the KIISE Korea Computer Congress, vol.35, no.1(C), pp.507-511, 2008.
9 J. Shi and C. Tomasi, "Good features to track," In Proc. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), pp.594-600, June 1994.
10 B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," In Proc. Image Understanding Workshop, pp.121-130, 1981.
11 A. Kendon, "Gesticulation and speech: two aspects of the process of utterance," The relationship between verbal and nonverbal communication, pp.207-227, 1980.
12 Chan-Su Lee, Gyu-tae Park, Jong-Sung Kim, Zeungnam Bien, Won Jang, Sung-Kwon Kim, "Real-time Recognition System of Korean Sign Language based on Elementary Components," IEEE FUZZ'97, pp.1463-1468, 1997.
13 H. Sung, H. Byun, "3D face tracking using MLESAC motion estimation based particle filter," Proc. of the KIISE Fall Congress 2009, vol.36, no.2(A), pp.214-215, 2009. (in Korean)