
Fingertip Extraction and Hand Motion Recognition Method for Augmented Reality Applications  

Lee, Jeong-Jin (School of Digital Media, The Catholic University of Korea)
Kim, Jong-Ho (Department of Computer Engineering, Seokyeong University)
Kim, Tae-Young (Department of Computer Engineering, Seokyeong University)
Abstract
In this paper, we propose a fingertip extraction and hand motion recognition method for augmented reality applications. First, an input image is transformed from the RGB color space to the HSV color space. The hand area is then segmented by double thresholding of the H and S values, region growing, and connected component analysis. Next, the end points of the index finger and thumb are extracted using morphological operations and image subtraction, for use in a virtual keyboard and mouse interface. Finally, the angle between the end points of the index finger and thumb, measured about the center of mass of the palm, is calculated to detect contact between the index finger and thumb, which implements a mouse button click. Experimental results on various input images show that our method segments the hand and fingertips and recognizes hand movements quickly and accurately. The proposed method can be used as an input interface for augmented reality applications.
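The segmentation and click-detection steps outlined above can be sketched roughly in Python with OpenCV. The threshold ranges, the largest-component filter, and the function names (segment_hand, fingertip_angle) are illustrative assumptions rather than the paper's actual implementation; in particular, the paper's region growing and morphology/subtraction fingertip extraction are not reproduced here.

    import cv2
    import numpy as np

    def segment_hand(bgr_image, h_range=(0, 20), s_range=(40, 255)):
        # Double thresholding of H and S in HSV space; the ranges here are
        # illustrative skin-tone bounds, not values taken from the paper.
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        lower = np.array([h_range[0], s_range[0], 0], dtype=np.uint8)
        upper = np.array([h_range[1], s_range[1], 255], dtype=np.uint8)
        mask = cv2.inRange(hsv, lower, upper)
        # Connected component analysis: keep only the largest blob as the
        # hand area (a stand-in for the paper's region growing step).
        n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
        if n <= 1:
            return mask  # no foreground found
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
        return np.where(labels == largest, 255, 0).astype(np.uint8)

    def fingertip_angle(palm_center, index_tip, thumb_tip):
        # Angle (degrees) between the two fingertip points as seen from the
        # palm center of mass; a small angle suggests the thumb and index
        # finger are touching, which can be interpreted as a mouse click.
        va = np.asarray(index_tip, dtype=float) - np.asarray(palm_center, dtype=float)
        vb = np.asarray(thumb_tip, dtype=float) - np.asarray(palm_center, dtype=float)
        cos_t = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9)
        return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

As a usage sketch, the palm center of mass could be taken from cv2.moments(mask) as (m10/m00, m01/m00), and a click reported whenever fingertip_angle drops below a chosen threshold; how the two fingertip points themselves are extracted is left out of this sketch.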
Keywords
Fingertip Extraction; Hand Motion Recognition; Hand Interface; Augmented Reality Application