• Title/Summary/Keyword: Hand Motion Tracking (손 동작 추적)


A Hand Gesture Recognition System using 3D Tracking Volume Restriction Technique (3차원 추적영역 제한 기법을 이용한 손 동작 인식 시스템)

  • Kim, Kyung-Ho;Jung, Da-Un;Lee, Seok-Han;Choi, Jong-Soo
    • Journal of the Institute of Electronics and Information Engineers / v.50 no.6 / pp.201-211 / 2013
  • In this paper, we propose a hand tracking and gesture recognition system. Our system employs a depth capture device to obtain 3D geometric information of the user's bare hand. In particular, we build a flexible tracking volume that restricts the hand tracking area, which avoids many of the problems of conventional object detection/tracking systems. The proposed system computes a running average of the hand position, and the tracking volume is actively adjusted according to statistical information derived from the uncertainty of the user's hand motion in 3D space. Once the position of the user's hand is obtained, the system detects stretched fingers to recognize the finger gesture of the user's hand. To test the proposed framework, we built an NUI system using the proposed technique and verified that it remains very stable even when multiple objects are present simultaneously in a crowded environment, as well as when the scene is temporarily occluded. We also verified that the system maintains a running speed of 24-30 frames per second throughout the experiments.
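The adaptive tracking-volume idea described in this abstract can be sketched roughly as follows: keep a running mean and variance of the 3D hand position and derive an axis-aligned box from them, so the volume grows when the motion is uncertain and shrinks when it is steady. This is an illustrative approximation, not the authors' implementation; the class name and the parameters alpha, k, and min_half_size are assumptions.

```python
import numpy as np

class AdaptiveTrackingVolume:
    """Running mean/variance of 3D hand positions and the tracking box
    derived from them (illustrative sketch only)."""

    def __init__(self, alpha=0.1, k=3.0, min_half_size=0.05):
        self.alpha = alpha                  # smoothing factor of the running average
        self.k = k                          # how many standard deviations the box spans
        self.min_half_size = min_half_size  # lower bound on the box half-extent
        self.mean = None
        self.var = np.zeros(3)

    def update(self, position):
        p = np.asarray(position, dtype=float)
        if self.mean is None:
            self.mean = p
            return
        diff = p - self.mean
        # exponentially weighted running average and variance
        self.mean = self.mean + self.alpha * diff
        self.var = (1.0 - self.alpha) * (self.var + self.alpha * diff ** 2)

    def contains(self, position):
        """True if the point lies inside the current tracking volume."""
        half = np.maximum(self.k * np.sqrt(self.var), self.min_half_size)
        return bool(np.all(np.abs(np.asarray(position) - self.mean) <= half))
```

Candidate hand detections outside the box would be rejected before the finger-detection step, which is how a restricted tracking area can suppress distracting objects in a crowded scene.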

Face and Hand Tracking Algorithm for Sign Language Recognition (수화 인식을 위한 얼굴과 손 추적 알고리즘)

  • Park, Ho-Sik;Bae, Cheol-Soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.31 no.11C / pp.1071-1076 / 2006
  • In this paper, we develop face and hand tracking for a sign language recognition system. The system is divided into two stages: an initial stage and a tracking stage. In the initial stage, skin features are used to localize the signer's face and hands. An ellipse model in CbCr space is constructed and used to detect skin color. After the skin regions have been segmented, face and hand blobs are identified using size and facial features, under the assumption that the face moves less than the hands in this signing scenario. In the tracking stage, motion estimation is applied only to the hand blobs, where the first and second derivatives of position are used to predict the hand locations. We observed errors in the tracked position between consecutive frames in which the velocity changes abruptly. To improve tracking performance, the proposed algorithm compensates for this error by using an adaptive search area to re-compute the hand blobs. The experimental results indicate that the proposed method decreases the prediction error by up to 96.87% with a negligible increase in computational complexity of up to 4%.
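The prediction-and-compensation step can be sketched with simple finite differences: the next hand position is extrapolated from the velocity and acceleration of the last three frames, and the re-detection window is widened when the prediction error grows. This is a hedged sketch of the idea in the abstract, not the paper's exact algorithm; the function names and the gain factor are assumptions.

```python
import numpy as np

def predict_next_position(p_oldest, p_prev, p_curr):
    """Constant-acceleration extrapolation of the hand position from the last
    three frames; finite differences stand in for the first and second
    derivatives mentioned in the abstract."""
    v = p_curr - p_prev                          # first derivative (velocity)
    a = (p_curr - p_prev) - (p_prev - p_oldest)  # second derivative (acceleration)
    return p_curr + v + 0.5 * a

def adapt_search_area(base_size, prediction_error, gain=1.5):
    """Grow the re-detection window in proportion to the last prediction error
    so an abrupt velocity change does not lose the hand blob."""
    return base_size + gain * prediction_error
```

In use, the predicted position would seed the skin-color search in the next frame, and the search window would be enlarged whenever the measured blob lands far from the prediction.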

Hand posture recognition robust to rotation using temporal correlation between adjacent frames (인접 프레임의 시간적 상관 관계를 이용한 회전에 강인한 손 모양 인식)

  • Lee, Seong-Il;Min, Hyun-Seok;Shin, Ho-Chul;Lim, Eul-Gyoon;Hwang, Dae-Hwan;Ro, Yong-Man
    • Journal of Korea Multimedia Society / v.13 no.11 / pp.1630-1642 / 2010
  • Recently, there has been an increasing need to develop Hand Gesture Recognition (HGR) techniques for vision-based interfaces. Since a hand gesture is defined as a consecutive change of hand posture, a Hand Posture Recognition (HPR) algorithm is also required. Among the factors that degrade HPR performance, we focus on rotation. To achieve rotation-invariant HPR, we propose a method that exploits the high correlation between adjacent frames of a video, considering the typical HGR environment. The proposed method introduces the template-update step of object tracking based on this property, which distinguishes it from previous works based on still images. To compare the proposed method with previous methods such as template matching, PCA, and LBP, we performed experiments on videos that contain hand rotation. The accuracy of the proposed method is 22.7%, 14.5%, 10.7%, and 4.3% higher than ordinary template matching, template matching using the KL transform, PCA, and LBP, respectively.
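The template-update idea (exploiting the correlation between adjacent frames) can be sketched as follows, assuming OpenCV: the template is matched in the current frame and the matched patch is blended back into the template, so slow rotation between frames is absorbed. The blending weight alpha and the matching method are assumptions, not the paper's settings.

```python
import cv2

def track_and_update(frame_gray, template, alpha=0.2):
    """Match the template in the current frame, then blend the matched patch
    back into the template (illustrative sketch of template updating)."""
    result = cv2.matchTemplate(frame_gray, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    x, y = max_loc
    h, w = template.shape[:2]
    patch = frame_gray[y:y + h, x:x + w]
    # running appearance update: new template = alpha*patch + (1-alpha)*old template
    updated = cv2.addWeighted(patch, alpha, template, 1.0 - alpha, 0)
    return (x, y), max_val, updated
```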

Skin Color Based Hand and Finger Detection for Gesture Recognition in CCTV Surveillance (CCTV 관제에서 동작 인식을 위한 색상 기반 손과 손가락 탐지)

  • Kang, Sung-Kwan;Chung, Kyung-Yong;Rim, Kee-Wook;Lee, Jung-Hyun
    • The Journal of the Korea Contents Association / v.11 no.10 / pp.1-10 / 2011
  • In this paper, we propose a skin-color-based hand and finger detection technique for gesture recognition in CCTV surveillance. The aim of this paper is to present a methodology for hand detection and to propose a finger detection method. The detected hand and finger can be used to implement a non-contact mouse, and the technology can be used to control home devices such as a home theater or television. Skin color is used to segment the hand region from the background, and a contour is extracted from the segmented hand. Analysis of the contour gives the location of the fingertip within the hand. After the fingertip is located, the system tracks it using the R channel alone, and applies difference images during hand-motion recognition, for example to remove irrelevant image content, which makes the system robust. We describe experiments on fingertip tracking and finger gesture recognition, and the results show an accuracy above 96%.
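A minimal sketch of the skin-segmentation and fingertip-location steps, assuming OpenCV 4: pixels inside a rough skin range in YCrCb space form the hand mask, the largest contour is taken as the hand, and the contour point farthest from the palm centre (computed from image moments) is taken as a fingertip candidate. The thresholds are generic illustrative values, not those of the paper.

```python
import cv2
import numpy as np

def detect_hand_and_fingertip(frame_bgr):
    """Segment skin-coloured pixels, keep the largest contour as the hand,
    and return the palm centre plus a fingertip candidate."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))   # rough skin range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    center = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])
    points = hand.reshape(-1, 2).astype(float)
    fingertip = points[np.argmax(np.linalg.norm(points - center, axis=1))]
    return center, fingertip
```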

A Design and Implementation of Monitor Control Scheme using Kinect Sensor (Kinect Sensor를 활용한 모니터 제어 기법 설계 및 구현)

  • Lee, Won-Joo;Lee, Hyun Jin;Chu, Ji Hyun;Kim, Young Suk;Kim, Do Young;So, Jin Su
    • Proceedings of the Korean Society of Computer Information Conference / 2014.07a / pp.33-34 / 2014
  • In this paper, we design and implement a Kinect-based monitor control scheme. The scheme uses Kinect's Joint feature to recognize the structure of the body and obtain the joints of the hands and head. The Kinect camera tracks the hand motion and converts the Joint values of the corresponding body parts into monitor coordinates, implementing a function that moves the mouse according to the hand motion. In addition, for the F5, Esc, Right, and Left keys used in PowerPoint, a mouse click event is generated when a Joint coordinate exceeds a predefined range, implementing a function that controls the PowerPoint slide show.
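The joint-to-monitor mapping can be illustrated with a short sketch: a hand-joint coordinate in camera space is normalized over an assumed interaction range and scaled to pixel coordinates, and a click fires when the hand is pushed forward past a threshold relative to the head joint. The ranges and threshold below are assumptions for illustration, not values from the paper.

```python
def joint_to_screen(joint_x, joint_y, screen_w=1920, screen_h=1080,
                    x_range=(-0.5, 0.5), y_range=(0.0, 1.0)):
    """Map a Kinect hand-joint coordinate (metres, camera space) to a monitor
    pixel position; the interaction ranges are illustrative assumptions."""
    nx = (joint_x - x_range[0]) / (x_range[1] - x_range[0])
    ny = (joint_y - y_range[0]) / (y_range[1] - y_range[0])
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    return int(nx * (screen_w - 1)), int((1.0 - ny) * (screen_h - 1))

def should_click(hand_z, head_z, push_threshold=0.4):
    """Fire a click event when the hand joint is pushed forward past a fixed
    distance relative to the head joint (threshold is an assumption)."""
    return (head_z - hand_z) > push_threshold
```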


Hand Gesture Interface Using Mobile Camera Devices (모바일 카메라 기기를 이용한 손 제스처 인터페이스)

  • Lee, Chan-Su;Chun, Sung-Yong;Sohn, Myoung-Gyu;Lee, Sang-Heon
    • Journal of KIISE:Computing Practices and Letters / v.16 no.5 / pp.621-625 / 2010
  • This paper presents a hand motion tracking method for a hand gesture interface using the camera of a mobile device such as a smartphone or PDA. When the camera moves with the user's hand gesture, global optical flow is generated; robust hand movement estimation is therefore possible by taking the dominant optical flow obtained from a histogram analysis of the motion directions. A continuous hand gesture is segmented into unit gestures by estimating the motion state from the motion phase, which is determined by the velocity and acceleration of the estimated hand motion. Feature vectors are extracted during the movement states, and hand gestures are recognized at the end state of each gesture. A support vector machine (SVM), a k-nearest-neighbor classifier, and a normal Bayes classifier are used for classification. The SVM achieves an 82% recognition rate for 14 hand gestures.
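The dominant-optical-flow step can be sketched as follows, assuming OpenCV: dense Farnebäck flow between consecutive frames, a histogram of the flow angles over the sufficiently moving pixels, and the centre of the tallest bin as the dominant motion direction. The bin count and magnitude threshold are assumptions.

```python
import cv2
import numpy as np

def dominant_motion_direction(prev_gray, curr_gray, bins=8, mag_threshold=1.0):
    """Estimate the dominant global motion direction between two frames from
    a histogram of dense optical-flow angles (sketch of the abstract's idea)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    moving = mag > mag_threshold                 # ignore near-static pixels
    if not np.any(moving):
        return None
    hist, edges = np.histogram(ang[moving], bins=bins, range=(0.0, 2.0 * np.pi))
    k = int(np.argmax(hist))
    return 0.5 * (edges[k] + edges[k + 1])       # dominant angle in radians
```

The per-frame dominant direction (and its magnitude) would then feed the motion-phase segmentation and the SVM feature vectors described in the abstract.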

Leap-Motion Based Tracking Framework for Practice and Analysis of User's Arm Muscle (확장현실에서 사용자의 팔 근육 연습 및 분석을 위한 립모션 기반 추적 프레임워크)

  • Park, Seonga;Park, Soyeon;Kim, Jong-Hyun
    • Proceedings of the Korean Society of Computer Information Conference / 2020.07a / pp.469-472 / 2020
  • In this paper, we propose a new framework that computes hand movement using a Leap Motion device and uses it not only for juggling exercise motions but also for practicing the arm muscles involved. Because the proposed method runs in real time, the analysis can be tailored to the user's motion. The framework consists of three main parts: 1) an event trigger for the user bouncing a ball is defined from the wrist movement; 2) a parabola-based particle technique maps juggling-style motion onto the balls relative to the positions of the user's hands; and 3) a technique visualizes the amount of muscle activity based on wrist flexion. As a result, the framework supports a real-time juggling game while analyzing the movement of the user's arm muscles in real time.
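The parabola-based particle mapping can be illustrated with a short sketch that samples a ball path between the throwing and catching hand positions, adding a vertical arc on top of linear interpolation (Leap Motion's y-up convention is assumed). The peak height and step count are illustrative assumptions.

```python
import numpy as np

def parabolic_trajectory(start, end, peak_height=0.3, steps=30):
    """Sample a parabolic ball path between two hand positions; y is assumed
    to be the vertical axis (illustrative sketch of the mapping idea)."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    t = np.linspace(0.0, 1.0, steps)[:, None]
    points = (1.0 - t) * start + t * end                           # straight-line part
    points[:, 1] += peak_height * 4.0 * t[:, 0] * (1.0 - t[:, 0])  # vertical arc
    return points
```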


Development of Motion Recognition Platform Using Smart-Phone Tracking and Color Communication (스마트 폰 추적 및 색상 통신을 이용한 동작인식 플랫폼 개발)

  • Oh, Byung-Hun
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.17 no.5 / pp.143-150 / 2017
  • In this paper, we propose a novel motion recognition platform using smartphone tracking and color communication. The interface requires only a camera and the user's own smartphone, rather than expensive equipment, to provide motion control. The platform recognizes the user's gestures by tracking the 3D distance and rotation angle of the smartphone, which essentially acts as a motion controller in the user's hand. A color-coded communication method using RGB color combinations is also included in the interface: users can conveniently send or receive text data through this function, and the data can be transferred continuously even while the user is performing gestures. We present the results of implementing viable contents based on the proposed motion recognition platform.
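The color-coded communication idea can be illustrated, very loosely, by packing the bits of each byte into coarse RGB levels that a camera can recover. The abstract does not specify the actual coding scheme, so the bit split below is purely a hypothetical example.

```python
def byte_to_rgb(value):
    """Encode one byte as a coarse RGB colour by splitting its bits across
    the three channels (hypothetical illustration, not the paper's code)."""
    r = (value >> 5) & 0b111            # top 3 bits
    g = (value >> 2) & 0b111            # middle 3 bits
    b = value & 0b11                    # bottom 2 bits
    return (r * 36, g * 36, b * 85)     # spread into displayable 0-255 levels

def rgb_to_byte(rgb):
    """Inverse of byte_to_rgb: quantise each channel back to its bit group."""
    r, g, b = rgb
    return (((round(r / 36) & 0b111) << 5)
            | ((round(g / 36) & 0b111) << 2)
            | (round(b / 85) & 0b11))
```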

Real-time Motion Generation of Virtual Character using the Stereo Images (스테레오 영상을 이용한 가상 캐릭터의 실시간 동작 생성)

  • Lee, Ran-Hee;Kim, Sung-En;Park, Chang-Jun;Lee, In-Ho
    • Proceedings of the Korea Information Processing Society Conference / 2002.11a / pp.31-34 / 2002
  • This paper describes a method for generating the motion of a virtual character in real time using stereo images captured by two cameras. To capture the performer's movement, the system places two synchronized color CCD cameras in front of the performer, to the left and right. The stereo images from the cameras are analyzed to extract the 2D feature points of the root at the center of the body and of end-effectors such as the head, hands, and feet, and the 3D positions of these feature points are computed using the cameras' projection matrices and a tracking algorithm. The 3D positions of the root and end-effectors are filtered to remove noise and fed into an inverse kinematics algorithm, which computes the positions of the intermediate joints precisely while considering the anatomical constraints of human joints, the interdependence between joints, and smooth transitions between consecutive frames. By generating the intermediate joint positions, the system obtains the information of all joints for an arbitrary performer's movement, and by applying the acquired motion data to a virtual character, the character's motion can be generated in real time.
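The step that turns 2D feature points and the cameras' projection matrices into 3D positions corresponds to standard stereo triangulation and can be sketched with OpenCV as follows; the function and parameter names are assumptions.

```python
import cv2
import numpy as np

def triangulate_feature_points(P_left, P_right, pts_left, pts_right):
    """Recover 3D positions of the tracked feature points (root, head, hands,
    feet) from their 2D locations in two synchronised views, using the 3x4
    camera projection matrices (standard triangulation sketch)."""
    pts_left = np.asarray(pts_left, dtype=float).T     # shape (2, N)
    pts_right = np.asarray(pts_right, dtype=float).T
    hom = cv2.triangulatePoints(P_left, P_right, pts_left, pts_right)  # (4, N) homogeneous
    return (hom[:3] / hom[3]).T                        # Euclidean points, shape (N, 3)
```

The resulting 3D root and end-effector positions would then be filtered and passed to the inverse-kinematics stage described in the abstract.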


Hand Gesture Recognition Algorithm Robust to Complex Image (복잡한 영상에 강인한 손동작 인식 방법)

  • Park, Sang-Yun;Lee, Eung-Joo
    • Journal of Korea Multimedia Society / v.13 no.7 / pp.1000-1015 / 2010
  • In this paper, we propose a novel algorithm for hand gesture recognition. Hand detection is based on human skin color; boundary energy information is used to locate the hand region accurately, and the moment method is then employed to locate the center of the palm. Hand gesture recognition is separated into two steps. First, hand posture recognition: parallel neural networks handle posture recognition, and the pattern of a hand posture is extracted by a fitting-ellipse method, which divides the detected hand region with 12 ellipses and computes the rate of white pixels along each ellipse line. The pattern is fed into a neural network with 12 input nodes and 4 output nodes, each output node producing a value between 0 and 1, so a posture is represented by the combination of the 4 output codes. Second, hand gesture tracking and recognition: a Kalman filter predicts the position of the gesture to create a position sequence, and the distance relationships between positions are used to confirm the gesture. Simulations were performed on Windows XP to evaluate the efficiency of the algorithm. For hand posture recognition, 300 training images were used to train the recognizer and 200 images to test it, of which 194 were classified correctly. To test the hand tracking and recognition part, we performed 1,200 gestures (400 of each gesture), of which 1,002 were recognized correctly. These results show that the proposed algorithm performs acceptably for detecting the hand and its gestures.
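Two of the building blocks named in the abstract, the palm centre from image moments and the Kalman-filter position prediction, can be sketched with OpenCV as follows; the noise covariances and the constant-velocity state model are illustrative assumptions, not the paper's settings.

```python
import cv2
import numpy as np

def palm_center(hand_mask):
    """Palm centre from the image moments of the segmented hand mask."""
    m = cv2.moments(hand_mask, binaryImage=True)
    if m["m00"] == 0:
        return None
    return np.float32([[m["m10"] / m["m00"]], [m["m01"] / m["m00"]]])

def make_palm_kalman(dt=1.0):
    """Constant-velocity Kalman filter over the 2D palm-centre position,
    illustrating the prediction step used to build the position sequence."""
    kf = cv2.KalmanFilter(4, 2)                        # state: x, y, vx, vy
    kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                    [0, 1, 0, dt],
                                    [0, 0, 1,  0],
                                    [0, 0, 0,  1]], dtype=np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    return kf
```

In each frame, the filter's `predict()` gives the expected palm position and `correct(palm_center(mask))` folds in the new measurement, producing the position sequence whose inter-point distances confirm the gesture.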