• Title/Summary/Keyword: Gesture Recognition.

Search Results: 558

(A Comparison of Gesture Recognition Performance Based on Feature Spaces of Angle, Velocity and Location in HMM Model) (HMM인식기 상에서 방향, 속도 및 공간 특징량에 따른 제스처 인식 성능 비교)

  • 윤호섭;양현승
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.5_6
    • /
    • pp.430-443
    • /
    • 2003
  • The objective of this paper is to evaluate the most useful feature vector space among the angle, velocity, and location features of a gesture trajectory, which is obtained by extracting hand regions from consecutive input images and tracking them by connecting their positions. For this purpose, a gesture tracking algorithm using color and motion information is developed. The recognition module is an HMM, chosen for its ability to adapt to time-varying data. The proposed algorithm was applied to a database containing 4,800 alphabetical handwriting gestures from 20 persons, each of whom was asked to draw his/her handwriting gestures five times for each of the 48 characters.
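The three feature spaces the abstract compares (angle, velocity, location) can be derived from a tracked hand trajectory roughly as follows. This is a minimal sketch, not the paper's implementation; the function name and the 16-bin angle quantization are illustrative assumptions.

```python
import math

def trajectory_features(points, num_angle_bins=16):
    """Convert a gesture trajectory (list of (x, y) hand positions from
    consecutive frames) into per-step angle, velocity, and location
    features -- the three feature spaces compared in the paper."""
    feats = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        angle = math.atan2(dy, dx) % (2 * math.pi)   # direction of motion
        angle_code = int(angle / (2 * math.pi) * num_angle_bins) % num_angle_bins
        velocity = math.hypot(dx, dy)                # speed in pixels per frame
        feats.append({"angle": angle_code,
                      "velocity": velocity,
                      "location": (x1, y1)})
    return feats
```

The discrete angle codes (and quantized velocity/location values) would then serve as the observation symbols fed to the HMM recognizer.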

A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction (강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식)

  • Lee, Lae-Kyoung;An, Su-Yong;Oh, Se-Young
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.18 no.4
    • /
    • pp.328-336
    • /
    • 2012
  • In this paper, we propose a robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition method for natural, human-like HRI (Human-Robot Interaction). First, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust YCbCr skin color model with Haar-like feature based AdaBoost detection. Using the extracted hand candidate regions, we estimate the palm region and fingertip position from distance-transform based voting and the geometrical features of hands. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize pre-defined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region, with an accurately extracted fingertip and its angle, but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures.
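The YCbCr skin-color segmentation step used here (and in several other entries below) amounts to converting RGB to YCbCr and thresholding the chrominance channels. A minimal numpy sketch, assuming the common literature Cb/Cr ranges rather than the paper's own trained model:

```python
import numpy as np

def skin_mask_ycbcr(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Segment skin-colored pixels by converting an RGB image (H x W x 3,
    uint8) to YCbCr and thresholding the chrominance channels.
    The Cb/Cr ranges are common literature values, not the paper's."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    # ITU-R BT.601 full-range RGB -> Cb/Cr conversion
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
```

In the paper the resulting mask is intersected with Haar/AdaBoost detections to form the hand candidate regions.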

A method for image-based shadow interaction with virtual objects

  • Ha, Hyunwoo;Ko, Kwanghee
    • Journal of Computational Design and Engineering
    • /
    • v.2 no.1
    • /
    • pp.26-37
    • /
    • 2015
  • Many researchers have been investigating interactive portable projection systems such as mini-projectors. In exhibition halls and museums, there is also a trend toward using interactive projection systems to make viewing more exciting and impressive. They can likewise be applied in the field of art, for example in creating shadow plays. The key requirement of interactive portable projection systems is to recognize the user's gestures in real time. In this paper, a vision-based shadow gesture recognition method is proposed for interactive projection systems. The gesture recognition method is based on the screen image obtained by a single web camera. The method separates only the shadow area by combining the binary image with an input image, using a learning algorithm that isolates the background from the input image. The region of interest is identified by labeling the separated shadow regions, and hand shadows are then isolated using the convexity defects, convex hull, and moments of each region. To distinguish hand gestures, Hu's invariant moment method is used, and an optical flow algorithm is used for tracking the fingertip. Using this method, a few interactive applications are developed, which are presented in this paper.
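Hu's invariant moments, used here to distinguish hand-shadow shapes, are built from normalized central moments of the binary region. A sketch of the first two of the seven invariants, which are unchanged under translation, scaling, and rotation of the shape (the function name is illustrative):

```python
import numpy as np

def hu_first_two(mask):
    """Compute the first two of Hu's seven invariant moments for a
    binary shape mask."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)                       # zeroth moment = area in pixels
    xbar, ybar = xs.mean(), ys.mean()   # centroid
    dx, dy = xs - xbar, ys - ybar
    # normalized central moments: eta_pq = mu_pq / m00^(1 + (p+q)/2)
    eta20 = (dx ** 2).sum() / m00 ** 2
    eta02 = (dy ** 2).sum() / m00 ** 2
    eta11 = (dx * dy).sum() / m00 ** 2
    hu1 = eta20 + eta02
    hu2 = (eta20 - eta02) ** 2 + 4 * eta11 ** 2
    return hu1, hu2
```

Because the values are rotation-invariant, a 3x5 rectangle and its 90-degree rotation produce identical moments, which is what makes them suitable for matching hand shadows in arbitrary orientation.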

Implementation of Gesture Interface for Projected Surfaces

  • Park, Yong-Suk;Park, Se-Ho;Kim, Tae-Gon;Chung, Jong-Moon
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.1
    • /
    • pp.378-390
    • /
    • 2015
  • Image projectors can turn any surface into a display, and integrating a projected surface with a user interface transforms it into an interactive display with many possible applications. Hand gesture interfaces are often used with projector-camera systems, but hand detection through color image processing is affected by the surrounding environment: a lack of illumination and color detail greatly hinders the detection process and lowers the recognition success rate. In addition, the projection system itself can interfere through the projected image. To overcome these problems, a gesture interface based on depth images is proposed for projected surfaces. In this paper, a depth camera is used for hand recognition and for effectively extracting the area of the hand from the scene. A hand detection and finger tracking method based on depth images is proposed. Based on the proposed method, a touch interface for the projected surface is implemented and evaluated.
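The advantage of depth over color, as the abstract argues, is that the hand can be isolated without any illumination assumptions. A minimal sketch of one common depth-segmentation approach, assuming the hand is the object nearest the camera; the band tolerance is an assumed value, not one from the paper:

```python
import numpy as np

def nearest_blob_mask(depth, band_mm=80, invalid=0):
    """Keep only pixels within a depth band of the nearest valid
    measurement, assuming the hand is the closest object to the camera
    (typical when reaching toward a projected surface).  band_mm is an
    assumed hand-thickness tolerance."""
    valid = depth != invalid            # depth sensors report 0 for holes
    near = depth[valid].min()           # nearest valid depth in the scene
    return valid & (depth <= near + band_mm)
```

The resulting mask is immune to projector interference because the projected image changes pixel colors but not measured depth.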

Vision-based hand Gesture Detection and Tracking System (비전 기반의 손동작 검출 및 추적 시스템)

  • Park Ho-Sik;Bae Cheol-soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.12C
    • /
    • pp.1175-1180
    • /
    • 2005
  • We present a vision-based hand gesture detection and tracking system. Most conventional hand gesture recognition systems rely on simple hand detection methods such as background subtraction under assumed static observation conditions, which are not robust against camera motion, illumination changes, and so on. We therefore propose a statistical method that recognizes and detects hand regions in images using geometrical structures. Our hand tracking system also employs multiple cameras to reduce occlusion problems, and non-synchronous multiple observations enhance system scalability. In our experiments, the proposed method achieves a recognition rate of 99.28%, an improvement of 3.91% over the conventional appearance-based method.

Hand Gesture Recognition Using HMM(Hidden Markov Model) (HMM(Hidden Markov Model)을 이용한 핸드 제스처인식)

  • Ha, Jeong-Yo;Lee, Min-Ho;Choi, Hyung-Il
    • Journal of Digital Contents Society
    • /
    • v.10 no.2
    • /
    • pp.291-298
    • /
    • 2009
  • In this paper we propose a vision-based real-time hand gesture recognition method. To extract skin color, we convert the RGB color space into the YCbCr color space and use the CbCr components for the final extraction. To find the center of the extracted hand region, we apply a practical center-point extraction algorithm. We use a Kalman filter to track the hand region and an HMM (Hidden Markov Model), trained on six types of hand gesture images, to recognize gestures. We demonstrate the effectiveness of our algorithm through experiments.
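The Kalman tracking step in this entry can be sketched as a constant-velocity filter run independently per image axis. The process/measurement noise levels below are assumptions; the paper does not state its parameters.

```python
import numpy as np

def kalman_track(measurements, q=1e-3, r=1.0):
    """Constant-velocity Kalman filter over 1-D hand-center positions
    (one such filter per image axis).  Returns the filtered positions."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])  # state: [position, velocity]
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                       # assumed process noise
    R = np.array([[r]])                     # assumed measurement noise
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    out = []
    for z in measurements:
        x = F @ x                           # predict state forward one frame
        P = F @ P @ F.T + Q
        y = np.array([[z]]) - H @ x         # innovation from new measurement
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ y                       # correct the prediction
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out
```

For steadily moving hands the velocity state converges, so the filter smooths jitter in the detected center without lagging far behind the true position.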

Finger Directivity Recognition Algorithm using Shape Decomposition (형상분해를 이용한 손가락 방향성 인식 알고리즘)

  • Choi, Jong-Ho
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.4 no.3
    • /
    • pp.197-201
    • /
    • 2011
  • The use of gestures provides an attractive alternative to cumbersome interfaces for human-computer interaction. This has motivated a very active research area concerned with computer vision-based recognition of hand gestures. One of the most important issues in hand gesture recognition is recognizing the directivity of the finger, and the primitive elements extracted from a hand gesture contain very important information about it. In this paper, we propose an algorithm that recognizes finger directivity using the cross points of a circle and the sub-primitive elements. The radius of the circle is increased from the minimum radius enclosing the main primitive element until the circle includes the sub-primitive elements. Through experiments, we demonstrate the efficiency of the proposed algorithm.
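The circle-crossing idea can be sketched as follows: take the shape pixels that a circle of a given radius around the palm center crosses, and average their angles to get the finger direction. This is a simplified illustration under that reading of the abstract; the paper's actual primitive-element decomposition is not reproduced here.

```python
import math

def finger_direction(points, center, radius, tol=0.5):
    """Average the angular position of shape pixels lying on a circle of
    the given radius around the palm center; returns degrees, or None if
    the circle crosses no pixels."""
    cx, cy = center
    sx = sy = 0.0
    n = 0
    for x, y in points:
        d = math.hypot(x - cx, y - cy)
        if abs(d - radius) <= tol:          # pixel lies on the circle
            sx += (x - cx) / d              # accumulate unit direction
            sy += (y - cy) / d
            n += 1
    if n == 0:
        return None
    return math.degrees(math.atan2(sy, sx))
```

Growing the radius, as the abstract describes, moves the circle from the main primitive (the palm) outward until it intersects the sub-primitives (the fingers), where this directional estimate becomes meaningful.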

Kinect-based Motion Recognition Model for the 3D Contents Control (3D 콘텐츠 제어를 위한 키넥트 기반의 동작 인식 모델)

  • Choi, Han Suk
    • The Journal of the Korea Contents Association
    • /
    • v.14 no.1
    • /
    • pp.24-29
    • /
    • 2014
  • This paper proposes a Kinect-based human motion recognition model for 3D content control, which tracks the human body gesture through the infrared camera of the Kinect sensor. The proposed human motion model computes the distance variation of the body movement from the shoulder to the right and left hand, wrist, arm, and elbow. The recognized motions are classified into movement commands such as left, right, up, down, enlargement, downsizing, and selection. The proposed Kinect-based human motion recognition model is very natural and low-cost compared to contact-type gesture recognition technologies and device-based gesture technologies that require expensive hardware.
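The classification into the seven commands named in the abstract can be sketched as thresholding the displacement of a tracked hand joint between skeleton frames. The threshold and the z-axis convention (smaller z = closer to the sensor, so pushing toward it zooms in) are assumptions, not values from the paper.

```python
def classify_motion(start, end, threshold=0.15):
    """Map the displacement of a tracked hand joint between two skeleton
    frames (x, y, z in meters, Kinect skeleton space) to one of the
    command classes named in the paper."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    dz = end[2] - start[2]
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    if max(ax, ay, az) < threshold:
        return "selection"                  # hand held still
    if ax >= ay and ax >= az:               # dominant axis wins
        return "right" if dx > 0 else "left"
    if ay >= az:
        return "up" if dy > 0 else "down"
    return "enlargement" if dz < 0 else "downsizing"
```

A real system would accumulate the displacement over a short window of frames rather than a single pair, to suppress skeleton jitter.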

Stroke Based Hand Gesture Recognition by Analyzing a Trajectory of Polhemus Sensor (Polhemus 센서의 궤적 정보 해석을 이용한 스트로크 기반의 손 제스처 인식)

  • Kim, In-Cheol;Lee, Nam-Ho;Lee, Yong-Bum;Chien, Sung-Il
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.36C no.8
    • /
    • pp.46-53
    • /
    • 1999
  • We have developed a glove-based hand gesture recognition system for recognizing the 3D gestures of operators in remote work environments. A Polhemus sensor attached to a PinchGlove is employed to obtain the sequence of 3D positions of a hand trajectory. These 3D data are then encoded as the input to our recognition system. We propose using strokes, modeled by HMMs, as basic units. Gesture models are constructed by concatenating stroke HMMs, so that HMMs for newly defined gestures can be created without retraining their parameters. Thus, by using stroke models rather than gesture models, we improve the system's extensibility. Experimental results for 16 different gestures show that our stroke-based composite HMM performs better than the conventional gesture-based HMM.
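The key idea of assembling gesture models from trained stroke HMMs can be sketched as block-composing left-to-right transition matrices, routing each stroke's exit probability into the next stroke's first state. A minimal illustration under that assumption (the matrix layout is not taken from the paper):

```python
import numpy as np

def concat_stroke_hmms(strokes):
    """Build a gesture HMM by chaining left-to-right stroke HMMs, so new
    gestures are assembled from trained strokes without retraining.
    Each stroke is (A, B): its transition and emission matrices; the
    exit probability of a stroke's last state feeds the next stroke."""
    n = sum(a.shape[0] for a, _ in strokes)
    A = np.zeros((n, n))
    B = np.vstack([b for _, b in strokes])  # stack per-state emissions
    off = 0
    for i, (a, _) in enumerate(strokes):
        k = a.shape[0]
        A[off:off + k, off:off + k] = a     # copy the stroke's own block
        if i + 1 < len(strokes):
            # whatever probability is not spent on the final self-loop
            # transitions into the first state of the next stroke
            A[off + k - 1, off + k] = 1.0 - a[k - 1, k - 1]
        off += k
    return A, B
```

Because the stroke blocks are reused unchanged, defining a new gesture is just choosing a new stroke sequence, which is the extensibility benefit the abstract claims.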

Hand gesture based a pet robot control (손 제스처 기반의 애완용 로봇 제어)

  • Park, Se-Hyun;Kim, Tae-Ui;Kwon, Kyung-Su
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.13 no.4
    • /
    • pp.145-154
    • /
    • 2008
  • In this paper, we propose a pet robot control system that uses hand gesture recognition on image sequences acquired from a camera affixed to the pet robot. The proposed system consists of four steps: hand detection, feature extraction, gesture recognition, and robot control. The hand region is first detected from the input images using a skin color model in HSI color space and connected component analysis. Next, hand shape and motion features are extracted from the image sequences, and the hand shape is used to classify meaningful gestures. The hand gesture is then recognized using HMMs (hidden Markov models) whose input is the symbol sequence quantized from the hand motion. Finally, the pet robot is controlled by the command corresponding to the recognized hand gesture. We defined four commands for controlling the pet robot: sit down, stand up, lie flat, and shake hands. Experiments show that a user can control the pet robot through the proposed system.
