• Title/Summary/Keyword: Human Gesture Recognition


A Study on Hand-signal Recognition System in 3-dimensional Space (3차원 공간상의 수신호 인식 시스템에 대한 연구)

  • 장효영;김대진;김정배;변증남
    • Proceedings of the IEEK Conference
    • /
    • 2002.06c
    • /
    • pp.215-218
    • /
    • 2002
  • Gesture recognition is needed for various applications and is gaining importance as one method of enabling natural and intuitive human-machine communication. In this paper, we propose a real-time hand-signal recognition system in 3-dimensional space that performs robust, real-time tracking under varying illumination. Compared with existing methods using classical pattern matching, this system is efficient with respect to speed and also presents a more systematic way of defining hand-signals and developing a hand-signal recognition system. In order to verify the proposed method, we developed a virtual driving system operated by hand-signals.

A Study on Tactile and Gestural Controls of Driver Interfaces for In-Vehicle Systems (차량내 시스템에 대한 접촉 및 제스처 방식의 운전자 인터페이스에 관한 연구)

  • Shim, Ji-Sung;Lee, Sang Hun
    • Korean Journal of Computational Design and Engineering
    • /
    • v.21 no.1
    • /
    • pp.42-50
    • /
    • 2016
  • Traditional tactile controls, such as push buttons and rotary switches, may cause significant visual and biomechanical distractions if they are located away from the driver's line of sight and hand position, for example on the central console. Gestural controls, as an alternative, are natural and can reduce visual distractions; however, their types and numbers are limited, and they provide no feedback. To overcome these problems, a driver interface combining gestures with visual feedback on a head-up display has recently been proposed. In this paper, we investigated the effect of this type of interface in terms of driving performance measures. Human-in-the-loop experiments were conducted using a driving simulator with the traditional tactile and the new gesture-based interfaces. The experimental results showed that the new interface caused fewer visual distractions, better gap control between the ego and target vehicles, and better recognition of road conditions compared to the traditional one.

Hand Language Translation Using Kinect

  • Pyo, Junghwan;Kang, Namhyuk;Bang, Jiwon;Jeong, Yongjin
    • Journal of IKEEE
    • /
    • v.18 no.2
    • /
    • pp.291-297
    • /
    • 2014
  • Since hand gesture recognition has become feasible thanks to improved image processing algorithms, sign language translation has been a critical issue for the hearing-impaired. In this paper, we extract human hand figures from a real-time image stream and detect gestures in order to determine which sign of the hand language they represent. We used depth-color calibrated images from the Kinect to extract human hands, and built a decision tree to recognize the hand gesture. The decision tree uses information such as the number of fingers, contours, and the hand's position inside a uniformly sized image. We succeeded in recognizing 'Hangul', the Korean alphabet, with a recognition rate of 98.16%. The average execution time per letter was about 76.5 msec, a reasonable speed considering that hand language translation is based on almost still images. We expect that this research will help communication between the hearing-impaired and other people who do not know hand language.
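The decision tree described above branches on hand features such as finger count and position. A minimal sketch of that idea in Python — the feature set, thresholds, and gesture labels here are illustrative assumptions, not the paper's actual Hangul mapping:

```python
def classify_hand_sign(finger_count, cy):
    """Toy decision tree over hand features: the number of detected
    fingers and the hand centroid's vertical position cy, normalized
    to [0, 1] inside a uniformly sized image. Labels are placeholders,
    not the paper's Hangul alphabet mapping."""
    if finger_count == 0:
        return "fist"
    if finger_count == 1:
        # a single raised finger: disambiguate by where it sits
        return "point-up" if cy < 0.5 else "point-down"
    if finger_count == 2:
        return "V-sign"
    return "open-hand"
```

A real tree would be learned from labeled examples and would also branch on contour features, but the structure — a cascade of cheap feature tests — is the same.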

Hand posture recognition robust to rotation using temporal correlation between adjacent frames (인접 프레임의 시간적 상관 관계를 이용한 회전에 강인한 손 모양 인식)

  • Lee, Seong-Il;Min, Hyun-Seok;Shin, Ho-Chul;Lim, Eul-Gyoon;Hwang, Dae-Hwan;Ro, Yong-Man
    • Journal of Korea Multimedia Society
    • /
    • v.13 no.11
    • /
    • pp.1630-1642
    • /
    • 2010
  • Recently, there has been an increasing need to develop Hand Gesture Recognition (HGR) techniques for vision-based interfaces. Since a hand gesture is defined as a consecutive change of hand postures, a Hand Posture Recognition (HPR) algorithm is required. Among the factors that degrade HPR performance, we focus on rotation. To achieve rotation-invariant HPR, we propose a method that exploits the property that adjacent frames in video are highly correlated, considering the environment of HGR. The proposed method introduces the template update of object tracking using this property, which distinguishes it from previous works based on still images. To compare our proposed method with previous methods such as template matching, PCA, and LBP, we performed experiments on video containing hand rotation. The accuracy of the proposed method is 22.7%, 14.5%, 10.7%, and 4.3% higher than ordinary template matching, template matching using the KL-transform, PCA, and LBP, respectively.
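The core idea — updating the matching template from adjacent, highly correlated frames so slow rotations are tracked — can be sketched as follows. The normalized cross-correlation matcher, blending factor, and threshold are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def match_and_update(template, frame_patch, alpha=0.2, threshold=0.6):
    """If the current frame's patch still matches the template well,
    blend it into the template; a slowly rotating hand then stays
    matched frame to frame, unlike a fixed still-image template."""
    score = ncc(template, frame_patch)
    if score >= threshold:
        template = (1 - alpha) * template + alpha * frame_patch
    return template, score
```

The blending step is what distinguishes this from plain template matching: the template drifts with the posture instead of being compared against a single canonical view.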

A Hand Gesture Recognition System using 3D Tracking Volume Restriction Technique (3차원 추적영역 제한 기법을 이용한 손 동작 인식 시스템)

  • Kim, Kyung-Ho;Jung, Da-Un;Lee, Seok-Han;Choi, Jong-Soo
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.50 no.6
    • /
    • pp.201-211
    • /
    • 2013
  • In this paper, we propose a hand tracking and gesture recognition system. Our system employs a depth capture device to obtain 3D geometric information of the user's bare hand. In particular, we build a flexible tracking volume and restrict the hand tracking area, so that we can avoid the diverse problems caused by conventional object detection/tracking systems. The proposed system computes a running average of the hand position, and the tracking volume is actively adjusted according to statistical information computed from the uncertainty of the user's hand motion in 3D space. Once the position of the user's hand is obtained, the system attempts to detect stretched fingers to recognize the finger gesture of the user's hand. To test the proposed framework, we built an NUI system using the proposed technique and verified that it performs very stably even when multiple objects exist simultaneously in a crowded environment, as well as when the scene is temporarily occluded. We also verified that the system maintains a running speed of 24-30 frames per second throughout the experiments.
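An adaptive tracking volume built from the running statistics of the hand position might look roughly like this. Welford's online mean/variance and the k-sigma box are stand-in assumptions; the paper does not specify which statistics it uses:

```python
import numpy as np

class TrackingVolume:
    """Keeps a running mean and variance of the 3D hand position and
    derives a tracking box a few standard deviations wide, so the
    search region follows the hand and shrinks when motion is steady.
    The k-sigma rule and minimum half-extent are illustrative."""

    def __init__(self, k=3.0, min_half=0.05):
        self.n = 0
        self.mean = np.zeros(3)
        self.m2 = np.zeros(3)      # sum of squared deviations (Welford)
        self.k = k
        self.min_half = min_half   # minimum half-extent of the box

    def update(self, pos):
        pos = np.asarray(pos, dtype=float)
        self.n += 1
        delta = pos - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (pos - self.mean)

    def bounds(self):
        var = self.m2 / self.n if self.n else np.zeros(3)
        half = np.maximum(self.k * np.sqrt(var), self.min_half)
        return self.mean - half, self.mean + half

    def contains(self, pos):
        lo, hi = self.bounds()
        return bool(np.all(pos >= lo) and np.all(pos <= hi))
```

Restricting detection to `bounds()` is what keeps other objects in a crowded scene from hijacking the tracker: candidates outside the volume are simply never considered.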

Virtual Environment Interfacing based on State Automata and Elementary Classifiers (상태 오토마타와 기본 요소분류기를 이용한 가상현실용 실시간 인터페이싱)

  • Kim, Jong-Sung;Lee, Chan-Su;Song, Kyung-Joon;Min, Byung-Eui;Park, Chee-Hang
    • The Transactions of the Korea Information Processing Society
    • /
    • v.4 no.12
    • /
    • pp.3033-3044
    • /
    • 1997
  • This paper presents a system which recognizes dynamic hand gestures for virtual reality (VR). A dynamic hand gesture is a method of communication between a human and a computer using gestures, especially both hands and fingers. Since human hands and fingers are not the same in physical dimension, the gestures produced by two persons' hands may not have the same numerical values when obtained through electronic sensors. To recognize meaningful gestures within continuous motion that has no explicit tokens of beginning and end, this system segments the current motion state using state automata. In this paper, we apply a fuzzy min-max neural network and a feature analysis method using fuzzy logic for on-line pattern recognition.
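The state-automaton segmentation — picking gesture candidates out of a continuous stream that has no explicit start/end tokens — can be illustrated roughly as below. The two states, speed threshold, and minimum run length are invented for the sketch:

```python
# Minimal state automaton that segments a continuous hand-motion
# stream into gesture candidates. A real system would use richer
# motion states than this idle/moving pair.
IDLE, MOVING = "idle", "moving"

def segment_gestures(speeds, threshold=0.2, min_len=3):
    """Yield (start, end) index pairs for runs where the hand speed
    stays above the threshold for at least min_len samples."""
    state, start, out = IDLE, None, []
    for i, s in enumerate(speeds):
        if state == IDLE and s >= threshold:
            state, start = MOVING, i
        elif state == MOVING and s < threshold:
            if i - start >= min_len:
                out.append((start, i))
            state = IDLE
    if state == MOVING and len(speeds) - start >= min_len:
        out.append((start, len(speeds)))
    return out
```

Each segment returned by the automaton would then be handed to the classifier (here, the fuzzy min-max network), so classification never has to guess where a gesture begins or ends.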

Control of Ubiquitous Environment using Sensors Module (센서모듈을 이용한 유비쿼터스 환경의 제어)

  • Jung, Tae-Min;Choi, Woo-Kyung;Kim, Seong-Joo;Jeon, Hong-Tae
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.17 no.2
    • /
    • pp.190-195
    • /
    • 2007
  • As the ubiquitous era arrives, it has become necessary to construct environments that can provide more useful information to people in the spaces where they live, such as homes or offices. For this reason, the network of peripheral ubiquitous devices should be organized efficiently. To this end, this paper studies human patterns through classified motion recognition using sensor-module data, processed with a neural network and a fuzzy algorithm. This pattern classification can help control home network communication. We suggest a system that can control a home network more easily through patterned movements, and that controls the ubiquitous environment by grasping a person's movements and condition.

Human Head Mouse System Based on Facial Gesture Recognition

  • Wei, Li;Lee, Eung-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.10 no.12
    • /
    • pp.1591-1600
    • /
    • 2007
  • Camera position information from a 2D face image is very important for synchronizing a virtual 3D face model to the real face at the same viewpoint, and also for other uses such as human-computer interfaces (face mouse) and automatic camera control. We present an algorithm to detect the human face region and mouth based on the characteristic color features of the face and mouth in the YCbCr color space. The algorithm constructs a mouth feature image based on Cb and Cr values, and uses a pattern method to detect the mouth position. We then use the geometric relationship between the mouth position and the face side boundary to determine the camera position. Experimental results demonstrate the validity of the proposed algorithm, and the correct determination rate is high enough to apply it in practice.
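The YCbCr color-segmentation step can be sketched as follows. The BT.601 RGB-to-YCbCr conversion is standard; the Cb/Cr skin thresholds are common literature values, not necessarily the ones the paper uses:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Full-range ITU-R BT.601 RGB -> YCbCr conversion."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =        0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Boolean mask of skin-colored pixels. Thresholding Cb/Cr rather
    than RGB makes the mask largely insensitive to brightness, which
    is why face/mouth detectors work in this space."""
    ycc = rgb_to_ycbcr(rgb)
    cb, cr = ycc[..., 1], ycc[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
```

A mouth feature image would be built the same way, only with thresholds tuned to lip color (higher Cr, lower Cb than surrounding skin).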

Cognitive and Emotional Structure of a Robotic Game Player in Turn-based Interaction

  • Yang, Jeong-Yean
    • International journal of advanced smart convergence
    • /
    • v.4 no.2
    • /
    • pp.154-162
    • /
    • 2015
  • This paper focuses on how cognitive and emotional structures affect humans during long-term interaction. We design an interaction with a turn-based game, the Chopstick Game, in which two agents play with numbers using their fingers. While the human and the robot agent alternate turns, the human user engages in playing the game and learning new winning skills from the robot agent. The conventional valence-arousal space is applied to design the emotional interaction. For the robotic system, we implement finger gesture recognition and emotional behaviors designed for a three-dimensional virtual robot. In the experimental tests, the appropriateness of the proposed schemes is verified and the effect of the emotional interaction is discussed.

Recognition of Hand gesture to Human-Computer Interaction (손동작 인식을 통한 Human-Computer Interaction 구현)

  • 이래경;김성신
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.11 no.1
    • /
    • pp.28-32
    • /
    • 2001
  • Human hand gestures are a means of communication that has long served as a language. As modern society becomes an information society, faster and more accurate communication and information delivery are required, and much recent research has sought to use free gestures expressed with both hands for human-computer interaction and for expressing human intentions, compensating for the shortcomings of existing input devices. In this paper, we propose a new recognition algorithm that uses hand features extracted from 2D input images without relying on dynamic hand motion, and we implement hand-gesture recognition using a Radial Basis Function Network and additional feature points for a higher recognition rate and real-time processing. Based on the meaning of the recognized gestures, we also conducted experiments applying them to robot control in order to evaluate the recognition rate and the accuracy with which each gesture's intended meaning is conveyed.
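The RBF-network classification described above can be illustrated with a toy nearest-prototype scorer: each stored feature vector acts as a Gaussian unit, and the gesture label of the strongest response wins. The centers, labels, and gamma width below are placeholders, not the trained network from the paper:

```python
import math

def rbf_classify(x, centers, labels, gamma=2.0):
    """Score the feature vector x against each prototype center with a
    Gaussian radial basis function exp(-gamma * ||x - c||^2) and
    return the label of the most activated unit. A full RBF network
    would also learn output weights over these activations."""
    def activation(c):
        d2 = sum((a - b) ** 2 for a, b in zip(x, c))
        return math.exp(-gamma * d2)
    scores = [activation(c) for c in centers]
    return labels[max(range(len(scores)), key=scores.__getitem__)]
```

For example, with two prototype feature vectors labeled "stop" and "go", an observed hand-feature vector is assigned the label of the nearer prototype in feature space.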
