• Title/Summary/Keyword: Hand gesture recognition

311 search results

Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Nataliia, Konkina
    • International Journal of Computer Science & Network Security
    • /
    • v.22 no.1
    • /
    • pp.129-138
    • /
    • 2022
  • With the increasing reliance on computing systems in our everyday life, there is a constant need to improve the ways users can interact with such systems in a more natural, effective, and convenient way. In the initial computing revolution, the interaction between humans and machines was limited. The machines were not necessarily meant to be intelligent. This created the need to develop systems that could automatically identify and interpret our actions. Automatic gesture recognition is one of the popular methods by which users can control systems with their gestures. This includes various kinds of tracking, including the whole body, hands, head, and face. We also touch upon a different line of work, including Brain-Computer Interface (BCI) and Electromyography (EMG), as potential additions to the gesture recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.

Interactive visual knowledge acquisition for hand-gesture recognition (손 제스쳐 인식을 위한 상호작용 시각정보 추출)

  • 양선옥;최형일
    • Journal of the Korean Institute of Telematics and Electronics B
    • /
    • v.33B no.9
    • /
    • pp.88-96
    • /
    • 1996
  • Computer vision-based gesture recognition systems consist of image segmentation, object tracking, and decision. However, it is difficult to segment the gesturing object from an image because of varying illumination and backgrounds. In this paper, we describe a method to learn features for segmentation, which improves the performance of computer vision-based hand-gesture recognition systems. The system interacts with the user to acquire exact training data and segmentation information according to a predefined plan. The system provides some models to the user, takes pictures of the user's response, and then analyzes the pictures using the models and prior knowledge. The system sends messages to the user and runs a learning module to extract information from the analyzed result.

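The interactive feature-learning step described above is not spelled out in the abstract, but a common way to acquire such segmentation knowledge is to learn a color model from a user-confirmed hand region. The sketch below only illustrates that general idea with an OpenCV hue-saturation histogram and back-projection; the function names, the rectangle, and all parameter values are assumptions, not details from the paper.

```python
# Minimal sketch: learn a hand color model from a user-marked region, then
# segment later frames by histogram back-projection (assumed approach).
import cv2
import numpy as np

def learn_hand_color_model(bgr_image, rect):
    """Build a hue-saturation histogram from a user-confirmed hand rectangle (x, y, w, h)."""
    x, y, w, h = rect
    roi = cv2.cvtColor(bgr_image[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([roi], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def segment_hand(bgr_image, hist):
    """Back-project the learned histogram and threshold to obtain a hand mask."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0, 1], hist, [0, 180, 0, 256], scale=1)
    backproj = cv2.GaussianBlur(backproj, (9, 9), 0)
    _, mask = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)
    return mask

# Hypothetical usage: the rectangle would come from the system's interaction step.
# hist = learn_hand_color_model(cv2.imread("training_frame.png"), rect=(100, 80, 120, 160))
# mask = segment_hand(cv2.imread("test_frame.png"), hist)
```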

Robot User Control System using Hand Gesture Recognizer (수신호 인식기를 이용한 로봇 사용자 제어 시스템)

  • Shon, Su-Won;Beh, Joung-Hoon;Yang, Cheol-Jong;Wang, Han;Ko, Han-Seok
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.17 no.4
    • /
    • pp.368-374
    • /
    • 2011
  • This paper proposes a human interface for robot control using a hidden Markov model (HMM)-based hand signal recognizer. The command-receiving humanoid robot sends webcam images to a client computer. The client computer then extracts the commanding human's intended hand motion descriptors. Upon feature acquisition, the hand signal recognizer carries out the recognition procedure. The recognition result is then sent back to the robot for responsive actions. The system performance is evaluated by measuring recognition on a '48 hand signal set', created randomly from a fundamental hand motion set. For isolated motion recognition, the '48 hand signal set' shows a 97.07% recognition rate while the 'baseline hand signal set' shows 92.4%. This result validates that the proposed hand signal recognizer is indeed highly discriminative. For connected motions on the '48 hand signal set', it shows a 97.37% recognition rate. The relevant experiments demonstrate that the proposed system is promising for real-world human-robot interface applications.
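
As a rough illustration of the HMM-based hand signal recognizer described above, the sketch below trains one Gaussian HMM per signal class on sequences of motion descriptors and labels an unknown sequence with the best-scoring model. The hmmlearn library, the number of states, and the data layout are assumptions; the paper does not specify an implementation.

```python
# Minimal per-class HMM gesture classifier (assumed design, not the authors' system).
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_models(train_data, n_states=5):
    """train_data: dict mapping class name -> list of (T_i, D) descriptor arrays."""
    models = {}
    for label, sequences in train_data.items():
        X = np.vstack(sequences)                 # stack all sequences of this class
        lengths = [len(s) for s in sequences]    # per-sequence lengths for hmmlearn
        model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[label] = model
    return models

def classify(models, sequence):
    """sequence: (T, D) descriptor array. Returns the class with the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(sequence))
```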

A Study on the VR Payment System using Hand Gesture Recognition (손 제스쳐 인식을 활용한 VR 결제 시스템 연구)

  • Kim, Kyoung Hwan;Lee, Won Hyung
    • Journal of the Korean Society for Computer Game
    • /
    • v.31 no.4
    • /
    • pp.129-135
    • /
    • 2018
  • Electronic signatures, QR codes, and barcodes are used in real-life payment systems. Research has begun on payment systems implemented in VR environments. This paper proposes a VR electronic signature system that uses hand gesture recognition to implement an existing payment system in a VR environment. In a VR system, the user cannot type on a keyboard or use a mouse. There are several ways to configure a payment system with a VR controller; electronic signatures using hand gesture recognition are one of them, and hand gesture recognition methods can be classified into warping methods, statistical methods, and template matching methods. In this paper, the payment system was configured in VR using the $P algorithm, which belongs to the template matching category. To create the VR environment, we implemented a PayPal-based system where actual payment is made, using Unity3D and Vive equipment.
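
The $P recognizer referenced above is a point-cloud template matcher. The sketch below is a simplified version of that idea (resample, normalize, then greedy nearest-point matching between clouds); it follows the general $P recipe but is not the authors' VR implementation, and the point count is an assumed parameter.

```python
# Simplified point-cloud gesture matcher in the spirit of the $P recognizer (assumed details).
import math

N_POINTS = 32  # assumed resampling size

def resample(points, n=N_POINTS):
    """Resample a stroke (list of (x, y) tuples) to n roughly equidistant points."""
    pts = [tuple(p) for p in points]
    path_len = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    if path_len == 0:                       # degenerate stroke: all points identical
        return [pts[0]] * n
    interval, acc, out = path_len / (n - 1), 0.0, [pts[0]]
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)                # the interpolated point becomes the next reference
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:                     # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def normalize(points):
    """Scale to a unit bounding box and translate the centroid to the origin."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    size = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    return [((x - cx) / size, (y - cy) / size) for x, y in points]

def cloud_distance(a, b):
    """Greedy one-way matching cost between two equal-size point clouds."""
    unmatched, total = list(b), 0.0
    for p in a:
        q = min(unmatched, key=lambda u: math.dist(p, u))
        unmatched.remove(q)
        total += math.dist(p, q)
    return total

def recognize(candidate, templates):
    """templates: dict name -> raw point list. Returns the best-matching template name."""
    c = normalize(resample(candidate))
    return min(templates,
               key=lambda name: cloud_distance(c, normalize(resample(templates[name]))))
```

Templates recorded once per payment gesture (for example, a signature stroke) could then be matched against a candidate stroke captured from the tracked hand or VR controller.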

Finger Directivity Recognition Algorithm using Shape Decomposition (형상분해를 이용한 손가락 방향성 인식 알고리즘)

  • Choi, Jong-Ho
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.4 no.3
    • /
    • pp.197-201
    • /
    • 2011
  • The use of gestures provides an attractive alternative to cumbersome interfaces for human-computer interaction. This has motivated a very active research area concerned with computer vision-based recognition of hand gestures. One of the most important issues in hand gesture recognition is recognizing the directivity of the fingers. The primitive elements extracted from a hand gesture contain very important information on finger directivity. In this paper, we propose a finger directivity recognition algorithm that uses the cross points of a circle and the sub-primitive elements. The radius of the circle is increased from the minimum radius enclosing the main primitive element to the radius enclosing the sub-primitive elements. Through experiments, we demonstrate the efficiency of the proposed algorithm.
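
The primitive-element decomposition itself cannot be reproduced from the abstract, but the circle cross-point idea can be illustrated as below: points on a circle around the palm center are sampled against a binary hand mask, and each contiguous run of hand pixels yields one finger direction (its mid-angle). This is an assumed simplification, not the paper's algorithm.

```python
# Illustrative finger-directivity estimate from circle cross points (assumed simplification).
import numpy as np

def finger_directions(mask, center, radius, samples=360):
    """mask: 2-D binary array (non-zero = hand); center: (row, col) of the palm.
    Returns approximate finger directions as angles in degrees."""
    cy, cx = center
    angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, mask.shape[0] - 1)
    xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, mask.shape[1] - 1)
    on_circle = mask[ys, xs] > 0

    directions, run_start = [], None
    # Walk the circle once; a trailing False closes any open run.
    # (A run that wraps across 0 degrees is split in two; ignored for brevity.)
    for i, hit in enumerate(list(on_circle) + [False]):
        if hit and run_start is None:
            run_start = i
        elif not hit and run_start is not None:
            mid = (run_start + i - 1) // 2
            directions.append(float(np.degrees(angles[mid])))
            run_start = None
    return directions
```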

Fast Hand-Gesture Recognition Algorithm For Embedded System (임베디드 시스템을 위한 고속의 손동작 인식 알고리즘)

  • Hwang, Dong-Hyun;Jang, Kyung-Sik
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.21 no.7
    • /
    • pp.1349-1354
    • /
    • 2017
  • In this paper, we propose a fast hand-gesture recognition algorithm for embedded systems. Existing hand-gesture recognition algorithms are difficult to use in low-performance systems such as embedded systems and mobile devices because of the high computational complexity of the contour tracing method, which extracts all points of the hand contour. Instead of contour tracing, the proposed algorithm uses a concentric-circle tracing method to estimate the abstracted contour of the fingers and then classifies hand gestures by extracting features. The proposed algorithm has an average recognition rate of 95% and an average execution time of 1.29 ms, a maximum performance improvement of 44% compared with algorithms using the existing contour tracing method. This confirms that the algorithm can be used in low-performance systems such as embedded systems and mobile devices.
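
A minimal sketch of the concentric-circle tracing idea is given below: rather than tracing the full contour, the binary hand mask is sampled along a few circles around the palm center, and the number of background-to-hand transitions per circle is used as a cheap feature (roughly, how many fingers each circle crosses). Radii, sample counts, and the feature interpretation are assumptions for illustration.

```python
# Cheap concentric-circle features for hand gestures (assumed parameters).
import numpy as np

def circle_transition_counts(mask, center, radii, samples=180):
    """Return, for each radius, how many 0->1 transitions occur along the circle."""
    cy, cx = center
    angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    counts = []
    for r in radii:
        ys = np.clip((cy + r * np.sin(angles)).astype(int), 0, mask.shape[0] - 1)
        xs = np.clip((cx + r * np.cos(angles)).astype(int), 0, mask.shape[1] - 1)
        values = (mask[ys, xs] > 0).astype(int)
        # Append the first sample again so the wrap-around transition is counted too.
        transitions = np.count_nonzero(np.diff(np.r_[values, values[0]]) == 1)
        counts.append(int(transitions))
    return counts

# A gesture class could then be chosen from these counts, e.g. the transition count
# on a circle just outside the palm approximates the number of raised fingers.
```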

Hybrid HMM for Transitional Gesture Classification in Thai Sign Language Translation

  • Jaruwanawat, Arunee;Chotikakamthorn, Nopporn;Werapan, Worawit
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.1106-1110
    • /
    • 2004
  • A human sign language is generally composed of both static and dynamic gestures. Each gesture is represented by a hand shape, its position, and hand movement (for a dynamic gesture). One of the problems found in automated sign language translation is segmenting a hand movement that is part of a transitional movement from one hand gesture to another. This transitional gesture conveys no meaning, but serves as a connecting period between two consecutive gestures. Based on the observation that many dynamic gestures, as they appear in the Thai sign language dictionary, are of a quasi-periodic nature, a method was developed to differentiate between a (meaningful) dynamic gesture and a transitional movement. However, there are some meaningful dynamic gestures that are of a non-periodic nature. Those gestures cannot be distinguished from a transitional movement by using signal quasi-periodicity. This paper proposes a hybrid method combining the periodicity-based gesture segmentation method with an HMM-based gesture classifier. The HMM classifier is used here to detect dynamic signs of a non-periodic nature. Combined with the periodicity-based gesture segmentation method, this hybrid scheme can be used to identify segments of a transitional movement. In addition, due to the use of the quasi-periodic nature of many dynamic sign gestures, the dimensionality of the HMM part of the proposed method is significantly reduced, resulting in computational savings compared with a standard HMM-based method. Through experiments with real measurements, the proposed method's recognition performance is reported.

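The quasi-periodicity test that drives the segmentation above can be illustrated with a simple autocorrelation check on a one-dimensional motion signal (for example, the hand's x-coordinate over time): a strong secondary autocorrelation peak suggests a quasi-periodic, and hence meaningful, dynamic gesture. The threshold and lag range below are assumptions, not values from the paper.

```python
# Autocorrelation-based quasi-periodicity check (assumed threshold and lag range).
import numpy as np

def is_quasi_periodic(signal, min_lag=5, threshold=0.5):
    """Return True if the signal has a strong autocorrelation peak at lag >= min_lag."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    if np.allclose(x, 0):
        return False
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # keep non-negative lags
    ac = ac / ac[0]                                     # normalize so ac[0] == 1
    if len(ac) <= min_lag:
        return False
    return bool(ac[min_lag:].max() >= threshold)
```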

Gesture Recognition by Analyzing a Trajectory on Spatio-Temporal Space (시공간상의 궤적 분석에 의한 제스쳐 인식)

  • 민병우;윤호섭;소정;에지마 도시야끼
    • Journal of KIISE:Software and Applications
    • /
    • v.26 no.1
    • /
    • pp.157-157
    • /
    • 1999
  • Research on gesture recognition has become a very interesting topic in the computer vision area. Gesture recognition from visual images has a number of potential applications such as HCI (Human-Computer Interaction), VR (Virtual Reality), and machine vision. To overcome the technical barriers in visual processing, conventional approaches have employed cumbersome devices such as data gloves or color-marked gloves. In this research, we capture gesture images without using external devices and generate a gesture trajectory composed of point-tokens. The trajectory is spotted using phase-based velocity constraints and recognized using a discrete left-right HMM. Input vectors to the HMM are obtained by applying the LBG clustering algorithm in a polar-coordinate space, into which the point-tokens in Cartesian space are converted. The gesture vocabulary is composed of twenty-two dynamic hand gestures for editing drawing elements. In our experiment, one hundred data samples per gesture were collected from twenty persons; fifty were used for training and another fifty for the recognition experiment. The result shows about a 95% recognition rate and also the possibility that these results can be applied to several potential systems operated by gestures. The developed system runs in real time for editing basic graphic primitives in a hardware environment of a Pentium Pro (200 MHz), a Matrox Meteor graphics board, and a CCD camera, with a Windows 95 and Visual C++ software environment.
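
The quantization step described above (polar-coordinate features vector-quantized into symbols for a discrete left-right HMM) can be sketched as follows; scikit-learn's KMeans is used here as a stand-in for the LBG algorithm, and the codebook size is an assumption.

```python
# Turning a gesture trajectory into a discrete observation sequence (assumed details).
import numpy as np
from sklearn.cluster import KMeans

def to_polar(trajectory):
    """trajectory: (T, 2) array of (x, y) points -> (T-1, 2) array of (r, theta) displacements."""
    d = np.diff(np.asarray(trajectory, dtype=float), axis=0)
    r = np.hypot(d[:, 0], d[:, 1])
    theta = np.arctan2(d[:, 1], d[:, 0])
    return np.column_stack([r, theta])

def build_codebook(trajectories, n_symbols=16, seed=0):
    """Fit a codebook over all training trajectories (KMeans as an LBG stand-in)."""
    features = np.vstack([to_polar(t) for t in trajectories])
    return KMeans(n_clusters=n_symbols, n_init=10, random_state=seed).fit(features)

def quantize(trajectory, codebook):
    """Map one trajectory to a sequence of codebook symbols for a discrete HMM."""
    return codebook.predict(to_polar(trajectory))
```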

Hand Movement Tracking and Recognizing Hand Gestures (핸드 제스처를 인식하는 손동작 추적)

  • Park, Kwang-Chae;Bae, Ceol-Soo
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.14 no.8
    • /
    • pp.3971-3975
    • /
    • 2013
  • This paper introduces an Augmented Reality system recognizing hand gestures and shows results of the evaluation. The system's user can interact with virtual objects and manipulate their position and motion simply by hand gestures. Hand gesture recognition is based on Histograms of Oriented Gradients (HOG). Salient features of human hand appearance are detected by HOG blocks. Blocks of different sizes are tested to define the most suitable configuration. To select the most informative blocks for classification, a multiclass AdaBoostSVM algorithm is applied. The evaluated recognition rate of the algorithm is 94.0%.
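
The HOG-based pipeline above can be sketched with OpenCV's HOGDescriptor feeding a classifier; a single multiclass SVM is used below as a simplified stand-in for the paper's multiclass AdaBoostSVM, and the window, block, and cell sizes are assumed values.

```python
# HOG features + SVM gesture classifier (simplified stand-in for AdaBoostSVM).
import cv2
import numpy as np
from sklearn.svm import SVC

# 64x64 window, 16x16 blocks, 8x8 stride, 8x8 cells, 9 orientation bins (assumed sizes).
hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)

def hog_features(gray_images):
    """gray_images: list of 64x64 uint8 grayscale hand crops -> feature matrix."""
    return np.array([hog.compute(img).ravel() for img in gray_images])

def train_classifier(gray_images, labels):
    clf = SVC(kernel="rbf", C=10.0, gamma="scale")
    clf.fit(hog_features(gray_images), labels)
    return clf

def predict_gesture(clf, gray_image):
    return clf.predict(hog_features([gray_image]))[0]
```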

Remote Control System using Face and Gesture Recognition based on Deep Learning (딥러닝 기반의 얼굴과 제스처 인식을 활용한 원격 제어)

  • Hwang, Kitae;Lee, Jae-Moon;Jung, Inhwan
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.20 no.6
    • /
    • pp.115-121
    • /
    • 2020
  • With the spread of IoT technology, various IoT applications using facial recognition are emerging. This paper describes the design and implementation of a remote control system using deep learning-based face recognition and hand gesture recognition. In general, an application system using face recognition consists of a part that captures an image in real time from a camera, a part that recognizes a face from the image, and a part that utilizes the recognized result. A Raspberry Pi, a single-board computer that can be mounted anywhere, is used to capture images in real time, and face recognition software was developed for the server computer using TensorFlow's FaceNet model, along with hand gesture recognition software using OpenCV. We classified users into three groups, Known users, Danger users, and Unknown users, and designed and implemented an application that opens automatic door locks only for Known users who have passed both face recognition and hand gesture recognition.
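
The decision logic described above (three user groups, with the door unlocking only for a Known user who also passes the gesture check) can be sketched as a nearest-neighbor test on face embeddings. The distance threshold, the enrolled database layout, and the gesture flag are assumptions for illustration; the embeddings would come from a FaceNet-style model as in the paper.

```python
# Known / Danger / Unknown decision from face embeddings (assumed threshold and layout).
import numpy as np

def classify_user(embedding, enrolled, threshold=0.8):
    """enrolled: dict name -> (reference embedding, group in {"Known", "Danger"})."""
    best_name, best_group, best_dist = None, None, float("inf")
    for name, (ref, group) in enrolled.items():
        dist = float(np.linalg.norm(np.asarray(embedding) - np.asarray(ref)))
        if dist < best_dist:
            best_name, best_group, best_dist = name, group, dist
    if best_name is None or best_dist > threshold:
        return "Unknown", None
    return best_group, best_name

def should_open_door(embedding, enrolled, gesture_ok):
    """Unlock only for a Known user whose hand gesture was also recognized."""
    group, _ = classify_user(embedding, enrolled)
    return group == "Known" and gesture_ok
```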