• Title/Summary/Keyword: hand gesture recognition

Design of Image Extraction Hardware for Hand Gesture Vision Recognition

  • Lee, Chang-Yong;Kwon, So-Young;Kim, Young-Hyung;Lee, Yong-Hwan
    • Journal of Advanced Information Technology and Convergence, v.10 no.1, pp.71-83, 2020
  • In this paper, we propose a system that detects the shape of a hand at high speed using an FPGA. Because real-time processing is essential, the hand-shape detection system is designed in Verilog HDL, a hardware description language that processes data in parallel, rather than in sequentially executed C++. Among the several approaches to hand gesture recognition, an image-processing method is used. Since the human eye is sensitive to brightness, the YCbCr color model was selected from the various color representations to obtain results that are less affected by lighting. Only the pixels whose CbCr components satisfy the skin-color constraints are extracted from the input image. To speed up object recognition, a median filter removes noise from the input image; the filter is designed to compare values and extract the median at the same time, reducing the amount of computation. For parallel processing, the design locates the centerline of the hand while scanning and sorting the stored data: the line with the highest count is selected as the centerline, the size of the hand is determined from that count, and the hand and arm regions are separated. The designed hardware circuit satisfied the target operating frequency and gate count.
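
  The skin-color filtering and median-filter stages summarized above can be approximated in software. The sketch below is a NumPy/OpenCV analogue with assumed CbCr thresholds; it is not the paper's Verilog design, whose median stage compares values and extracts the median in parallel in hardware.

    # Software analogue of the skin-color filtering and median-filter stages;
    # the CbCr bounds are common heuristics, not the paper's constraint values.
    import cv2
    import numpy as np

    def extract_hand_mask(bgr_frame):
        # Convert to YCrCb so the thresholds depend mainly on chrominance, not brightness.
        ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)

        # Keep pixels whose Cr/Cb values fall inside a typical skin-color range.
        lower = np.array([0, 133, 77], dtype=np.uint8)    # Y, Cr, Cb
        upper = np.array([255, 173, 127], dtype=np.uint8)
        mask = cv2.inRange(ycrcb, lower, upper)

        # 3x3 median filter to suppress salt-and-pepper noise before further processing.
        return cv2.medianBlur(mask, 3)

  The centerline-by-count step could similarly be approximated by picking the mask row or column with the largest skin-pixel count, e.g. np.argmax(mask.sum(axis=1)).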

An Extraction Method of Meaningful Hand Gesture for a Robot Control (로봇 제어를 위한 의미 있는 손동작 추출 방법)

  • Kim, Aram;Rhee, Sang-Yong
    • Journal of the Korean Institute of Intelligent Systems, v.27 no.2, pp.126-131, 2017
  • In this paper, we propose a method to extract the meaningful motion from the various hand gestures made when giving commands to a robot. When a person commands a robot by hand gesture, the movement can be divided into a preparation motion, a main motion, and a finishing motion. The main motion is the one that conveys the command; the other motions are meaningless auxiliary movements needed to perform the main one. Therefore, only the main motion must be extracted from the continuous stream of hand gestures. In addition, people move their hands unconsciously, and the robot must also judge such movements to be meaningless. In this study, we extract human skeleton data from the depth image of a Kinect v2 sensor and obtain the hand position from it. Using a Kalman filter we track the hand position and distinguish whether a motion is meaningful or meaningless, and the gesture is then recognized with a hidden Markov model.
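
  A minimal version of the hand-tracking step, assuming a constant-velocity model and placeholder noise parameters (the abstract does not give the filter settings), might look like this:

    # Constant-velocity Kalman filter for a 3-D hand position; dt and the noise
    # covariances are placeholder values, not taken from the paper.
    import numpy as np

    class HandKalman:
        def __init__(self, dt=1/30.0, process_var=1e-2, meas_var=1e-3):
            self.x = np.zeros(6)                  # state: [x, y, z, vx, vy, vz]
            self.P = np.eye(6)
            self.F = np.eye(6)
            self.F[:3, 3:] = dt * np.eye(3)       # position += velocity * dt
            self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # only position is observed
            self.Q = process_var * np.eye(6)
            self.R = meas_var * np.eye(3)

        def step(self, z):
            # Predict
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            # Update with the measured hand position z = (x, y, z)
            y = np.asarray(z) - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(6) - K @ self.H) @ self.P
            return self.x[:3], self.x[3:]         # filtered position, velocity estimate

  The velocity estimate could, for instance, be thresholded to separate the slow preparation and finishing phases from the faster main motion before the HMM stage; the paper's actual segmentation criterion may differ.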

A Controlled Study of Interactive Exhibit based on Gesture Image Recognition (제스처 영상 인식기반의 인터렉티브 전시용 제어기술 연구)

  • Cha, Jaesang;Kang, Joonsang;Rho, Jung-Kyu;Choi, Jungwon;Koo, Eunja
    • Journal of Satellite, Information and Communications, v.9 no.1, pp.1-5, 2014
  • Recently, buildings have been developing rapidly toward greater intelligence along with the growth of industry, and people seek comfort, efficiency, and convenience in office and living environments. People also use a wide variety of devices; as smart TVs and smartphones have spread, interest in the interaction between humans and devices has increased. Various interaction methods have been studied, but using a physical controller brings discomfort and limitations. In this paper, a user can easily interact with and control LEDs through hand gestures captured by a Kinect, without any controller. We designed an interface that controls LEDs using the joint information obtained from the Kinect, and with the implemented interface a user could control individual LEDs through hand movements. We expect the developed interface to be useful for LED control and in various other fields.
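
  The abstract does not spell out the gesture-to-LED mapping, so the sketch below is purely illustrative: it uses Kinect v2 joint names (HandRight, Head, SpineMid) but made-up rules for toggling two hypothetical LED groups.

    # Illustrative mapping from Kinect joint positions to LED commands; the rules
    # and LED group names are assumptions, not the interface designed in the paper.
    def led_command(joints):
        """joints: dict mapping Kinect joint name -> (x, y, z) in camera coordinates."""
        right_hand = joints["HandRight"]
        head = joints["Head"]
        # Example rule: raising the right hand above the head toggles LED group 1.
        if right_hand[1] > head[1]:
            return ("LED1", "toggle")
        # Example rule: moving the hand far to the right of the spine toggles LED group 2.
        if right_hand[0] - joints["SpineMid"][0] > 0.4:   # metres, arbitrary threshold
            return ("LED2", "toggle")
        return None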

Gesture Recognition Using Stereo Tracking Initiator and HMM for Tele-Operation (스테레오 영상 추적 자동초기화와 HMM을 이용한 원격 작업용 제스처 인식)

  • Jeong, Ji-Won;Lee, Yong-Beom;Jin, Seong-Il
    • The Transactions of the Korea Information Processing Society, v.6 no.8, pp.2262-2270, 1999
  • In this paper, we describe a gesture recognition algorithm that uses a computer vision sensor and an HMM. Automatic hand-region extraction is proposed to initialize the tracking of tele-operation gestures. For this, distance information (a disparity map) obtained by stereo matching of the initial left and right images is used to isolate the hand region from the scene. The PDOE (positive difference of edges) feature images adopted here were found to be robust against noise and background brightness. The KNU/KAERI (K/K) gesture instruction set is defined for tele-operation in nuclear power plants. A composite recognition model, constructed by concatenating three gesture instruction models (pre-orders, basic orders, and post-orders), is proposed and identified with a discrete HMM. Our experiments showed that consecutive orders composed of two or more instructions are recognized correctly at a rate above 97%.
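
  Two of the image cues mentioned here, the disparity-based hand isolation and the PDOE (positive difference of edges) features, can be sketched with OpenCV as follows; the thresholds and the exact PDOE formulation are assumptions rather than the paper's definitions.

    # Disparity thresholding to isolate the near-camera hand, plus a PDOE-style
    # feature: edges present in the current frame but not in the previous one.
    import cv2
    import numpy as np

    def hand_region_from_disparity(left_gray, right_gray, min_disp=32):
        # Closer objects have larger disparities, so thresholding the disparity map
        # separates the hand (near the camera) from the background.
        stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
        return (disparity > min_disp).astype(np.uint8) * 255

    def pdoe(prev_gray, curr_gray, low=50, high=150):
        # Positive difference of edge maps suppresses static background edges.
        prev_edges = cv2.Canny(prev_gray, low, high).astype(np.int16)
        curr_edges = cv2.Canny(curr_gray, low, high).astype(np.int16)
        return np.clip(curr_edges - prev_edges, 0, 255).astype(np.uint8)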

Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

  • Kim, Jinwoo;Ryu, Jae Hong;Han, Tae Man
    • ETRI Journal, v.37 no.4, pp.793-803, 2015
  • We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
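
  The "four directions and one selection motion" scheme implies a very small command vocabulary shared by every modality. The toy dispatcher below illustrates that idea; the command names and menu behaviour are inventions, not the paper's HMI framework.

    # Toy dispatcher for a four-direction-plus-select command set; any modality
    # (speech, jog dial, hand gesture) would be normalized to these five commands.
    from enum import Enum, auto

    class Command(Enum):
        UP = auto()
        DOWN = auto()
        LEFT = auto()
        RIGHT = auto()
        SELECT = auto()

    class MenuController:
        def __init__(self, items):
            self.items = items
            self.index = 0          # currently highlighted menu entry

        def handle(self, cmd):
            if cmd in (Command.UP, Command.LEFT):
                self.index = (self.index - 1) % len(self.items)
            elif cmd in (Command.DOWN, Command.RIGHT):
                self.index = (self.index + 1) % len(self.items)
            elif cmd is Command.SELECT:
                return self.items[self.index]     # activate the highlighted item
            return None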

Finger-Gesture Recognition Using Concentric-Circle Tracing Algorithm (동심원 추적 알고리즘을 사용한 손가락 동작 인식)

  • Hwang, Dong-Hyun;Jang, Kyung-Sik
    • Journal of the Korea Institute of Information and Communication Engineering, v.19 no.12, pp.2956-2962, 2015
  • In this paper, we propose a novel algorithm, the concentric-circle tracing algorithm, which recognizes finger shape and counts the number of extended fingers using a low-cost web camera. Using an inexpensive web camera improves the algorithm's practicality, and user comfort is enhanced because no additional marker or sensor is required. Besides counting fingers, the algorithm efficiently extracts finger-shape information, i.e., whether each finger is straight or folded. Experimental results show that finger gestures are recognized with an average accuracy of 95.48%, confirming that hand gestures are a useful method for HCI input and remote-control commands.
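
  The abstract names the concentric-circle tracing algorithm without detailing it, so the following is a simplified reading of the idea: sample a binary hand mask along a circle around the palm centre and count the skin runs it crosses.

    # Count fingers by tracing one circle around the palm centre of a binary hand
    # mask; the radius choice and the wrist correction are simplifying assumptions.
    import numpy as np

    def count_fingers(mask, center, radius, samples=360):
        h, w = mask.shape
        cx, cy = center
        angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
        xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, w - 1)
        ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, h - 1)
        on_circle = mask[ys, xs] > 0              # True where the circle crosses skin

        # Each background-to-skin transition starts one skin run; the runs correspond
        # to fingers plus (usually) the wrist/forearm, which is subtracted here.
        transitions = np.count_nonzero(on_circle & ~np.roll(on_circle, 1))
        return max(transitions - 1, 0)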

On-line Motion Control of Avatar Using Hand Gesture Recognition (손 제스터 인식을 이용한 실시간 아바타 자세 제어)

  • Kim, Jong-Sung;Kim, Jung-Bae;Song, Kyung-Joon;Min, Byung-Eui;Bien, Zeung-Nam
    • Journal of the Korean Institute of Telematics and Electronics C, v.36C no.6, pp.52-62, 1999
  • This paper presents a system that recognizes dynamic hand gestures on-line for controlling the motion of a human avatar in a virtual environment (VE). A dynamic hand gesture is a means of communication between a computer and a human using gestures, especially of both hands and fingers. The human avatar has 32 degrees of freedom (DOF) for natural motion in the VE and navigates using 8 pre-defined dynamic hand gestures. Inverse kinematics and dynamic kinematics are applied for real-time motion control of the avatar. In this paper, we apply a fuzzy min-max neural network and a feature analysis method based on fuzzy logic for on-line dynamic hand gesture recognition.
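
  The fuzzy min-max neural network mentioned here is usually attributed to Simpson; a compact sketch of its hyperbox membership function, with an arbitrary sensitivity value and hypothetical hyperboxes, is shown below. The paper's feature analysis and its 8 gesture classes are not reproduced.

    # Hyperbox membership from Simpson's fuzzy min-max classifier; gamma and the
    # hyperboxes are illustrative, not values learned in the paper.
    import numpy as np

    def hyperbox_membership(x, v, w, gamma=4.0):
        """x: feature vector; v, w: hyperbox min and max corners (same shape as x)."""
        x, v, w = map(np.asarray, (x, v, w))
        below = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, v - x)))
        above = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, x - w)))
        return float(np.mean((below + above) / 2.0))   # 1.0 inside the box, decaying outside

    def classify(x, hyperboxes):
        """hyperboxes: list of (v, w, label); returns the label of the best-matching box."""
        return max(hyperboxes, key=lambda b: hyperbox_membership(x, b[0], b[1]))[2]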

Study on the Hand Gesture Recognition System and Algorithm based on Millimeter Wave Radar (밀리미터파 레이더 기반 손동작 인식 시스템 및 알고리즘에 관한 연구)

  • Lee, Youngseok
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology, v.12 no.3, pp.251-256, 2019
  • In this paper, we propose a system and algorithm that recognize hand gestures using millimeter-wave radar in the 65 GHz band. The proposed system consists of a millimeter-wave radar board, an analog-to-digital conversion and data capture board, and a notebook computer that runs the gesture recognition algorithm. As feature vectors, the proposed algorithm uses global and local Zernike moment descriptors, which are robust to distortion caused by rotation or scaling of 2D data. In the experiments, the performance of the proposed algorithm is evaluated and compared with that of algorithms using only a single global or local Zernike descriptor as the feature vector. In an analysis of the confusion matrices, the proposed algorithm shows better precision, accuracy, and sensitivity, with a total performance index of 95.6% compared with 88.4% and 84% for the other two methods.
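
  A feature vector combining global and local Zernike moment descriptors, as described above, could be assembled along these lines; the mahotas library stands in for the Zernike computation, and the 2x2 patch layout is an assumption.

    # Global + local Zernike moment descriptor of a 2-D radar map (e.g. a
    # range-Doppler image); the tiling and polynomial degree are illustrative choices.
    import numpy as np
    import mahotas

    def zernike_features(image, degree=8, grid=2):
        h, w = image.shape
        feats = [mahotas.features.zernike_moments(image, radius=min(h, w) // 2, degree=degree)]

        # Local descriptors: Zernike moments of each cell in a grid x grid tiling.
        for i in range(grid):
            for j in range(grid):
                patch = image[i * h // grid:(i + 1) * h // grid,
                              j * w // grid:(j + 1) * w // grid]
                feats.append(mahotas.features.zernike_moments(
                    patch, radius=min(patch.shape) // 2, degree=degree))

        return np.concatenate(feats)              # concatenated global + local descriptor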

A Study on a Hand-signal Recognition System in 3-Dimensional Space (3차원 공간상의 수신호 인식 시스템에 대한 연구)

  • 장효영;김대진;김정배;변증남
    • Proceedings of the IEEK Conference, 2002.06c, pp.215-218, 2002
  • Gesture recognition is needed for various applications and is gaining importance as a way of enabling natural and intuitive human-machine communication. In this paper, we propose a real-time hand-signal recognition system in 3-dimensional space that performs robust, real-time tracking under varying illumination. Compared with existing methods based on classical pattern matching, the system is efficient in terms of speed and also offers a more systematic way of defining hand-signals and developing a hand-signal recognition system. To verify the proposed method, we developed a virtual driving system operated by hand-signals.

Hierarchical Hand Pose Model for Hand Expression Recognition (손 표현 인식을 위한 계층적 손 자세 모델)

  • Heo, Gyeongyong;Song, Bok Deuk;Kim, Ji-Hong
    • Journal of the Korea Institute of Information and Communication Engineering, v.25 no.10, pp.1323-1329, 2021
  • For hand expression recognition, hand pose recognition based on the static shape of the hand and hand gesture recognition based on dynamic hand movement are used together. In this paper, we propose a hierarchical hand pose model based on finger position and shape for hand expression recognition. For hand pose recognition, a finger model representing the state of each finger and a hand pose model built on those finger states are constructed hierarchically on top of the open-source MediaPipe framework. The finger model is itself hierarchical, combining the bending of a single finger and the touch between two fingers. The proposed model can be used in various applications that transmit information through the hands, and its usefulness was verified by applying it to number recognition in sign language. Beyond sign language recognition, the proposed model is expected to find various applications in computer user interfaces.
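
  On top of MediaPipe Hands, the finger-state layer of such a hierarchical model can be sketched as follows; the tip-versus-joint test for "extended" and the toy digit rule (which ignores the thumb) are simplifications, not the paper's finger or hand pose models.

    # Finger-state layer on top of MediaPipe Hands; assumes an upright palm and
    # maps the number of extended fingers to a digit as a toy hand pose rule.
    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands
    # Landmark indices: (fingertip, PIP joint) for index, middle, ring, pinky.
    FINGERS = {"index": (8, 6), "middle": (12, 10), "ring": (16, 14), "pinky": (20, 18)}

    def finger_states(landmarks):
        """Return {finger: True if extended}, comparing fingertip and joint heights."""
        return {name: landmarks[tip].y < landmarks[pip].y
                for name, (tip, pip) in FINGERS.items()}

    def recognize_digit(image_bgr):
        with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
            result = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
            if not result.multi_hand_landmarks:
                return None
            lm = result.multi_hand_landmarks[0].landmark
            return sum(finger_states(lm).values())    # 0-4 extended fingers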