• Title/Summary/Keyword: Human Gesture Recognition

Design and Implementation of a Stereoscopic Image Control System based on User Hand Gesture Recognition (사용자 손 제스처 인식 기반 입체 영상 제어 시스템 설계 및 구현)

  • Song, Bok Deuk;Lee, Seung-Hwan;Choi, HongKyw;Kim, Sung-Hoon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.26 no.3
    • /
    • pp.396-402
    • /
    • 2022
  • User interactions are being developed in various forms, and interactions using human gestures in particular are being actively studied. Among them, hand gesture recognition is used as a human interface in the field of realistic media based on the 3D hand model. Interfaces based on hand gesture recognition help users access media more easily and conveniently. User interaction using hand gesture recognition should allow images to be viewed by applying fast and accurate hand gesture recognition technology without restrictions on the computing environment. This paper developed a fast and accurate user hand gesture recognition algorithm using the open-source MediaPipe framework and the k-NN (k-Nearest Neighbor) machine learning algorithm. In addition, to minimize restrictions imposed by the computing environment, a stereoscopic image control system based on user hand gesture recognition was designed and implemented using an Internet-capable web service environment and a Docker container as the virtual environment (a hedged sketch of the recognition pipeline follows below).
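The abstract names the building blocks (MediaPipe hand landmarks fed to a k-NN classifier) without giving the exact feature layout or gesture vocabulary. The following is a minimal sketch under assumed choices: 21 wrist-relative (x, y) landmarks as a 42-dimensional feature vector, k = 5, and hypothetical `gesture_features.npy` / `gesture_labels.npy` training files. It is not the authors' implementation.

```python
# Hedged sketch: MediaPipe Hands landmarks fed to a k-NN classifier.
# Feature layout, gesture labels, and k are assumptions, not the paper's design.
import cv2
import mediapipe as mp
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

mp_hands = mp.solutions.hands

def landmarks_to_features(hand_landmarks):
    """Flatten 21 (x, y) landmarks, translated so the wrist is the origin."""
    pts = np.array([[lm.x, lm.y] for lm in hand_landmarks.landmark])
    return (pts - pts[0]).flatten()               # 42-dim feature vector

# Hypothetical recorded, labeled gesture frames.
train_X = np.load("gesture_features.npy")
train_y = np.load("gesture_labels.npy")
knn = KNeighborsClassifier(n_neighbors=5).fit(train_X, train_y)

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            feat = landmarks_to_features(result.multi_hand_landmarks[0])
            gesture = knn.predict([feat])[0]      # e.g. a playback or zoom command
            print(gesture)
cap.release()
```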

A Study on Hand Gesture Recognition using Computer Vision (컴퓨터비전을 이용한 손동작 인식에 관한 연구)

  • Park Chang-Min
    • Management & Information Systems Review
    • /
    • v.4
    • /
    • pp.395-407
    • /
    • 2000
  • It is necessary to develop a method by which humans and computers can interact through hand gestures without any special device. In this thesis, a real-time hand gesture recognition system was developed. The system segments the hand region, recognizes the hand posture, and tracks the movement of the hand using computer vision. It does not use a blue screen as the background, a data glove, or special markers for hand gesture recognition (a minimal skin-color segmentation sketch follows below).
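The abstract does not detail the segmentation method beyond working without a blue screen, glove, or markers. A minimal sketch of one plausible approach, assuming HSV skin-color thresholding and largest-contour selection with OpenCV (the threshold band is an assumption, not the thesis's calibrated values):

```python
# Hedged sketch: hand-region segmentation by skin-color thresholding.
import cv2
import numpy as np

def segment_hand(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-tone band in HSV; needs tuning per camera and lighting.
    mask = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, mask
    hand = max(contours, key=cv2.contourArea)     # assume the hand is the largest blob
    (cx, cy), _ = cv2.minEnclosingCircle(hand)    # rough point to track over frames
    return (int(cx), int(cy)), mask
```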

A Memory-efficient Hand Segmentation Architecture for Hand Gesture Recognition in Low-power Mobile Devices

  • Choi, Sungpill;Park, Seongwook;Yoo, Hoi-Jun
    • JSTS:Journal of Semiconductor Technology and Science
    • /
    • v.17 no.3
    • /
    • pp.473-482
    • /
    • 2017
  • Hand gesture recognition is regarded as a new Human-Computer Interaction (HCI) technology for the next generation of mobile devices. Previous hand gesture implementations require large memory and computation power for hand segmentation, which prevents real-time interaction between users and mobile devices. Therefore, in this paper, we present a low-latency and memory-efficient hand segmentation architecture for natural hand gesture recognition. To obtain both high memory efficiency and low latency, we propose a streaming hand contour tracing unit and a fast contour filling unit. As a result, it achieves 7.14 ms latency with only 34.8 KB of on-chip memory, which is 1.65 times less latency and 1.68 times less on-chip memory, respectively, compared to the best-in-class design (a software analogue of the contour steps is sketched below).
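The paper describes a hardware architecture, so the following is only a software analogue: OpenCV contour extraction and filling standing in for the streaming contour tracing unit and the fast contour filling unit, not the authors' design.

```python
# Software analogue (not the paper's hardware design): trace the hand contour
# from a binary mask, then fill it to recover a compact hand-region mask.
import cv2
import numpy as np

def contour_trace_and_fill(binary_mask):
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)     # full contour chain
    filled = np.zeros_like(binary_mask)
    if contours:
        hand = max(contours, key=cv2.contourArea)
        cv2.drawContours(filled, [hand], -1, 255, thickness=cv2.FILLED)
    return filled
```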

A Decision Tree based Real-time Hand Gesture Recognition Method using Kinect

  • Chang, Guochao;Park, Jaewan;Oh, Chimin;Lee, Chilwoo
    • Journal of Korea Multimedia Society
    • /
    • v.16 no.12
    • /
    • pp.1393-1402
    • /
    • 2013
  • Hand gesture is one of the most popular communication methods in everyday life. In human-computer interaction applications, hand gesture recognition provides a natural way of communication between humans and computers. There are mainly two methods of hand gesture recognition: glove-based and vision-based. In this paper, we propose a vision-based hand gesture recognition method using Kinect. Using depth information makes the hand detection process efficient and robust. Finger labeling lets the system classify poses according to the finger names and the relationships between fingers, and it also makes the classification more effective and accurate. Two kinds of gesture sets can be recognized by our system. According to the experiment, the average accuracy for the American Sign Language (ASL) number gesture set is 94.33%, and that for the general gesture set is 95.01%. Since our system runs in real time and has a high recognition rate, we can embed it into various applications (a hedged decision-tree sketch follows below).
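The abstract does not list the exact finger-labeling features, so the sketch below assumes an illustrative feature vector (per-finger extended flags plus one inter-finger angle) and uses scikit-learn's DecisionTreeClassifier; the labels and values are made up for illustration.

```python
# Hedged sketch: decision-tree classification of hand poses from finger features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row: [thumb, index, middle, ring, pinky extended (0/1), thumb-index angle (deg)]
X_train = np.array([
    [0, 1, 0, 0, 0, 20.0],   # "one"
    [0, 1, 1, 0, 0, 18.0],   # "two"
    [1, 1, 1, 1, 1, 45.0],   # "five"
])
y_train = np.array(["one", "two", "five"])

clf = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)
print(clf.predict([[0, 1, 1, 0, 0, 17.0]]))      # -> ['two']
```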

Avatar Control by using hand gesture recognition (Hand Gesture 인식을 이용한 아바타 제어)

  • 최우영;김소연;송백균
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2004.05b
    • /
    • pp.616-619
    • /
    • 2004
  • As interest in virtual reality increases, the importance of the HCI (Human-Computer Interaction) field using gestures also grows. However, in previous gesture recognition work, the requirement for high-cost peripheral equipment limited users. In this paper, we suggest that by using a low-cost USB PC camera, users gain more flexibility at a lower cost, so that gesture-based control can be adopted much more commonly.

HandButton: Gesture Recognition of Transceiver-free Object by Using Wireless Networks

  • Zhang, Dian;Zheng, Weiling
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.10 no.2
    • /
    • pp.787-806
    • /
    • 2016
  • Traditional radio-based gesture recognition approaches usually require the target to carry a device (e.g., an EMG sensor or an accelerometer). However, such a requirement cannot be satisfied in many applications. For example, in a smart home, users want to control the lights on/off with a specific hand gesture without finding and pressing a button, especially in a dark area; they will not carry any device in this scenario. To overcome this drawback, in this paper we propose three algorithms able to recognize the target gesture (mainly the human hand gesture) without the target carrying any device, based only on the Radio Signal Strength Indicator (RSSI). Our platform uses only 6 TelosB sensor nodes with a very easy deployment. Experimental results show that the successful recognition ratio can reach around 80% in our system (a hedged sketch of RSSI-window classification follows below).
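The three recognition algorithms are not detailed in the abstract. The sketch below assumes windowed RSSI streams from the six nodes, simple per-node statistics as features, and a k-NN classifier; the data files and gesture labels are hypothetical.

```python
# Hedged sketch: classifying hand gestures from RSSI windows of 6 sensor nodes.
# Window length, features, and the classifier are assumptions, not the paper's
# three algorithms.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def rssi_window_features(window):
    """window: (samples, 6) RSSI readings, one column per TelosB node."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.ptp(window, axis=0)])   # per-node mean / std / range

# Hypothetical recorded training windows and their gesture labels.
train_windows = np.load("rssi_windows.npy")           # shape (N, samples, 6)
train_labels = np.load("rssi_labels.npy")             # e.g. "light_on", "light_off"
X = np.array([rssi_window_features(w) for w in train_windows])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, train_labels)

def recognize(live_window):
    return clf.predict([rssi_window_features(live_window)])[0]
```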

An Analysis of Human Gesture Recognition Technologies for Electronic Device Control (전자 기기 조종을 위한 인간 동작 인식 기술 분석)

  • Choi, Min-Seok;Jang, Beakcheol
    • Journal of the Korea Society of Computer and Information
    • /
    • v.19 no.12
    • /
    • pp.91-100
    • /
    • 2014
  • In this paper, we categorize existing human gesture recognition technologies into camera-based, additional-hardware-based, and frequency-based technologies. We then describe several representative techniques for each, emphasizing their strengths and weaknesses. We define important performance issues for human gesture recognition technologies and analyze recent technologies according to these issues. Our analysis shows that camera-based technologies are easy to use and highly accurate, but they have limited recognition ranges and require additional costs for their devices. Additional-hardware-based technologies are not limited by recognition range and are not affected by light or noise, but they have the disadvantage that users must wear or carry additional devices, which also incur additional costs. Finally, frequency-based technologies are not limited by recognition range and do not need additional devices; however, they have not yet been commercialized, and their accuracy can be degraded by other frequencies and signals.

Hand Gesture Recognition for Understanding Conducting Action (지휘행동 이해를 위한 손동작 인식)

  • Je, Hong-Mo;Kim, Ji-Man;Kim, Dai-Jin
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2007.10c
    • /
    • pp.263-266
    • /
    • 2007
  • We introduce vision-based hand gesture recognition for understanding musical time and patterns without extra special devices. We suggest a simple and reliable vision-based hand gesture recognition approach with two features. First, the motion-direction code is proposed, which is a quantized code for motion directions. Second, the conducting feature point (CFP), the point where the motion changes suddenly, is also proposed. The proposed hand gesture recognition system extracts the human hand region by segmenting the depth information generated by stereo matching of image sequences. It then follows the motion of the center of gravity (COG) of the extracted hand region and generates gesture features such as the CFP and the direction code. Finally, we obtain the current timing pattern of beat and tempo of the music being played. Experimental results on the test data set show that the musical time pattern and tempo recognition rate is over 86.42% for motion histogram matching and 79.75% for CFP tracking only (a hedged sketch of the direction-code and CFP features follows below).
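A minimal sketch of the two features named in the abstract, assuming an 8-way direction codebook and a 90-degree angle-change threshold for the conducting feature point; both values are assumptions, not the paper's parameters.

```python
# Hedged sketch: quantize COG motion into 8 direction codes and flag conducting
# feature points (CFPs) where the motion direction changes sharply.
import numpy as np

def direction_codes(cog_trajectory):
    """cog_trajectory: (T, 2) array of hand centre-of-gravity positions."""
    deltas = np.diff(cog_trajectory, axis=0)
    angles = np.arctan2(deltas[:, 1], deltas[:, 0])            # motion direction
    return np.round(angles / (np.pi / 4)).astype(int) % 8      # 8-way code

def conducting_feature_points(cog_trajectory, threshold_deg=90.0):
    deltas = np.diff(cog_trajectory, axis=0)
    angles = np.degrees(np.arctan2(deltas[:, 1], deltas[:, 0]))
    change = np.abs(np.diff(angles))
    change = np.minimum(change, 360.0 - change)                # handle wrap-around
    return np.where(change > threshold_deg)[0] + 1             # indices of sharp turns
```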

Gesture Recognition by Analyzing a Trajectory on Spatio-Temporal Space (시공간상의 궤적 분석에 의한 제스쳐 인식)

  • 민병우;윤호섭;소정;에지마 도시야끼
    • Journal of KIISE:Software and Applications
    • /
    • v.26 no.1
    • /
    • pp.157-157
    • /
    • 1999
  • Research on gesture recognition has become a very interesting topic in the computer vision area. Gesture recognition from visual images has a number of potential applications, such as HCI (Human-Computer Interaction), VR (Virtual Reality), and machine vision. To overcome the technical barriers in visual processing, conventional approaches have employed cumbersome devices such as data gloves or color-marked gloves. In this research, we capture gesture images without using external devices and generate a gesture trajectory composed of point-tokens. The trajectory is spotted using phase-based velocity constraints and recognized using a discrete left-right HMM. Input vectors to the HMM are obtained by applying the LBG clustering algorithm in a polar-coordinate space to which the point-tokens in Cartesian space are converted. The gesture vocabulary is composed of twenty-two dynamic hand gestures for editing drawing elements. In our experiment, one hundred samples per gesture were collected from twenty persons; fifty were used for training and another fifty for the recognition experiment. The results show about a 95% recognition rate and also the possibility that these results can be applied to several potential systems operated by gestures. The developed system runs in real time for editing basic graphic primitives on a Pentium Pro (200 MHz), a Matrox Meteor graphics board, and a CCD camera, with a Windows 95 and Visual C++ software environment (a hedged sketch of the polar-coordinate quantization step follows below).
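A minimal sketch of the quantization step described in the abstract: convert a Cartesian trajectory to polar coordinates and build a discrete symbol codebook. Here scikit-learn's KMeans stands in for the LBG algorithm, and the left-right HMM training itself is omitted.

```python
# Hedged sketch: polar conversion plus vector quantization into discrete symbols
# for an HMM. KMeans is a stand-in for the LBG codebook algorithm.
import numpy as np
from sklearn.cluster import KMeans

def to_polar(trajectory):
    """trajectory: (T, 2) Cartesian points, centred on the gesture centroid."""
    centred = trajectory - trajectory.mean(axis=0)
    r = np.linalg.norm(centred, axis=1)
    theta = np.arctan2(centred[:, 1], centred[:, 0])
    return np.column_stack([r, theta])

def build_codebook(training_trajectories, codebook_size=32):
    points = np.vstack([to_polar(t) for t in training_trajectories])
    return KMeans(n_clusters=codebook_size, n_init=10).fit(points)

def to_symbols(trajectory, codebook):
    """Discrete observation sequence to feed a left-right HMM."""
    return codebook.predict(to_polar(trajectory))
```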

A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction (강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식)

  • Lee, Lae-Kyoung;An, Su-Yong;Oh, Se-Young
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.18 no.4
    • /
    • pp.328-336
    • /
    • 2012
  • In this paper, we propose robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition for natural, human-like HRI (Human-Robot Interaction). First, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust YCbCr skin color model with Haar-like feature based AdaBoost. Using the extracted hand candidate regions, we estimate the palm region and fingertip position from distance-transform-based voting and the geometrical features of hands. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize the pre-defined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region with an accurately extracted fingertip and its angle but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures (a hedged sketch of the skin segmentation and CamShift tracking steps follows below).
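The sketch below covers only the skin segmentation and CamShift tracking steps with OpenCV, under an assumed YCrCb threshold band and a fixed initial hand window; the paper's AdaBoost detector, fingertip voting, and CONDENSATION recognizer are not reproduced.

```python
# Hedged sketch: YCrCb skin segmentation plus CamShift tracking of the hand.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
track_window = (200, 150, 120, 120)                 # assumed initial hand ROI (x, y, w, h)
x, y, w, h = track_window
roi_hsv = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([roi_hsv], [0], None, [32], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))   # rough skin band
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    backproj = cv2.bitwise_and(backproj, backproj, mask=skin)  # keep skin pixels only
    rot_rect, track_window = cv2.CamShift(backproj, track_window, criteria)
    pts = cv2.boxPoints(rot_rect).astype(np.int32)
    cv2.polylines(frame, [pts], True, (0, 255, 0), 2)          # draw tracked hand box
    cv2.imshow("hand", frame)
    if cv2.waitKey(1) & 0xFF == 27:                            # Esc to quit
        break
cap.release()
```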