• Title/Summary/Keyword: Recognition of Hand Shape


Finger Directivity Recognition Algorithm using Shape Decomposition (형상분해를 이용한 손가락 방향성 인식 알고리즘)

  • Choi, Jong-Ho
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.4 no.3
    • /
    • pp.197-201
    • /
    • 2011
  • The use of gestures provides an attractive alternative to cumbersome interfaces for human-computer interaction. This has motivated a very active research area concerned with computer vision-based recognition of hand gestures. One of the most important issues in hand gesture recognition is recognizing the directivity of the fingers, and the primitive elements extracted from a hand gesture carry very important information about that directivity. In this paper, we propose an algorithm that recognizes finger directivity using the cross points between a circle and the sub-primitive elements. The radius of the circle is increased from the minimum radius enclosing the main primitive element until it encloses the sub-primitive elements. Through experiments, we demonstrate the efficiency of the proposed algorithm.
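The circle-growing idea in this abstract can be illustrated with a small sketch. The code below is a simplified approximation rather than the authors' algorithm: it centers a circle at the palm centroid of a binary hand mask and reads off the angles at which the growing circle crosses the hand region, which stand in for finger directivity. The mask path and radius values are assumptions for illustration.

```python
import numpy as np
import cv2

# Load a binary hand mask (white hand on black background); the path is hypothetical.
mask = cv2.imread("hand_mask.png", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)

# Palm centroid from image moments.
m = cv2.moments(mask, binaryImage=True)
cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]

def crossing_angles(mask, cx, cy, radius, samples=720):
    """Angles (degrees) at which a circle of the given radius lies inside the hand."""
    angles = np.linspace(0, 2 * np.pi, samples, endpoint=False)
    xs = (cx + radius * np.cos(angles)).astype(int)
    ys = (cy + radius * np.sin(angles)).astype(int)
    inside = (0 <= xs) & (xs < mask.shape[1]) & (0 <= ys) & (ys < mask.shape[0])
    hits = np.zeros(samples, dtype=bool)
    hits[inside] = mask[ys[inside], xs[inside]] > 0
    return np.degrees(angles[hits])

# Grow the radius outward from the palm; runs of hit angles correspond to the
# fingers the circle crosses, and their mean angles approximate finger directivity.
for r in range(40, 200, 20):          # radii in pixels, chosen arbitrarily
    angs = crossing_angles(mask, cx, cy, r)
    print(f"r={r}: circle crosses the hand at {len(angs)} sampled angles")
```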

Hand Region Detection and Hand Shape Classification using Hu Moments and Back Projection (역 투영과 휴 모멘트를 이용한 손영역 검출 및 모양 분류)

  • Shin, Jae-Sun;Jang, Dae-Sik
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2011.10a
    • /
    • pp.911-914
    • /
    • 2011
  • Detecting the hand region is an essential technology for providing user-based interfaces, and research on it has been ongoing. In this paper, we propose a hand region detection method based on back projection in HSV color space and a hand shape recognition method using Hu moments. Using back projection, we improved the reliability of hand region detection, and we confirmed that the hand shape can be recognized through Hu moments. A brief OpenCV sketch of this pipeline is given after this entry.

  • PDF
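As referenced in the abstract above, here is a minimal OpenCV sketch of the two stages the paper names: histogram back projection in HSV space to locate a skin-colored hand region, then Hu moments of the resulting contour as a shape descriptor. The image paths and threshold values are assumptions for illustration, not the authors' settings.

```python
import cv2
import numpy as np

# A small skin-colored patch sampled from the user's hand (paths are hypothetical).
sample = cv2.imread("skin_sample.png")
frame = cv2.imread("frame.png")

# Hue-saturation histogram of the skin sample, used as the back-projection model.
sample_hsv = cv2.cvtColor(sample, cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([sample_hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

# Back-project the model onto the frame to get a skin-likelihood map.
frame_hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
backproj = cv2.calcBackProject([frame_hsv], [0, 1], hist, [0, 180, 0, 256], scale=1)
_, hand_mask = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)

# The largest contour is taken as the hand region; its Hu moments describe the shape.
contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
hand = max(contours, key=cv2.contourArea)
hu = cv2.HuMoments(cv2.moments(hand)).flatten()

# Log-scale the Hu moments (standard practice) before comparing against shape templates.
hu_log = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
print("Hu moment descriptor:", hu_log)
```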

Hand Expression Recognition for Virtual Blackboard (가상 칠판을 위한 손 표현 인식)

  • Heo, Gyeongyong;Kim, Myungja;Song, Bok Deuk;Shin, Bumjoo
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.25 no.12
    • /
    • pp.1770-1776
    • /
    • 2021
  • For hand expression recognition, hand pose recognition based on the static shape of the hand and hand gesture recognition based on hand movement are used together. In this paper, we propose a hand expression recognition method that recognizes symbols based on the trajectory of hand movement on a virtual blackboard. In order to recognize a sign drawn by hand on a virtual blackboard, both a method of recognizing the sign from the hand movement and hand pose recognition for finding the start and end of data input are required. In this paper, MediaPipe was used to recognize hand pose, and an LSTM (Long Short-Term Memory) network, a type of recurrent neural network, was used to recognize hand gestures from the time-series data. To verify the effectiveness of the proposed method, it was applied to the recognition of numbers written on a virtual blackboard, and a recognition rate of about 94% was obtained.
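The pipeline described above (MediaPipe for hand pose, an LSTM over the movement trajectory) can be sketched as follows. This is a hedged illustration, not the authors' code: landmark extraction uses the public MediaPipe Hands API, while the LSTM architecture, sequence length, and number of classes are assumptions.

```python
import cv2
import numpy as np
import mediapipe as mp
import tensorflow as tf

SEQ_LEN, NUM_CLASSES = 30, 10   # assumed: 30-frame trajectories, digits 0-9

# MediaPipe Hands returns 21 (x, y, z) landmarks per detected hand.
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.5)

def landmarks_from_frame(frame_bgr):
    """Flattened (x, y) landmark vector for one frame, or None if no hand is found."""
    result = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    lm = result.multi_hand_landmarks[0].landmark
    return np.array([[p.x, p.y] for p in lm], dtype=np.float32).flatten()  # 42 values

# A small LSTM classifier over fixed-length landmark sequences (architecture assumed).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, 42)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_train, y_train, ...) would follow once trajectory data is collected.
```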

A Study on Tangible Gesture Interface Prototype Development of the Quiz Game (퀴즈게임의 체감형 제스처 인터페이스 프로토타입 개발)

  • Ahn, Jung-Ho;Ko, Jae-Pil
    • Journal of Digital Contents Society
    • /
    • v.13 no.2
    • /
    • pp.235-245
    • /
    • 2012
  • This paper introduces quiz game content based on a gesture interface. We analyzed offline quiz games, extracted their presiding components, and digitalized them so that the proposed game content can substitute for the offline quiz games. We used a Kinect camera to obtain depth images and performed preprocessing including vertical human segmentation, head detection and tracking, and hand detection, followed by gesture recognition for hand-up, vertical hand movement, fist shape, pass, and fist-and-attraction. In particular, we designed the interface gestures as metaphors for natural gestures in the real world so that users can tangibly experience the abstract concepts of movement, selection, and confirmation. Compared to our previous work, we added a card compensation process for completeness, improved the vertical hand movement and fist shape recognition methods used for example selection, and presented an organized test to measure the recognition performance. The implemented quiz application was tested in real time and showed very satisfactory gesture recognition results.
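The depth-based preprocessing described above can be approximated with a very small sketch: assuming a Kinect depth frame is already available as a NumPy array, the connected region closest to the camera is taken as the hand. This is a simplification of the paper's segmentation pipeline, for illustration only; the depth band and synthetic data are invented.

```python
import numpy as np
import cv2

def segment_nearest_region(depth_mm, band_mm=100):
    """Return a binary mask of the region closest to the camera.

    depth_mm: 2-D array of depth values in millimetres (0 = no reading).
    band_mm:  depth slice kept around the nearest valid pixel (assumed value).
    """
    valid = depth_mm > 0
    nearest = depth_mm[valid].min()
    mask = (valid & (depth_mm < nearest + band_mm)).astype(np.uint8) * 255

    # Keep only the largest connected component to suppress depth noise.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n <= 1:
        return mask
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return np.where(labels == largest, 255, 0).astype(np.uint8)

# Example with synthetic data: a "hand" 800 mm away against a 2000 mm background.
depth = np.full((240, 320), 2000, dtype=np.uint16)
depth[100:160, 140:200] = 800
hand_mask = segment_nearest_region(depth)
print("hand pixels:", int((hand_mask > 0).sum()))
```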

The Effect of Visual Feedback on One-hand Gesture Performance in Vision-based Gesture Recognition System

  • Kim, Jun-Ho;Lim, Ji-Hyoun;Moon, Sung-Hyun
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.4
    • /
    • pp.551-556
    • /
    • 2012
  • Objective: This study presents the effect of visual feedback on one-hand gesture performance in a vision-based gesture recognition system when people use gestures to control a screen device remotely. Background: Gesture interaction is receiving growing attention because it uses advanced sensor technology and allows users natural interaction using their own body motion. In generating motion, visual feedback has been considered a critical factor affecting speed and accuracy. Method: Three types of visual feedback (arrow, star, and animation) were selected and 20 gestures were listed. Twelve participants performed each of the 20 gestures while given the three types of visual feedback in turn. Results: Participants made longer hand traces and took more time to make a gesture when given the arrow-shaped feedback than the star-shaped feedback. The animation-type feedback was the most preferred. Conclusion: The type of visual feedback had a statistically significant effect on the length of the hand trace, the elapsed time, and the speed of motion in performing a gesture. Application: This study can be applied to any device that needs visual feedback for device control. Large feedback produced shorter motion traces, less time, and faster motion than small feedback when people performed gestures to control a device, so large visual feedback is recommended for situations requiring fast actions, while smaller visual feedback is recommended for situations requiring elaborate actions.

A Vision-Based Method to Find Fingertips in a Closed Hand

  • Chaudhary, Ankit;Vatwani, Kapil;Agrawal, Tushar;Raheja, J.L.
    • Journal of Information Processing Systems
    • /
    • v.8 no.3
    • /
    • pp.399-408
    • /
    • 2012
  • Hand gesture recognition is an important area of research in the field of Human Computer Interaction (HCI). The geometric attributes of the hand play an important role in hand shape reconstruction and gesture recognition. In particular, fingertips are among the important attributes for the detection of hand gestures and can provide valuable information from hand images. Many methods are available in the scientific literature for fingertip detection with an open hand, but results are very poor for fingertip detection when the hand is closed. This paper presents a new method for the detection of fingertips in a closed hand using a corner detection method and an advanced edge detection algorithm. It is important to note that skin color segmentation did not work for fingertip detection in a closed hand. Thus, the proposed method applies Gabor filter techniques to detect edges and then applies a corner detection algorithm to detect fingertips along the edges. To check its accuracy, the method was tested on a large number of images taken with a webcam and achieved a high detection rate. The method was further implemented on video to test its validity for real-time image capture. This closed-hand fingertip detection would help in controlling an electro-mechanical robotic hand via hand gestures in a natural way.
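The combination named in the abstract (Gabor filtering for edges, then corner detection on the edge map) can be sketched in OpenCV as follows. The filter-bank parameters and corner-detector settings are illustrative assumptions, not the values used in the paper.

```python
import cv2
import numpy as np

gray = cv2.imread("closed_hand.png", cv2.IMREAD_GRAYSCALE)  # path is hypothetical

# Gabor filter bank over several orientations; keep the strongest response per pixel.
responses = []
for theta in np.arange(0, np.pi, np.pi / 8):
    kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                lambd=10.0, gamma=0.5, psi=0)
    responses.append(cv2.filter2D(gray, cv2.CV_32F, kernel))
edges = np.max(np.stack(responses), axis=0)
edges = cv2.normalize(edges, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

# Corner detection on the edge map; strong corners along the knuckle line are
# fingertip candidates when the hand is closed.
corners = cv2.goodFeaturesToTrack(edges, maxCorners=10, qualityLevel=0.1,
                                  minDistance=15)
for c in corners.reshape(-1, 2):
    print("fingertip candidate at", tuple(int(v) for v in c))
```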

Hand Shape Detection and Recognition using Self-Organizing Feature Map (SOFM) and Principal Component Analysis (자기 조직화 특징 지도(SOFM)와 주성분 분석을 이용한 손 형상 검출 및 인식)

  • Kim, Kyoung-Ho;Lee, Kee-Jun
    • The Journal of the Korea Contents Association
    • /
    • v.13 no.11
    • /
    • pp.28-36
    • /
    • 2013
  • This study proposes a robust detection algorithm that detects hands more stably under changes in lighting and rotation for the identification of hand shape, while satisfying both computational efficiency and detection capability. The proposed algorithm segments the hand area through pre-processing, using the hand shape as input information in a single-camera environment, and then identifies the shape using a Self-Organizing Feature Map (SOFM). However, because the hand area is sensitive to lighting, has many degrees of freedom, and carries a large error bound, it is not easy to recognize exactly; to enhance the identification rate, rotation information on the hand shape was compiled into a database and principal component analysis was then conducted. In addition, because the reduced dimensionality requires fewer calculations, the time for real-time identification could be decreased.
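A compact sketch of the two building blocks named above, PCA for dimensionality reduction and a self-organizing feature map for shape identification, is given below. It assumes hand-shape images are already flattened into vectors and uses the third-party minisom package; it illustrates the combination only and is not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from minisom import MiniSom   # third-party SOM implementation (assumed available)

# X: flattened hand-shape images, one row per sample (synthetic placeholder data).
rng = np.random.default_rng(0)
X = rng.random((200, 64 * 64))

# PCA reduces the dimensionality so that the SOM trains on far fewer features.
pca = PCA(n_components=20)
X_low = pca.fit_transform(X)

# A small SOM grid; each unit becomes a prototype of a hand-shape cluster.
som = MiniSom(x=6, y=6, input_len=X_low.shape[1], sigma=1.0, learning_rate=0.5,
              random_seed=0)
som.random_weights_init(X_low)
som.train_random(X_low, num_iteration=2000)

# At recognition time, the winning unit for a new sample gives its shape cluster.
winner = som.winner(X_low[0])
print("best-matching SOM unit:", winner)
```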

NATURAL INTERACTION WITH VIRTUAL PET ON YOUR PALM

  • Choi, Jun-Yeong;Han, Jae-Hyek;Seo, Byung-Kuk;Park, Han-Hoon;Park, Jong-Il
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2009.01a
    • /
    • pp.341-345
    • /
    • 2009
  • We present an augmented reality (AR) application for cell phones in which users put a virtual pet on their palms and play/interact with the pet by moving their hands and fingers naturally. The application is fundamentally based on hand/palm pose recognition and finger motion estimation, which are the main concerns of this paper. We propose a fast and efficient hand/palm pose recognition method which uses natural features (e.g. direction, width, and contour shape of the hand region) extracted from a hand image, together with prior knowledge of hand shape or geometry (e.g. its approximate shape when the palm is open, and the length ratio between palm width and palm height). We also propose a natural interaction method which recognizes natural finger motions, such as opening and closing the palm, based on fingertip tracking. Based on the proposed methods, we developed and tested the AR application on an ultra-mobile PC (UMPC). A simplified sketch of this kind of contour-based palm pose estimation is given after this entry.

  • PDF
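The contour-based palm features mentioned above (direction and width of the hand region) can be approximated with a short OpenCV sketch: skin-color segmentation, then an ellipse fit whose orientation and axes stand in for palm direction and width. The color thresholds and the use of an ellipse are illustrative assumptions, not the authors' exact features.

```python
import cv2
import numpy as np

frame = cv2.imread("palm_frame.png")  # path is hypothetical

# Rough skin segmentation in YCrCb space (threshold values are common defaults).
ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
skin = cv2.morphologyEx(skin, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# The largest skin contour is taken as the hand; an ellipse fit summarizes its pose.
contours, _ = cv2.findContours(skin, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
hand = max(contours, key=cv2.contourArea)
(cx, cy), (ax1, ax2), angle = cv2.fitEllipse(hand)
width, length = sorted((ax1, ax2))

print(f"palm centre ~({cx:.0f}, {cy:.0f}), direction ~{angle:.0f} deg, "
      f"width ~{width:.0f} px, length ~{length:.0f} px")
```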

Automatic Recognition of Hand-written Hangeul by Phase Rotation (위상회전에 의한 필기체 한글의 자동인식)

  • 이주근;김홍기
    • Journal of the Korean Institute of Telematics and Electronics
    • /
    • v.13 no.1
    • /
    • pp.23-30
    • /
    • 1976
  • In this paper, a method is proposed for the recognition of hand-written Hangeul, performed by extracting concave structural segments through phase rotation. Character patterns can be decomposed into fundamental concave structural segments, which are further categorized into segment sets, and the closure and phase features of each segment in a set are represented by logic expressions. By rotating the logic pattern, the topological and phase features of each segment are extracted for reliable recognition of the character. The method is also evaluated over a wide variety of character shapes, positions, and inclinations.

  • PDF

Hand Gesture Recognition Using an Infrared Proximity Sensor Array

  • Batchuluun, Ganbayar;Odgerel, Bayanmunkh;Lee, Chang Hoon
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.15 no.3
    • /
    • pp.186-191
    • /
    • 2015
  • Hand gestures are the most common tool used to interact with and control various electronic devices. In this paper, we propose a novel hand gesture recognition method using fuzzy logic-based classification with a new type of sensor array. In some cases, the feature patterns of hand gesture signals cannot be uniquely distinguished and recognized when people perform the same gesture in different ways. Moreover, differences in hand shape and in the skeletal articulation of the arm influence the process. Manifold features were extracted, and efficient features that make gestures distinguishable were selected. However, similar feature patterns exist across different hand gestures, and fuzzy logic is applied to classify them. Fuzzy rules are defined based on the many feature patterns of the input signal. An adaptive neuro-fuzzy inference system was used to generate fuzzy rules automatically for classifying hand gestures using a small number of feature patterns as input. In addition, emotion expression was performed after the hand gesture recognition for the resulting human-robot interaction. Our proposed method was tested with many hand gesture datasets and validated with different evaluation metrics. Experimental results show that our method detects more hand gestures than other existing methods, with robust hand gesture recognition and corresponding emotion expressions in real time.
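The fuzzy-logic classification step described above can be illustrated with a minimal hand-rolled sketch: triangular membership functions over two distance-like features from a proximity sensor array and simple max-min rule evaluation. The features, membership ranges, gesture labels, and rules are invented for illustration and are not the paper's ANFIS-generated rules.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def classify_gesture(mean_dist, dist_range):
    """Toy fuzzy classifier over two hypothetical features from the sensor array.

    mean_dist:  average proximity reading during the gesture (normalized 0..1)
    dist_range: spread of readings across the array (normalized 0..1)
    """
    near, far = trimf(mean_dist, -0.1, 0.0, 0.5), trimf(mean_dist, 0.5, 1.0, 1.1)
    flat, spread = trimf(dist_range, -0.1, 0.0, 0.4), trimf(dist_range, 0.3, 1.0, 1.1)

    # Rule firing strengths (min for AND); the gesture labels are invented.
    rules = {
        "push":  min(near, flat),    # hand close and uniform over the array
        "swipe": min(near, spread),  # hand close but uneven -> lateral motion
        "hover": min(far, flat),     # hand far and uniform
    }
    return max(rules, key=rules.get), rules

label, strengths = classify_gesture(mean_dist=0.2, dist_range=0.6)
print(label, strengths)
```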