• Title/Abstract/Keyword: Hand gesture


신경회로망을 이용한 동적 손 제스처 인식에 관한 연구 (A Study on Dynamic Hand Gesture Recognition Using Neural Networks)

  • 조인석;박진현;최영규
    • 대한전기학회논문지:시스템및제어부문D / Vol. 53, No. 1 / pp.22-31 / 2004
  • This paper deals with vision-based dynamic hand gesture recognition using neural networks. It proposes a global search method and a local search method for recognizing hand gestures: the global search identifies the hand among candidate regions by searching the entire image, while the local search recognizes and tracks only the hand through a block-wise search. The dynamic hand gesture recognition method is based on skin-color and shape analysis using invariant moments and direction information, and the starting and ending points of a dynamic gesture are obtained from the hand shape. Experiments were conducted on hand extraction, hand recognition, and dynamic hand gesture recognition, and the results show the validity of the proposed method.
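As a rough illustration of the skin-color segmentation and invariant-moment shape description mentioned in this abstract, a minimal OpenCV sketch is given below; the HSV thresholds and the use of Hu moments are assumptions for illustration, not the authors' exact implementation.

```python
# Rough sketch: skin-color hand segmentation plus an invariant-moment shape
# descriptor. The HSV range and morphology settings are illustrative
# assumptions, not the paper's values.
import cv2
import numpy as np

def hand_shape_features(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Assumed skin-color range; real systems tune this per lighting and user.
    skin_mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    skin_mask = cv2.morphologyEx(skin_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(skin_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)           # largest skin blob as the hand
    hu = cv2.HuMoments(cv2.moments(hand)).flatten()     # scale/rotation-invariant shape features
    return np.sign(hu) * np.log10(np.abs(hu) + 1e-30)   # log scale for numerical stability
```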

Three Dimensional Hand Gesture Taxonomy for Commands

  • Choi, Eun-Jung;Lee, Dong-Hun;Chung, Min-K.
    • 대한인간공학회지 / Vol. 31, No. 4 / pp.483-492 / 2012
  • Objective: The aim of this study is to suggest a three-dimensional (3D) hand gesture taxonomy that systematically organizes the reasons behind a user's decision to derive a certain gesture. Background: With advances in gesture recognition technology, various researchers have focused on deriving intuitive gestures for commands from users. In most previous studies, the users' reasons for deriving a certain gesture for a command were used only as a reference for grouping gestures. Method: A total of eleven studies that categorized gestures accompanying speech were investigated, and a case study with thirty participants was conducted to understand the gesture-features derived from users. Results: Through the literature review, a total of nine gesture-features were extracted; after the case study, these were narrowed down to seven gesture-features. Conclusion: A three-dimensional hand gesture taxonomy comprising seven gesture-features was developed. Application: The taxonomy might be used as a checklist for understanding users' reasons.

A Notation Method for Three Dimensional Hand Gesture

  • Choi, Eun-Jung;Kim, Hee-Jin;Chung, Min-K.
    • 대한인간공학회지 / Vol. 31, No. 4 / pp.541-550 / 2012
  • Objective: The aim of this study is to suggest a notation method for three-dimensional hand gestures. Background: To match intuitive gestures with product commands, various studies have tried to derive gestures from users. In such cases, many different gestures are derived for a single command owing to users' varied experience, so organizing the gestures systematically and identifying similar patterns among them has become an important issue. Method: Related studies on gesture taxonomy and sign-language notation were investigated. Results: Through the literature review, five elements of static gesture were selected and three forms of dynamic gesture were identified; temporal variability (repetition) was additionally selected. Conclusion: A notation method that follows a combination sequence of the gesture elements was suggested. Application: The notation method might be used to describe and organize user-defined gestures systematically.
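As a rough illustration only, the sketch below encodes such a notation as a record combining several static elements, one dynamic form, and a repetition count; the field names and example values are hypothetical and are not the paper's actual notation elements.

```python
# Hypothetical encoding of a gesture notation record: static elements, one
# dynamic form, and a repetition count. Field names and value sets are
# illustrative only; they do not reproduce the paper's notation.
from dataclasses import dataclass

@dataclass(frozen=True)
class HandGestureNotation:
    handedness: str       # e.g. "left", "right", "both"
    hand_shape: str       # e.g. "fist", "open-palm", "point"
    palm_direction: str   # e.g. "up", "down", "forward"
    location: str         # e.g. "chest", "head", "neutral-space"
    orientation: str      # e.g. "vertical", "horizontal"
    dynamic_form: str     # one of a few dynamic forms, e.g. "path", "rotation", "in-place"
    repetitions: int = 1  # temporal variability (repetition)

    def to_string(self) -> str:
        """Serialize the elements in a fixed order, echoing a combination-sequence notation."""
        parts = [self.handedness, self.hand_shape, self.palm_direction,
                 self.location, self.orientation, self.dynamic_form, f"x{self.repetitions}"]
        return "|".join(parts)

# Example: a repeated rightward swipe with an open palm in front of the chest.
swipe = HandGestureNotation("right", "open-palm", "forward", "chest", "vertical", "path", 2)
print(swipe.to_string())
```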

주거 공간에서의 3차원 핸드 제스처 인터페이스에 대한 사용자 요구사항 (User Needs of Three Dimensional Hand Gesture Interfaces in Residential Environment Based on Diary Method)

  • 정동영;김희진;한성호;이동훈
    • 대한산업공학회지 / Vol. 41, No. 5 / pp.461-469 / 2015
  • The aim of this study is to find out users' needs for a 3D hand gesture interface in the smart home environment: which objects users want to control with such an interface, and why they want to use one. 3D hand gesture interfaces are being studied for application to various devices in the smart environment because they let users control that environment with natural and intuitive hand gestures; given these advantages, identifying users' needs can improve the user experience of a product. The study used a diary method with 20 participants, who recorded their needs for a 3D hand gesture interface over one week. Each diary entry recorded who, when, where, what, and how a 3D hand gesture interface would be used, together with a usefulness score. A total of 322 entries (209 normal entries and 113 error entries) were collected. Common objects and common reasons emerged: users most often wanted to use a 3D hand gesture interface to control the lights, and the most common reason was to overcome restrictions on their hands. The results can inform effective and efficient research on 3D hand gesture interfaces, offering valuable insights for researchers and designers, and could be used to create guidelines for such interfaces.

Hand Gesture Recognition Using an Infrared Proximity Sensor Array

  • Batchuluun, Ganbayar;Odgerel, Bayanmunkh;Lee, Chang Hoon
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 15, No. 3 / pp.186-191 / 2015
  • Hand gestures are the most common tool used to interact with and control various electronic devices. In this paper, we propose a novel hand gesture recognition method using fuzzy-logic-based classification with a new type of sensor array. In some cases, the feature patterns of hand gesture signals cannot be uniquely distinguished when people perform the same gesture in different ways; moreover, differences in hand shape and in the skeletal articulation of the arm influence the process. Numerous features were extracted, and efficient features that make gestures distinguishable were selected. However, similar feature patterns exist across different hand gestures, so fuzzy logic is applied to classify them. Fuzzy rules are defined based on the feature patterns of the input signal, and an adaptive neuro-fuzzy inference system is used to generate the rules automatically for classifying hand gestures from a small number of feature patterns. In addition, emotion expression is performed after hand gesture recognition for the resulting human-robot interaction. The proposed method was tested on several hand gesture datasets and validated with different evaluation metrics. Experimental results show that, compared with existing methods, our method detects more hand gestures in real time, with robust recognition and corresponding emotion expressions.
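A minimal sketch of the fuzzy-logic classification idea is given below, assuming two made-up features and hand-written triangular membership functions; the paper itself generates its rules automatically with an adaptive neuro-fuzzy inference system, which this toy example does not reproduce.

```python
# Toy fuzzy-rule classifier over two illustrative features (e.g. peak
# amplitude and duration of a proximity-sensor signal). Membership
# functions and rules are made up for illustration only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return float(np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                       (c - x) / (c - b + 1e-9)), 0.0))

def classify(peak, duration):
    low_p, high_p = tri(peak, 0.0, 0.2, 0.5), tri(peak, 0.4, 0.8, 1.2)
    short_d, long_d = tri(duration, 0.0, 0.2, 0.6), tri(duration, 0.4, 0.8, 1.5)

    # Rule strength via min (fuzzy AND); each rule votes for one gesture class.
    scores = {
        "swipe": min(high_p, short_d),
        "hover": min(high_p, long_d),
        "tap":   min(low_p, short_d),
    }
    return max(scores, key=scores.get), scores

label, strengths = classify(peak=0.7, duration=0.25)
print(label, strengths)
```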

HSFE Network and Fusion Model based Dynamic Hand Gesture Recognition

  • Tai, Do Nhu;Na, In Seop;Kim, Soo Hyung
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 14, No. 9 / pp.3924-3940 / 2020
  • Dynamic hand gesture recognition (d-HGR) plays an important role in human-computer interaction (HCI) systems. With the growth of hand-pose estimation and 3D depth sensors, depth and hand-skeleton datasets have been proposed, spurring much research on depth-based and 3D hand-skeleton approaches. However, the problem remains challenging because of low resolution, high complexity, and self-occlusion. In this paper, we propose a hand-shape feature extraction (HSFE) network to produce robust hand-shape features. We build LSTM-based hand-shape and hand-skeleton models to exploit the temporal information in hand-shape and motion changes. Fusing the two models gives the best accuracy on the dynamic hand gesture (DHG) dataset.
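A schematic sketch of the two-branch idea, one LSTM over hand-shape features and one over hand-skeleton coordinates with late fusion by concatenation, might look as follows in PyTorch; the layer sizes, class count, and fusion choice are placeholders, not the paper's exact HSFE architecture.

```python
# Schematic two-stream model: one LSTM over per-frame hand-shape feature
# vectors and one over hand-skeleton joint coordinates, fused by
# concatenating the final hidden states. All dimensions are placeholders.
import torch
import torch.nn as nn

class TwoStreamGestureNet(nn.Module):
    def __init__(self, shape_dim=128, skel_dim=66, hidden=256, num_classes=14):
        super().__init__()
        self.shape_lstm = nn.LSTM(shape_dim, hidden, batch_first=True)
        self.skel_lstm = nn.LSTM(skel_dim, hidden, batch_first=True)
        self.head = nn.Linear(2 * hidden, num_classes)

    def forward(self, shape_seq, skel_seq):
        # shape_seq: (B, T, shape_dim), skel_seq: (B, T, skel_dim)
        _, (h_shape, _) = self.shape_lstm(shape_seq)
        _, (h_skel, _) = self.skel_lstm(skel_seq)
        fused = torch.cat([h_shape[-1], h_skel[-1]], dim=1)  # late fusion of final states
        return self.head(fused)

model = TwoStreamGestureNet()
logits = model(torch.randn(2, 30, 128), torch.randn(2, 30, 66))  # 2 clips, 30 frames each
```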

컴퓨터비전을 이용한 손동작 인식에 관한 연구 (A Study on Hand Gesture Recognition using Computer Vision)

  • 박창민
    • 경영과정보연구 / Vol. 4 / pp.395-407 / 2000
  • It is necessary to develop a method that allows humans and computers to interact through hand gestures without any special device. In this paper, a real-time hand gesture recognition system was developed. Using computer vision, the system segments the hand region, recognizes the hand posture, and tracks the movement of the hand, and it requires neither a blue-screen background, a data glove, nor special markers for hand gesture recognition.


다변량 퍼지 의사결정트리와 사용자 적응을 이용한 손동작 인식 (Hand Gesture Recognition using Multivariate Fuzzy Decision Tree and User Adaptation)

  • 전문진;도준형;이상완;박광현;변증남
    • 로봇학회논문지 / Vol. 3, No. 2 / pp.81-90 / 2008
  • With the increasing demand for services for the disabled and the elderly, assistive technologies have developed rapidly. Natural human signals such as voice and gesture have been applied to systems for assisting these users. As an example of such a human-robot interface, the Soft Remote Control System has been developed by HWRS-ERC at KAIST [1]. It is a vision-based hand gesture recognition system for controlling home appliances such as televisions, lamps, and curtains, and one of its most important technologies is the hand gesture recognition algorithm. The problems that most frequently lower the recognition rate are inter-person variation and intra-person variation; intra-person variation can be handled by introducing a fuzzy concept. In this paper, we propose a multivariate fuzzy decision tree (MFDT) learning and classification algorithm for hand motion recognition. To recognize a new user's hand gestures, the most suitable recognition model among several well-trained models is selected by a model selection algorithm and incrementally adapted to the user's gestures. To assess the general performance of MFDT as a classifier, we report classification rates on benchmark data from the UCI repository; for hand gesture recognition, we tested on gesture data collected from 10 people over 15 days. The experimental results show that the classification and user-adaptation performance of the proposed algorithm is better than that of a general fuzzy decision tree.
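The multivariate fuzzy decision tree itself is more elaborate, but the core idea, soft splits whose memberships weight the class distributions accumulated from the leaves, can be sketched as follows; the node parameters and tree structure are invented for illustration and do not reproduce the paper's MFDT learning algorithm.

```python
# Minimal fuzzy-decision-tree-style inference: each internal node applies a
# sigmoid (soft) multivariate split, a sample descends both branches with
# fuzzy memberships, and leaf class distributions are combined with those
# memberships as weights. All parameters are arbitrary placeholders.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(node, x, membership=1.0):
    """Return a membership-weighted class distribution for feature vector x."""
    if "dist" in node:                                   # leaf: class probability vector
        return membership * np.asarray(node["dist"])
    m_right = sigmoid(np.dot(node["w"], x) - node["b"])  # soft split on a linear score
    return (predict(node["left"], x, membership * (1.0 - m_right)) +
            predict(node["right"], x, membership * m_right))

# Tiny hand-built tree over 2-D features; weights and thresholds are arbitrary.
tree = {"w": np.array([1.0, -0.5]), "b": 0.2,
        "left":  {"dist": [0.9, 0.1]},    # mostly gesture class 0
        "right": {"dist": [0.2, 0.8]}}    # mostly gesture class 1
probs = predict(tree, np.array([0.3, 0.7]))
print(probs, probs.argmax())
```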


다문화 손동작 인식을 위한 HOG-HOD 알고리즘 (HOG-HOD Algorithm for Recognition of Multi-cultural Hand Gestures)

  • 김지예;박종일
    • 한국멀티미디어학회논문지 / Vol. 20, No. 8 / pp.1187-1199 / 2017
  • In recent years, research on Natural User Interfaces (NUI) has drawn attention because NUI systems can give users a natural feeling in virtual reality. The most important aspect of an NUI system is how users communicate with the computer, and there are many possible modalities, such as speech, hand gestures, and body actions. Among them, hand gestures are well suited to NUI because people use them with relatively high frequency in daily life, and a hand gesture can carry meaning by itself. These hand gestures are called multi-cultural hand gestures, and we propose a method to recognize this kind of gesture. The proposed method combines the Histogram of Oriented Gradients (HOG), used for hand shape recognition, with the Histogram of Oriented Displacements (HOD), used for recognizing the trajectory of the hand center point.
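The trajectory half of the pipeline (HOD) can be sketched as a magnitude-weighted histogram over the orientations of successive hand-center displacements, with the HOG half delegated to an off-the-shelf implementation; the bin count and normalization below are assumptions, not the paper's settings.

```python
# Sketch of the two descriptors: HOG over a cropped grayscale hand patch for
# shape, and a histogram of oriented displacements (HOD) over the hand-center
# trajectory for motion. Bin counts and normalization are illustrative.
import numpy as np
from skimage.feature import hog

def hod_descriptor(trajectory, n_bins=8):
    """trajectory: (T, 2) array of hand-center (x, y) positions over time."""
    d = np.diff(np.asarray(trajectory, dtype=float), axis=0)   # frame-to-frame displacements
    angles = np.arctan2(d[:, 1], d[:, 0])                      # orientation of each displacement
    magnitudes = np.hypot(d[:, 0], d[:, 1])
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi), weights=magnitudes)
    return hist / (hist.sum() + 1e-9)                          # magnitude-weighted, normalized

def hand_descriptor(hand_patch_gray, trajectory):
    shape_feat = hog(hand_patch_gray, orientations=9, pixels_per_cell=(8, 8),
                     cells_per_block=(2, 2))                   # HOG shape descriptor
    return np.concatenate([shape_feat, hod_descriptor(trajectory)])
```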

Hand Gesture Recognition using Improved Hidden Markov Models

  • Xu, Wenkai;Lee, Eung-Joo
    • 한국멀티미디어학회논문지 / Vol. 14, No. 7 / pp.866-871 / 2011
  • In this paper, an improved method for hand detection and hand gesture recognition is proposed that can be applied under different illumination conditions and against complex backgrounds. We use an Adaptive Skin Threshold (AST) to detect hand regions, and the detection result is then fed to gesture recognition through an improved HMM algorithm. Finally, we design a simple program that uses the recognition result to distinguish the three hand gestures "stone, scissors, cloth." Experimental results show that hands and gestures can be detected and recognized with a high average recognition rate (92.41%), outperforming other methods such as syntactical analysis and neural-network-based approaches.
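To make the HMM scoring step concrete, a minimal forward algorithm for a discrete-observation HMM is sketched below; in a recognizer of this kind, one model per gesture class is trained and a test sequence is assigned to the class whose model gives the highest likelihood. The parameters shown are arbitrary placeholders, and the paper's specific HMM improvements are not reproduced.

```python
# Minimal scaled forward algorithm for a discrete-observation HMM.
# In a gesture recognizer, each gesture class has its own (pi, A, B) model,
# and a quantized test sequence is scored against every model.
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """obs: list of observation symbol indices; pi: (N,) initial probabilities;
    A: (N, N) transition matrix; B: (N, M) emission matrix."""
    alpha = pi * B[:, obs[0]]
    log_lik = 0.0
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        scale = alpha.sum()              # rescale to avoid numerical underflow
        log_lik += np.log(scale + 1e-300)
        alpha /= (scale + 1e-300)
    return log_lik + np.log(alpha.sum() + 1e-300)

# Placeholder 2-state, 3-symbol model and a short quantized observation sequence.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
score = forward_log_likelihood([0, 1, 2, 2], pi, A, B)
print(score)
```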