• Title/Summary/Keyword: Gesture Recognition Systems


Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Konkina, Nataliia
    • International Journal of Computer Science & Network Security / v.22 no.1 / pp.129-138 / 2022
  • With the increasing reliance on computing systems in our everyday life, there is a constant need to improve the ways users can interact with such systems in a more natural, effective, and convenient way. In the initial computing revolution, the interaction between humans and machines was limited; the machines were not necessarily meant to be intelligent. This created the need for systems that can automatically identify and interpret our actions. Automatic gesture recognition is one of the popular methods by which users can control systems with their gestures, and it covers various kinds of tracking, including the whole body, hands, head, and face. We also touch upon a different line of work, including Brain-Computer Interfaces (BCI) and Electromyography (EMG), as potential additions to the gesture recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.

Improvement of Gesture Recognition using 2-stage HMM (2단계 히든마코프 모델을 이용한 제스쳐의 성능향상 연구)

  • Jung, Hwon-Jae; Park, Hyeonjun; Kim, Donghan
    • Journal of Institute of Control, Robotics and Systems / v.21 no.11 / pp.1034-1037 / 2015
  • In recent years in the field of robotics, various methods have been developed to create an intimate relationship between people and robots. These methods include speech, vision, and biometrics recognition as well as gesture-based interaction. These recognition technologies are used in various wearable devices, smartphones, and other electronic devices for convenience. Among these technologies, gesture recognition is the most commonly used and appropriate technology for wearable devices. Gesture recognition can be classified as contact or non-contact gesture recognition. This paper proposes contact gesture recognition with IMU and EMG sensors by applying the hidden Markov model (HMM) twice. Several simple behaviors form the main gestures through the first-stage HMM, which is the standard hidden Markov model process well known in pattern recognition. The sequence of main gestures produced by the first-stage HMM then forms higher-order gestures through the second-stage HMM. In this way, more natural and intelligent gestures can be implemented from simple gestures. This advanced process can play a larger role in gesture recognition-based UX for many wearable and smart devices.
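
A minimal sketch of the two-stage idea described above, assuming discrete (quantized) sensor symbols: first-stage HMMs score windows of sensor data as primitive gestures, and a second-stage HMM scores the resulting label sequence as a higher-order gesture. All model parameters, symbols, and gesture names below are illustrative placeholders, not values from the paper.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a discrete observation sequence."""
    alpha = pi * B[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_p += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return log_p

def classify(obs, models):
    """Pick the gesture model with the highest likelihood for the sequence."""
    return max(models, key=lambda name: forward_log_likelihood(obs, *models[name]))

# Stage 1: primitive-gesture HMMs over 3 quantized accelerometer symbols (toy numbers).
prims = {
    "lift":  (np.array([0.9, 0.1]), np.array([[0.7, 0.3], [0.2, 0.8]]),
              np.array([[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]])),
    "swipe": (np.array([0.5, 0.5]), np.array([[0.6, 0.4], [0.4, 0.6]]),
              np.array([[0.1, 0.8, 0.1], [0.1, 0.2, 0.7]])),
}
# Stage 2: higher-order gesture HMMs whose observations are stage-1 labels (0=lift, 1=swipe).
highs = {
    "wave":   (np.array([1.0, 0.0]), np.array([[0.3, 0.7], [0.1, 0.9]]),
               np.array([[0.9, 0.1], [0.2, 0.8]])),
    "beckon": (np.array([0.5, 0.5]), np.array([[0.8, 0.2], [0.2, 0.8]]),
               np.array([[0.5, 0.5], [0.5, 0.5]])),
}

windows = [[0, 0, 0, 2], [1, 1, 2, 1], [1, 2, 1, 1]]     # quantized sensor windows
labels = [classify(w, prims) for w in windows]            # first-stage output
label_ids = [0 if l == "lift" else 1 for l in labels]
print(labels, "->", classify(label_ids, highs))           # second-stage output
```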

A Study on Hand Gesture Recognition using Computer Vision (컴퓨터비전을 이용한 손동작 인식에 관한 연구)

  • Park, Chang-Min
    • Management & Information Systems Review / v.4 / pp.395-407 / 2000
  • It is necessary to develop a method by which humans and computers can interact through hand gestures without any special device. In this thesis, a real-time hand gesture recognition system was developed. Using computer vision, the system segments the hand region, recognizes the hand posture, and tracks the movement of the hand. It requires neither a blue-screen background, nor a data glove, nor special markers for recognizing the hand gesture.
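
As a rough, self-contained illustration of the kind of pipeline the abstract describes (segment the hand region, then track its movement), the sketch below does colour-range segmentation and centroid tracking on synthetic frames with NumPy; the thresholds and frames are made up and this is not the thesis's actual algorithm.

```python
import numpy as np

def segment_hand(frame_rgb, lo=(120, 60, 40), hi=(255, 180, 150)):
    """Boolean mask of pixels whose RGB values fall inside a skin-like colour range."""
    lo, hi = np.array(lo), np.array(hi)
    return np.all((frame_rgb >= lo) & (frame_rgb <= hi), axis=-1)

def centroid(mask):
    """Centre of mass of the segmented region, or None if nothing was found."""
    ys, xs = np.nonzero(mask)
    return None if len(xs) == 0 else (xs.mean(), ys.mean())

# Two synthetic 64x64 frames containing a 'hand' patch that moves to the right.
frames = []
for shift in (10, 20):
    f = np.zeros((64, 64, 3), dtype=np.uint8)
    f[20:40, shift:shift + 15] = (200, 140, 110)   # skin-like patch
    frames.append(f)

track = [centroid(segment_hand(f)) for f in frames]
print("hand centroids per frame:", track)          # movement of the hand across frames
```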


Residual Learning Based CNN for Gesture Recognition in Robot Interaction

  • Han, Hua
    • Journal of Information Processing Systems / v.17 no.2 / pp.385-398 / 2021
  • The complexity of deep learning models affects the real-time performance of gesture recognition, thereby limiting the application of gesture recognition algorithms in actual scenarios. Hence, a residual learning neural network based on a deep convolutional neural network is proposed. First, small convolution kernels are used to extract the local details of gesture images. Subsequently, a shallow residual structure is built to share weights, avoiding vanishing or exploding gradients as the network deepens and thereby reducing the difficulty of model optimisation. Additional convolutional neural networks are used to accelerate the refinement of deep abstract features based on the spatial importance of the gesture feature distribution. Finally, a fully connected cascade softmax classifier is used to complete the gesture recognition. Compared with densely connected networks that multiplex feature information, the proposed algorithm optimises feature reuse to avoid performance fluctuations caused by feature redundancy. Experimental results on the IsoGD gesture dataset and the Gesture dataset show that the proposed algorithm affords fast convergence and high accuracy.
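
The core building block the abstract describes, a shallow residual structure with small convolution kernels feeding a fully connected softmax classifier, can be sketched in PyTorch as below; the layer sizes, number of blocks, and class count are placeholders rather than the paper's exact architecture.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity shortcut."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)   # shortcut keeps gradients flowing in deeper stacks

class GestureNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, 32, kernel_size=3, padding=1)
        self.blocks = nn.Sequential(ResidualBlock(32), ResidualBlock(32))
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(32, num_classes)   # softmax is applied by the loss at training time

    def forward(self, x):
        x = self.pool(self.blocks(torch.relu(self.stem(x)))).flatten(1)
        return self.fc(x)

logits = GestureNet()(torch.randn(2, 3, 64, 64))
print(logits.shape)   # torch.Size([2, 10])
```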

A Hand Gesture Recognition Method using Inertial Sensor for Rapid Operation on Embedded Device

  • Lee, Sangyub; Lee, Jaekyu; Cho, Hyeonjoong
    • KSII Transactions on Internet and Information Systems (TIIS) / v.14 no.2 / pp.757-770 / 2020
  • We propose a hand gesture recognition method that is compatible with a head-up display (HUD) with limited processing resources. For fast link adaptation with the HUD, it is necessary to process gesture recognition rapidly and to send only the minimum amount of driver hand gesture data from the wearable device. Therefore, we use a method that recognizes each hand gesture with an inertial measurement unit (IMU) sensor based on revised correlation matching. Gestures are recognized by calculating the correlation between every axis of the acquired data set, and by classifying pre-defined gesture values and actions, the proposed method enables rapid recognition. Furthermore, we evaluate the performance of the algorithm, which can be embedded within wearable bands and requires a minimal processing load. The experimental results demonstrate the feasibility and effectiveness of our decomposed correlation matching method. We also tested the proposed algorithm with pre-defined gestures of specific motions on a wearable platform device, which further validated the proposed hand gesture recognition system. Despite being based on a very simple concept, the proposed algorithm showed good recognition accuracy.
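
A hedged sketch of the correlation-matching idea, assuming three accelerometer axes and a small set of stored templates: each incoming window is compared to every template axis by axis with a Pearson correlation, and weak matches are rejected. Templates, thresholds, and gesture names are illustrative, not the paper's.

```python
import numpy as np

def axis_correlation(window, template):
    """Mean Pearson correlation across the sensor axes of a window and a template."""
    scores = []
    for axis in range(window.shape[1]):
        scores.append(np.corrcoef(window[:, axis], template[:, axis])[0, 1])
    return float(np.mean(scores))

def recognize(window, templates, threshold=0.6):
    """Return the best-matching gesture name, or None if no template matches well."""
    name, score = max(((n, axis_correlation(window, t)) for n, t in templates.items()),
                      key=lambda kv: kv[1])
    return name if score >= threshold else None   # reject weak matches

t = np.linspace(0, 1, 50)
templates = {
    "tap":   np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), np.sin(4 * np.pi * t)], axis=1),
    "shake": np.stack([np.sin(8 * np.pi * t), np.cos(8 * np.pi * t), np.sin(16 * np.pi * t)], axis=1),
}
window = templates["shake"] + 0.1 * np.random.randn(50, 3)   # noisy incoming gesture
print(recognize(window, templates))                           # expected: "shake"
```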

Study on gesture recognition based on IIDTW algorithm

  • Tian, Pei; Chen, Guozhen; Li, Nianfeng
    • KSII Transactions on Internet and Information Systems (TIIS) / v.13 no.12 / pp.6063-6079 / 2019
  • When the sampled data sequence is too long, gesture recognition based on the traditional Dynamic Time Warping (DTW) algorithm requires excessive computation time, and the recognition accuracy is not high. The support vector machine (SVM) has shortcomings in precision, and the Edit Distance on Real Sequences (EDR) algorithm does not guarantee that noise suppression will not also suppress valid data. A new method based on the Improved Interpolation Dynamic Time Warping (IIDTW) algorithm is proposed to improve both the efficiency and the accuracy of gesture recognition. The results show that the computational efficiency of the IIDTW algorithm is more than twice that of the SVM-DTW algorithm, the false acceptance rate (FAR) is reduced by 0.01%, and the false rejection rate (FRR) is reduced by 0.5%. Gesture recognition based on the IIDTW algorithm achieves better recognition performance; if applied to mobile phone unlocking, it is expected to become a new generation of unlock mode.
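
For reference, the baseline DTW distance that IIDTW builds on can be written in a few lines; the interpolation-based improvements of IIDTW itself are not reproduced here, and the example sequences are synthetic.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

slow = np.sin(np.linspace(0, 2 * np.pi, 40))    # same gesture performed slowly
fast = np.sin(np.linspace(0, 2 * np.pi, 25))    # ... and quickly
other = np.cos(np.linspace(0, 4 * np.pi, 40))   # a different gesture
print(dtw_distance(slow, fast), "<", dtw_distance(slow, other))
```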

Hand Gesture Recognition Using an Infrared Proximity Sensor Array

  • Batchuluun, Ganbayar; Odgerel, Bayanmunkh; Lee, Chang Hoon
    • International Journal of Fuzzy Logic and Intelligent Systems / v.15 no.3 / pp.186-191 / 2015
  • Hand gestures are the most common tool used to interact with and control various electronic devices. In this paper, we propose a novel hand gesture recognition method using fuzzy logic based classification with a new type of sensor array. In some cases, feature patterns of hand gesture signals cannot be uniquely distinguished and recognized when people perform the same gesture in different ways. Moreover, differences in hand shape and in the skeletal articulation of the arm influence the process. Manifold features were extracted, and efficient features, which make gestures distinguishable, were selected. However, similar feature patterns exist across different hand gestures, and fuzzy logic is applied to classify them. Fuzzy rules are defined based on the many feature patterns of the input signal, and an adaptive neuro-fuzzy inference system is used to generate the rules automatically, classifying hand gestures from a small number of feature patterns as input. In addition, emotion expression was conducted after the hand gesture recognition for the resulting human-robot interaction. Our proposed method was tested with many hand gesture datasets and validated with different evaluation metrics. Experimental results show that, compared with other existing methods, our method detects more hand gestures in real time, with robust hand gesture recognition and corresponding emotion expressions.
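
A hand-written fuzzy-rule sketch of the classification step is shown below: triangular memberships over two toy features (swipe speed and peak direction across the sensor array), with the gesture chosen by the strongest firing rule. The paper generates its rules automatically with an adaptive neuro-fuzzy inference system; this only illustrates the basic fuzzy-inference idea, and the features and breakpoints are invented.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify(speed, direction):
    """Fire each rule with min() of its antecedent memberships; pick the strongest."""
    slow, fast = tri(speed, 0, 0.2, 0.6), tri(speed, 0.4, 1.0, 1.6)
    left, right = tri(direction, -1.5, -1.0, 0.0), tri(direction, 0.0, 1.0, 1.5)
    rules = {
        "slow swipe left":  min(slow, left),
        "fast swipe left":  min(fast, left),
        "slow swipe right": min(slow, right),
        "fast swipe right": min(fast, right),
    }
    return max(rules, key=rules.get), rules

print(classify(speed=0.9, direction=0.8)[0])   # expected: "fast swipe right"
```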

Interactive visual knowledge acquisition for hand-gesture recognition (손 제스쳐 인식을 위한 상호작용 시각정보 추출)

  • 양선옥; 최형일
    • Journal of the Korean Institute of Telematics and Electronics B / v.33B no.9 / pp.88-96 / 1996
  • Computer vision-based gesture recognition systems consist of image segmentation, object tracking, and decision. However, it is difficult to segment the gesturing object from an image because of varying illumination and backgrounds. In this paper, we describe a method for learning segmentation features that improves the performance of computer vision-based hand-gesture recognition systems. The system interacts with a user to acquire exact training data and segmentation information according to a predefined plan: it presents models to the user, takes pictures of the user's response, and then analyzes the pictures using the models and prior knowledge. The system sends messages to the user and runs a learning module to extract information from the analyzed results.
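
A rough sketch of the acquisition idea, under simplifying assumptions: because the system tells the user where to place the hand, pixels inside the prompted region can be treated as labelled foreground and used to learn a per-channel colour range that later segments new frames. The region coordinates, colours, and margin below are invented for illustration.

```python
import numpy as np

def learn_colour_range(frame, roi, margin=15):
    """Learn min/max per-channel colour bounds from the prompted hand region."""
    y0, y1, x0, x1 = roi
    samples = frame[y0:y1, x0:x1].reshape(-1, 3).astype(int)
    return samples.min(axis=0) - margin, samples.max(axis=0) + margin

def segment(frame, lo, hi):
    """Mask of pixels falling inside the learned colour range."""
    return np.all((frame >= lo) & (frame <= hi), axis=-1)

# Training frame: the user was asked to place the hand inside rows 10-30, cols 10-30.
train = np.random.randint(0, 60, (64, 64, 3))
train[10:30, 10:30] = np.random.randint(170, 210, (20, 20, 3))
lo, hi = learn_colour_range(train, roi=(10, 30, 10, 30))

# New frame with the hand somewhere else: the learned range still finds it.
test = np.random.randint(0, 60, (64, 64, 3))
test[35:55, 40:60] = np.random.randint(170, 210, (20, 20, 3))
print("segmented pixels:", int(segment(test, lo, hi).sum()))   # roughly 20 * 20
```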


Navigation of a Mobile Robot Using Hand Gesture Recognition (손 동작 인식을 이용한 이동로봇의 주행)

  • Kim, Il-Myeong; Kim, Wan-Cheol; Yun, Gyeong-Sik; Lee, Jang-Myeong
    • Journal of Institute of Control, Robotics and Systems / v.8 no.7 / pp.599-606 / 2002
  • A new method to govern the navigation of a mobile robot using hand gesture recognition is proposed, based on two procedures. One is to acquire vision information by using a 2-DOF camera as a communication medium between a human and the mobile robot, and the other is to analyze and control the mobile robot according to the recognized hand gesture commands. In previous research, mobile robots moved passively by following landmarks, beacons, etc. In this paper, to accommodate various changes of situation, a new control system that manages the dynamic navigation of the mobile robot is proposed. Moreover, without the expensive equipment or complex algorithms generally used for hand gesture recognition, a reliable hand gesture recognition system is efficiently implemented to convey human commands to the mobile robot with only a few constraints.
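
On the control side, the mapping from recognized gesture commands to robot motion can be as simple as a lookup from gesture label to a (linear, angular) velocity pair, as in the toy sketch below; the gesture names and velocity values are placeholders, not the paper's command set.

```python
# Map recognized hand gestures to (linear m/s, angular rad/s) velocity commands.
GESTURE_TO_CMD = {
    "forward": (0.3, 0.0),
    "stop":    (0.0, 0.0),
    "left":    (0.1, 0.5),
    "right":   (0.1, -0.5),
}

def dispatch(gesture, unknown=(0.0, 0.0)):
    """Translate a recognized hand gesture into a velocity command; unknown gestures stop the robot."""
    return GESTURE_TO_CMD.get(gesture, unknown)

for g in ["forward", "left", "wave"]:
    print(g, "->", dispatch(g))
```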

Gesture Recognition and Motion Evaluation Using Appearance Information of Pose in Parametric Gesture Space (파라메트릭 제스처 공간에서 포즈의 외관 정보를 이용한 제스처 인식과 동작 평가)

  • Lee, Chil-Woo; Lee, Yong-Jae
    • Journal of Korea Multimedia Society / v.7 no.8 / pp.1035-1045 / 2004
  • In this paper, we describe a method that can recognize gestures and evaluate the degree of the gestures from sequential gesture images by using the Gesture Feature Space. Previous popular methods based on HMMs and neural networks have difficulty recognizing the degree of a gesture even though they can classify gestures into several kinds. However, our proposed method can recognize not only the posture but also degree information of the gestures, such as speed and magnitude, by calculating distances among the position vectors obtained by mapping input and model images into a parametric eigenspace. This method, which can be applied in various applications such as intelligent interface systems and surveillance systems, is a simple and robust recognition algorithm.
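
A compact sketch of the parametric eigenspace idea, using random arrays as stand-ins for pose images: model frames are projected onto a few principal components, and an input frame is recognized by its distance to the projected model points, whose index also carries the phase (degree) of the gesture along the model trajectory. The dimensions and data are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
model_frames = rng.normal(size=(30, 400))                      # 30 model poses, flattened 20x20 images
input_frame = model_frames[12] + 0.05 * rng.normal(size=400)   # a noisy observed pose

# Build the eigenspace from the mean-centred model frames.
mean = model_frames.mean(axis=0)
U, S, Vt = np.linalg.svd(model_frames - mean, full_matrices=False)
basis = Vt[:5]                                                 # top-5 eigenvectors span the gesture space

# Project the model trajectory and the query into the eigenspace, then match by distance.
model_pts = (model_frames - mean) @ basis.T
query = (input_frame - mean) @ basis.T
dists = np.linalg.norm(model_pts - query, axis=1)
print("closest model pose:", int(dists.argmin()))              # expected: 12
```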
