• Title/Summary/Keyword: Gesture Control

Hand Gesture Recognition Using an Infrared Proximity Sensor Array

  • Batchuluun, Ganbayar;Odgerel, Bayanmunkh;Lee, Chang Hoon
    • International Journal of Fuzzy Logic and Intelligent Systems / v.15 no.3 / pp.186-191 / 2015
  • Hand gestures are the most common tool used to interact with and control various electronic devices. In this paper, we propose a novel hand gesture recognition method that uses fuzzy logic based classification with a new type of sensor array. In some cases, feature patterns of hand gesture signals cannot be uniquely distinguished and recognized when people perform the same gesture in different ways. Moreover, differences in hand shape and in the skeletal articulation of the arm influence the process. Manifold features were extracted, and efficient features that make gestures distinguishable were selected. However, similar feature patterns still occur across different hand gestures, so fuzzy logic is applied to classify them. Fuzzy rules are defined based on the many feature patterns of the input signal. An adaptive neuro-fuzzy inference system (ANFIS) was used to generate the fuzzy rules automatically, classifying hand gestures from a small number of feature patterns as input. In addition, emotion expression was performed after hand gesture recognition for the resulting human-robot interaction. Our proposed method was tested with many hand gesture datasets and validated with different evaluation metrics. Experimental results show that, compared with other existing methods, our method detects more hand gestures, with robust hand gesture recognition and corresponding emotion expressions, in real time.
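
A minimal sketch of the fuzzy-rule classification step described above. The two features (peak amplitude and duration), the membership function parameters, and the three gesture classes are illustrative assumptions; the paper derives its rules automatically with ANFIS rather than writing them by hand.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a to b and falling to c."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def classify_gesture(peak, duration):
    """Toy Mamdani-style classifier: fuzzify two assumed features and fire
    hand-written rules (AND = min); each rule votes for one gesture class."""
    peak_low  = tri(peak, 0.0, 0.2, 0.5)       # weak sensor amplitude
    peak_high = tri(peak, 0.4, 0.8, 1.2)       # strong sensor amplitude
    dur_short = tri(duration, 0.0, 0.2, 0.6)   # brief gesture (seconds)
    dur_long  = tri(duration, 0.4, 0.8, 1.4)   # sustained gesture (seconds)

    rules = {
        "swipe": min(peak_high, dur_short),    # strong, quick signal
        "hover": min(peak_high, dur_long),     # strong, sustained signal
        "none":  min(peak_low,  dur_short),    # weak, brief signal
    }
    return max(rules, key=rules.get), rules

label, strengths = classify_gesture(peak=0.8, duration=0.3)
print(label, strengths)  # -> "swipe" with the highest rule strength
```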

Hand gesture recognition for player control

  • Shi, Lan Yan;Kim, Jin-Gyu;Yeom, Dong-Hae;Joo, Young-Hoon
    • Proceedings of the KIEE Conference / 2011.07a / pp.1908-1909 / 2011
  • Hand gesture recognition has been widely used in virtual reality and HCI (Human-Computer Interaction) systems and is a challenging and interesting subject in the vision-based area. Existing approaches for vision-driven interactive user interfaces resort to technologies such as head tracking, face and facial expression recognition, eye tracking, and gesture recognition. The purpose of this paper is to combine a finite state machine (FSM) with a gesture recognition method in order to control Windows Media Player, including play/pause, next, previous, and volume up/down.
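
A minimal sketch, under assumed state and gesture names, of how an FSM can gate recognized gestures into the media-player commands listed above; it is not the authors' exact state design.

```python
# Toy FSM: a control gesture must "arm" the interface before a command
# gesture (play/pause, next, previous, volume up/down) is accepted.
class PlayerFSM:
    def __init__(self):
        self.state = "IDLE"

    def on_gesture(self, gesture):
        if self.state == "IDLE":
            if gesture == "open_palm":           # assumed arming gesture
                self.state = "ARMED"
            return None
        if self.state == "ARMED":
            command = {
                "fist": "play_pause",
                "swipe_right": "next",
                "swipe_left": "previous",
                "point_up": "volume_up",
                "point_down": "volume_down",
            }.get(gesture)
            self.state = "IDLE"                  # one command per arming
            return command

fsm = PlayerFSM()
for g in ["open_palm", "swipe_right", "fist"]:
    cmd = fsm.on_gesture(g)
    if cmd:
        print("send to media player:", cmd)      # prints "next" only
```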

Navigation of a Mobile Robot Using the Hand Gesture Recognition

  • Kim, Il-Myung;Kim, Wan-Cheol;Yun, Jae-Mu;Jin, Tae-Seok;Lee, Jang-Myung
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2001.10a / pp.126.3-126 / 2001
  • A new method to govern the navigation of a mobile robot is proposed, based on the following two procedures: one is to acquire vision information using a 2-DOF camera as a communication medium between a human and the mobile robot, and the other is to analyze the recognized hand gesture commands and act accordingly. In previous research, mobile robots moved passively by following landmarks, beacons, etc. To accommodate various changes in the situation, a new control system manages the dynamic navigation of the mobile robot. Moreover, without any of the commonly used expensive equipment or complex algorithms for hand gesture recognition, a reliable hand gesture recognition system is efficiently implemented to convey human commands to the mobile robot with only a few constraints.
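
As a rough sketch of the command-conveying step described above, the mapping below turns recognized gesture labels into linear/angular velocity setpoints for a differential-drive robot. The gesture labels, velocity values, and the send_twist callback are illustrative assumptions, not the paper's actual command set.

```python
# Hypothetical gesture-to-motion mapping: each recognized gesture label
# becomes a (linear m/s, angular rad/s) setpoint for the robot base.
GESTURE_TO_TWIST = {
    "forward":    (0.3,  0.0),
    "backward":   (-0.2, 0.0),
    "turn_left":  (0.0,  0.5),
    "turn_right": (0.0, -0.5),
    "stop":       (0.0,  0.0),
}

def command_robot(gesture, send_twist):
    """Look up the gesture and hand the setpoint to a robot driver callback."""
    twist = GESTURE_TO_TWIST.get(gesture, (0.0, 0.0))  # unknown gesture -> stop
    send_twist(*twist)

command_robot("turn_left", lambda v, w: print(f"v={v} m/s, w={w} rad/s"))
```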

A motion control of robot manipulator by hand glove gesture (손동작 인식 로봇 동작 제어)

  • An, Hyo-min;Lee, Yong-Gyu;Kim, Hyung-Jong;Hyun, Woong-Keun
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2022.10a / pp.566-569 / 2022
  • In this paper, an algorithm was developed to recognize hand glove gestures, and a system was implemented to control a robot remotely. The system consists of a camera and a controller that controls the robot motion from the gestured hand position. The camera recognizes the specific color of the glove and outputs the recognized range and position as a rectangle enclosing the glove's color area. The velocity vector for the robot motion is derived from the output position data and the detected rectangle, and the robot is controlled accordingly. Through several experiments, it was confirmed that the robot motion control was performed successfully.
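
A minimal OpenCV sketch of the pipeline described above: threshold the glove color in HSV, take the largest blob's bounding rectangle, and derive a velocity vector from the motion of its center between frames. The HSV range, the OpenCV 4 findContours signature, and the pixels-per-second scaling are assumptions for illustration.

```python
import cv2
import numpy as np

# Illustrative HSV range for a brightly colored glove (tune for the real glove).
LOWER, UPPER = np.array([100, 120, 80]), np.array([130, 255, 255])

def glove_box(frame_bgr):
    """Return the bounding rectangle (x, y, w, h) of the largest glove-colored
    blob, or None if no glove-colored pixels are found."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))

def velocity_from_boxes(prev_box, cur_box, dt):
    """Velocity vector of the box center between two frames, in pixels/second."""
    (px, py, pw, ph), (cx, cy, cw, ch) = prev_box, cur_box
    prev_c = (px + pw / 2, py + ph / 2)
    cur_c = (cx + cw / 2, cy + ch / 2)
    return ((cur_c[0] - prev_c[0]) / dt, (cur_c[1] - prev_c[1]) / dt)
```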

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game (FPS 게임에서 제스처 인식과 VR 컨트롤러를 이용한 게임 상호 작용 제어 설계)

  • Lee, Yong-Hwan;Ahn, Hyochang
    • Journal of the Semiconductor & Display Technology / v.18 no.4 / pp.116-119 / 2019
  • User interface/experience and realistic game manipulation play an important role in virtual reality first-person-shooter (FPS) games. This paper presents an intuitive, hands-free gaming interaction scheme for FPS games based on the user's gesture recognition and VR controllers. We focus on the conventional interface of VR FPS interaction and design the player interaction around a head-mounted display with two motion controllers: a Leap Motion to handle low-level physics interaction and a VIVE tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy an immersive FPS and gives them a new gaming experience.

Comparison of Gesture Characteristics of Career Teachers and Novice Teachers in Elementary Science Class - Focused on the 5th Grade Unit of the Function and Structure of Our Body - (초등과학 수업에서 경력교사와 초보교사의 제스처 특징 비교 - 우리 몸의 구조와 기능 단원을 중심으로 -)

  • Jeong, Jun Yong;Shin, Dong Hoon
    • Journal of Korean Elementary Science Education / v.37 no.3 / pp.296-308 / 2018
  • The purpose of this study is to analyze the characteristics of, and differences in, gestures between career teachers and novice teachers in elementary science classes. A gesture analysis framework was developed to analyze the gestures of elementary science teachers. The teachers who participated in the experiment were two novice teachers and two career teachers. We analyzed the 'bones and muscles', 'digestion', 'breathing', and 'excretion' lessons of the 'Our Body' unit in the second semester of the 5th grade. The class scenes were video-recorded with a camcorder and analyzed with Observer XT. The results of this study are summarized as follows. First, career teachers used fewer unnecessary gestures than novice teachers; during class, they made fewer gestures unrelated to the context of the lesson. These differences were more prominent in group activities with many unexpected situations than in activities the teachers had prepared in advance. Second, career teachers used more communication-controlling acts than novice teachers; they often made adjustments to elicit learner presentations or to curb unnecessary utterances. Third, career teachers interacted efficiently with learners by using gestures that can enhance communication.

Object Detection Using Predefined Gesture and Tracking (약속된 제스처를 이용한 객체 인식 및 추적)

  • Bae, Dae-Hee;Yi, Joon-Hwan
    • Journal of the Korea Society of Computer and Information / v.17 no.10 / pp.43-53 / 2012
  • In this paper, a gesture-based user interface is proposed that detects an object using a predefined gesture and then tracks the detected object. For object detection, moving objects in a frame are found by comparing multiple previous frames, and the predefined gesture is used to pick the target object out of those moving objects. Any object performing the predefined gesture can then be used for control. We also propose an object tracking algorithm, a density-based mean-shift algorithm, that uses the color distribution of the target object. The proposed tracking algorithm tracks a target object crossing a similarly colored background more accurately than existing techniques. Experimental results show that the proposed detection and tracking algorithms achieve higher detection capability with less computational complexity.
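
The two components described above can be roughed out as follows: a motion mask built from several previous frames, and one step of plain color-histogram mean-shift, i.e. the baseline that the paper's density-based variant builds on. Threshold values and histogram setup are illustrative assumptions.

```python
import cv2
import numpy as np

def motion_mask(frames, thresh=25):
    """Moving-object mask: OR of absolute differences between the newest frame
    and each earlier frame (a simple stand-in for multi-frame comparison)."""
    cur = cv2.cvtColor(frames[-1], cv2.COLOR_BGR2GRAY)
    mask = np.zeros_like(cur)
    for prev in frames[:-1]:
        diff = cv2.absdiff(cur, cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY))
        mask |= cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)[1]
    return mask

def track_meanshift(frame, window, hue_hist):
    """One step of standard hue-histogram mean-shift: back-project the target's
    hue histogram and shift the search window toward the densest region."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hue_hist, [0, 180], 1)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    _, window = cv2.meanShift(backproj, window, criteria)
    return window
```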

The Study on Gesture Recognition for Fighting Games based on Kinect Sensor (키넥트 센서 기반 격투액션 게임을 위한 제스처 인식에 관한 연구)

  • Kim, Jong-Min;Kim, Eun-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2018.10a / pp.552-555 / 2018
  • This study developed a gesture recognition method using the Kinect sensor and proposed a fighting-action control interface. To extract the pattern features of a gesture, the method considers body proportions relative to the shoulders rather than absolute positions. Even when the same gesture is made, the positional coordinates of each joint captured by the Kinect sensor can differ depending on the length and direction of the arm. Therefore, this study applied principal component analysis for gesture modeling and analysis; this helps reduce the effect of data errors and provides dimensionality reduction. In addition, this study proposed a modified matching algorithm to reduce the motion restrictions of the gesture recognition system.
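
A minimal sketch of the shoulder-relative normalization and PCA steps described above; the joint indices, feature layout, and number of principal components are assumptions, not the paper's actual configuration.

```python
import numpy as np

def normalize_joints(joints, l_shoulder, r_shoulder):
    """Express joint coordinates relative to the shoulder line: origin at the
    shoulder midpoint, scaled by shoulder width, so features depend on body
    proportions rather than absolute position in the sensor frame."""
    center = (joints[l_shoulder] + joints[r_shoulder]) / 2
    width = np.linalg.norm(joints[l_shoulder] - joints[r_shoulder])
    return ((joints - center) / width).ravel()

def pca_fit(features, n_components=8):
    """Plain PCA via SVD on mean-centered gesture feature vectors."""
    mean = features.mean(axis=0)
    _, _, vt = np.linalg.svd(features - mean, full_matrices=False)
    return mean, vt[:n_components]

def pca_project(x, mean, basis):
    return basis @ (x - mean)

# Example: 100 training gestures, 20 joints with (x, y, z) each;
# joint indices 4 and 8 are assumed to be the left/right shoulders.
rng = np.random.default_rng(0)
train = rng.normal(size=(100, 20, 3))
feats = np.array([normalize_joints(f, 4, 8) for f in train])
mean, basis = pca_fit(feats)
compressed = pca_project(feats[0], mean, basis)   # 8-dimensional feature
```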

Hand Gesture Recognition using DP Matching from USB Camera Video (USB 카메라 영상에서 DP 매칭을 이용한 사용자의 손 동작 인식)

  • Ha, Jin-Young;Byeon, Min-Woo;Kim, Jin-Sik
    • Journal of Industrial Technology / v.29 no.A / pp.47-54 / 2009
  • In this paper, we propose hand detection and hand gesture recognition from USB camera video. First, we extract the hand region using skin color information from difference images; a background image is initially stored and subtracted from the input images in order to reduce problems caused by complex backgrounds. After that, a 16-directional chain code sequence is computed from the tracked hand motion. These chain code sequences are compared with pre-trained models using DP matching. Our hand gesture recognition system can be used to control PowerPoint slides or be applied to multimedia education systems. We obtained 92% hand region extraction accuracy and 82.5% gesture recognition accuracy, respectively.
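
A minimal sketch of the 16-directional chain coding and DP matching described above; the substitution cost and the edit-distance-style recursion are illustrative assumptions rather than the authors' exact formulation.

```python
import numpy as np

def chain_code(points, directions=16):
    """Quantize a tracked hand-center trajectory into a 16-directional chain code."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points[:-1], points[1:]):
        angle = np.arctan2(y1 - y0, x1 - x0) % (2 * np.pi)
        codes.append(int(round(angle / (2 * np.pi / directions))) % directions)
    return codes

def dp_distance(seq, model, directions=16):
    """DP matching between an observed chain code and a pre-trained model
    sequence; substitution cost is the circular difference between codes."""
    n, m = len(seq), len(model)
    d = np.zeros((n + 1, m + 1))
    d[:, 0] = np.arange(n + 1)
    d[0, :] = np.arange(m + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diff = abs(seq[i - 1] - model[j - 1])
            sub = min(diff, directions - diff) / (directions / 2)  # 0..1
            d[i, j] = min(d[i - 1, j] + 1,        # deletion
                          d[i, j - 1] + 1,        # insertion
                          d[i - 1, j - 1] + sub)  # substitution
    return d[n, m]

# The gesture model with the smallest DP distance to the observed
# chain code is taken as the recognition result.
```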

Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Nataliia, Konkina
    • International Journal of Computer Science & Network Security / v.22 no.1 / pp.129-138 / 2022
  • With the increasing reliance on computing systems in our everyday life, there is a constant need to improve the ways users can interact with such systems in a more natural, effective, and convenient way. In the initial computing revolution, interaction between humans and machines was limited; the machines were not necessarily meant to be intelligent. This created the need to develop systems that can automatically identify and interpret our actions. Automatic gesture recognition is one of the popular methods by which users can control systems with their gestures, and it covers various kinds of tracking, including the whole body, hands, head, and face. We also touch upon a different line of work, including Brain-Computer Interfaces (BCI) and Electromyography (EMG), as potential additions to the gesture recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.