• Title/Summary/Keyword: Hand Gesture Interface


Alphabetical Gesture Recognition using HMM (HMM을 이용한 알파벳 제스처 인식)

  • Yoon, Ho-Sub;Soh, Jung;Min, Byung-Woo
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 1998.10c
    • /
    • pp.384-386
    • /
    • 1998
  • The use of hand gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction (HCI). Many methods for hand gesture recognition using visual analysis have been proposed, such as syntactic analysis, neural networks (NN), and hidden Markov models (HMM). In our research, an HMM-based approach is proposed for alphabetical hand gesture recognition. In the preprocessing stage, the proposed approach consists of three procedures: hand localization, hand tracking, and gesture spotting. The hand localization procedure detects candidate regions on the basis of skin color and motion in an image, using color histogram matching and time-varying edge difference techniques. The hand tracking algorithm finds the centroid of the moving hand region in each frame and connects those centroids to produce a trajectory. For gesture spotting and the feature database, the proposed approach uses mesh feature codes as the codebook of the HMM. In our experiments, 1300 alphabetical gestures and 1300 untrained gestures are used for training and testing, respectively. The experimental results demonstrate that the proposed approach yields a high, satisfactory recognition rate for images with different sizes, shapes, and skew angles.
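The likelihood scoring at the heart of such an HMM classifier can be sketched as follows. This is an illustrative discrete-HMM sketch under assumed toy parameters, not the paper's trained model; the observation symbols stand in for quantized mesh feature codes.

```python
import math

# Sketch: each gesture class has its own discrete HMM (pi, A, B); a spotted
# trajectory is quantized into symbols and the class whose model assigns the
# highest forward probability wins.

def forward_log_prob(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete HMM.

    pi: initial state distribution, A: state transition matrix,
    B: per-state emission probabilities over discrete symbols.
    """
    n = len(pi)
    # Initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction: alpha_t(i) = (sum_j alpha_{t-1}(j) * a_ji) * b_i(o_t)
    for t in range(1, len(obs)):
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
    total = sum(alpha)
    return math.log(total) if total > 0 else float("-inf")

def classify(obs, models):
    """Pick the gesture whose HMM gives the sequence the highest likelihood.

    models: dict mapping gesture label -> (pi, A, B).
    """
    return max(models, key=lambda g: forward_log_prob(obs, *models[g]))
```

In practice one model per alphabet letter would be trained (e.g., with Baum-Welch) and the spotted, quantized trajectory scored against all of them.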


Design and Implementation of a Stereoscopic Image Control System based on User Hand Gesture Recognition (사용자 손 제스처 인식 기반 입체 영상 제어 시스템 설계 및 구현)

  • Song, Bok Deuk;Lee, Seung-Hwan;Choi, HongKyw;Kim, Sung-Hoon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.26 no.3
    • /
    • pp.396-402
    • /
    • 2022
  • User interactions are being developed in various forms, and interactions using human gestures in particular are being actively studied. Among them, hand gesture recognition is used as a human interface in the field of realistic media based on the 3D hand model. Interfaces based on hand gesture recognition help users access media more easily and conveniently. User interaction based on hand gesture recognition should allow images to be viewed by applying fast and accurate recognition technology without restrictions on the computing environment. This paper develops a fast and accurate user hand gesture recognition algorithm using the open-source MediaPipe framework and the k-nearest neighbor (k-NN) machine learning algorithm. In addition, to minimize restrictions imposed by the computing environment, a stereoscopic image control system based on user hand gesture recognition was designed and implemented using a web service environment capable of Internet service and a Docker container as a virtualized environment.
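The k-NN classification step in such a pipeline can be sketched in a few lines. This assumes the hand landmarks have already been extracted (e.g., by MediaPipe Hands, which yields 21 landmarks per hand) and flattened into feature vectors; that feature representation is an assumption for illustration, not the paper's exact pipeline.

```python
import math
from collections import Counter

def knn_classify(sample, training_set, k=3):
    """Classify a landmark feature vector by majority vote of its k nearest
    training examples.

    training_set: list of (feature_vector, label) pairs.
    """
    # Sort training examples by Euclidean distance to the query sample.
    nearest = sorted(training_set,
                     key=lambda pair: math.dist(sample, pair[0]))[:k]
    # Majority vote among the k closest labels.
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

Because k-NN needs no training phase beyond storing labeled examples, new gestures can be added by appending vectors to the training set, which suits a lightweight web-deployed recognizer.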

Gesture Recognition based on Mixture-of-Experts for Wearable User Interface of Immersive Virtual Reality (몰입형 가상현실의 착용식 사용자 인터페이스를 위한 Mixture-of-Experts 기반 제스처 인식)

  • Yoon, Jong-Won;Min, Jun-Ki;Cho, Sung-Bae
    • Journal of the HCI Society of Korea
    • /
    • v.6 no.1
    • /
    • pp.1-8
    • /
    • 2011
  • As virtual reality has become a focus for providing immersive services, user interfaces for immersive interaction have been actively investigated. In this paper, we propose a gesture-recognition-based immersive user interface using an IR-LED-embedded helmet and data gloves, in order to reflect the user's movements in the virtual reality environment effectively. The system recognizes the user's head movements with the IR-LED-embedded helmet and an IR signal transmitter, and hand gestures with the data gathered from the data gloves. Hand gestures are difficult to recognize accurately with a single general recognition model: human hands have many articulations, so gestures vary widely, and users differ in hand size and hand movement. In this paper, we applied Mixture-of-Experts-based gesture recognition to handle the various hand gestures of multiple users accurately. The movement of the user's head changes the perspective in the virtual environment to match the movement in the real world, and the user's hand gestures can be used as inputs to the virtual environment. A head-mounted display (HMD) can be used with the proposed system to immerse the user in the virtual environment. To evaluate the usefulness of the proposed interface, we developed an interface for a virtual orchestra environment. The experiment verified that users can use the system easily and intuitively while being entertained.
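The mixture-of-experts combination can be illustrated with a minimal sketch: each expert produces class scores, and a gating function weights the experts per input so that different experts can specialize in different users or gesture styles. The gate and experts below are hypothetical placeholders; in the paper they would be trained models.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of raw scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_predict(x, experts, gate):
    """Combine expert opinions weighted by the gating network.

    experts: list of functions x -> {class_label: score}
    gate:    function x -> list of raw gating scores, one per expert
    """
    weights = softmax(gate(x))
    combined = {}
    for w, expert in zip(weights, experts):
        for cls, score in expert(x).items():
            combined[cls] = combined.get(cls, 0.0) + w * score
    # Return the class with the highest gate-weighted score.
    return max(combined, key=combined.get)
```

The gating step is what lets the system cover users with different hand sizes and movement habits: inputs from one regime are routed to the expert that handles that regime best.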


Implementing Leap-Motion-Based Interface for Enhancing the Realism of Shooter Games (슈팅 게임의 현실감 개선을 위한 립모션 기반 인터페이스 구현)

  • Shin, Inho;Cheon, Donghun;Park, Hanhoon
    • Journal of the HCI Society of Korea
    • /
    • v.11 no.1
    • /
    • pp.5-10
    • /
    • 2016
  • This paper provides a shooter game interface that enhances the game's realism by recognizing the user's hand gestures with the Leap Motion controller. We implemented functions necessary in shooter games, such as shooting, moving, viewpoint change, and zoom in/out, and confirmed through a user test that a game interface using familiar and intuitive hand gestures is superior to the conventional mouse/keyboard in terms of ease of manipulation, interest, extensibility, and so on. Specifically, the user satisfaction score (on a 1-5 scale) was 3.02 on average with the mouse/keyboard interface and 3.57 on average with the proposed hand gesture interface.

A Study on Gesture Interface through User Experience (사용자 경험을 통한 제스처 인터페이스에 관한 연구)

  • Yoon, Ki Tae;Cho, Eel Hea;Lee, Jooyoup
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology
    • /
    • v.7 no.6
    • /
    • pp.839-849
    • /
    • 2017
  • Recently, the role of the kitchen has evolved from a space for mere survival into a space that reflects present-day life and culture. Along with these changes, the use of IoT technology is spreading, leading to the development and diffusion of new smart devices in the kitchen. The user experience of these smart devices is also becoming important. For natural interaction between a user and a computer, better interactions can be expected based on context awareness. This paper examines a Natural User Interface (NUI) that does not require touching the device, based on the user interface (UI) of a smart device used in the kitchen. In this method, image processing technology is used to recognize the user's hand gestures with a camera attached to the device, and the recognized hand shape is applied to the interface. The gestures used in this study are proposed according to the user's context and situation, and five kinds of gestures are classified and used in the interface.

MPEG-U-based Advanced User Interaction Interface Using Hand Posture Recognition

  • Han, Gukhee;Choi, Haechul
    • IEIE Transactions on Smart Processing and Computing
    • /
    • v.5 no.4
    • /
    • pp.267-273
    • /
    • 2016
  • Hand posture recognition is an important technique for enabling a natural and familiar interface in the human-computer interaction (HCI) field. This paper introduces a hand posture recognition method using a depth camera. Moreover, the method is incorporated with the Moving Picture Experts Group Rich Media User Interface (MPEG-U) Advanced User Interaction (AUI) Interface (MPEG-U Part 2), which can provide a natural interface on a variety of devices. The proposed method initially detects the positions and lengths of all extended fingers, and then recognizes the hand posture from the pose of one or two hands, as well as the number of folded fingers, when a user presents a gesture representing a pattern in the AUI data format specified in MPEG-U Part 2. The AUI interface represents a user's hand posture in the compliant MPEG-U schema structure. Experimental results demonstrate the performance of the hand posture recognition system and verify that the AUI interface is compatible with the MPEG-U standard.
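The finger-counting step can be illustrated with a minimal sketch: a finger is treated as extended when its tip lies farther from the palm center than some threshold. The coordinates and threshold here are hypothetical, and the paper's actual depth-based detection and MPEG-U AUI encoding are not reproduced.

```python
import math

def count_extended(palm, fingertips, threshold):
    """Count fingers whose tip is farther from the palm center than
    `threshold` (i.e., treated as extended rather than folded).

    palm:       (x, y) palm-center coordinate
    fingertips: list of (x, y) fingertip coordinates
    """
    return sum(1 for tip in fingertips if math.dist(palm, tip) > threshold)
```

A posture classifier would then map the extended-finger count (and which hand or hands are present) onto the pattern symbols of the AUI data format.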

A Study of Hand Gesture Recognition for Human Computer Interface (컴퓨터 인터페이스를 위한 Hand Gesture 인식에 관한 연구)

  • Chang, Ho-Jung;Baek, Han-Wook;Chung, Chin-Hyun
    • Proceedings of the KIEE Conference
    • /
    • 2000.07d
    • /
    • pp.3041-3043
    • /
    • 2000
  • The graphical user interface (GUI) has been the dominant platform for human-computer interaction (HCI). The GUI-based style of interaction has made computers simpler and easier to use. However, the GUI cannot easily support the range of interaction necessary to meet users' needs for natural, intuitive, and adaptive interfaces. In this paper we study an approach that tracks a hand in an image sequence and recognizes it in each video frame, replacing the mouse as a pointing device for virtual reality. An algorithm for real-time processing is proposed that estimates the hand position and performs segmentation, considering the orientation of motion and the color distribution of the hand region.


Implementation of DID interface using gesture recognition (제스쳐 인식을 이용한 DID 인터페이스 구현)

  • Lee, Sang-Hun;Kim, Dae-Jin;Choi, Hong-Sub
    • Journal of Digital Contents Society
    • /
    • v.13 no.3
    • /
    • pp.343-352
    • /
    • 2012
  • In this paper, we implemented a touchless interface for a DID (Digital Information Display) system using gesture recognition techniques that include both hand motion and hand shape recognition. This touchless interface, which requires no extra attachments, gives the user both easier usage and spatial convenience. For hand motion recognition, two hand-motion parameters, slope and velocity, were measured for direction-based recognition. For hand shape recognition, the hand area was extracted from the image using the YCbCr color model and several image processing methods. These recognition methods are combined to generate various commands, such as next page, previous page, screen up, screen down, and mouse click, in order to control the DID system. Finally, experimental results showed a command recognition rate of 93%, which is sufficient to confirm possible application to commercial products.
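Skin-color masking in the YCbCr space, as used for hand-area extraction, can be sketched as follows. The chrominance thresholds below are common heuristic values from the skin-detection literature, not the paper's tuned parameters.

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range RGB -> YCbCr conversion (ITU-R BT.601 coefficients)."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    """Heuristic chrominance window: skin pixels cluster in Cb/Cr regardless
    of brightness, which is why Y is ignored here."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

def skin_mask(image):
    """image: 2-D list of (r, g, b) tuples -> 2-D list of booleans."""
    return [[is_skin(*px) for px in row] for row in image]
```

Morphological filtering and connected-component analysis would typically follow, to clean the mask and isolate the largest skin region as the hand.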

Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Nataliia, Konkina
    • International Journal of Computer Science & Network Security
    • /
    • v.22 no.1
    • /
    • pp.129-138
    • /
    • 2022
  • With the increasing reliance on computing systems in our everyday life, there is a constant need to improve the ways users interact with such systems so that interaction is more natural, effective, and convenient. In the initial computing revolution, interaction between humans and machines was limited; machines were not necessarily meant to be intelligent. This created the need for systems that can automatically identify and interpret our actions. Automatic gesture recognition is one of the popular methods by which users can control systems with their gestures, and it covers various kinds of tracking, including the whole body, hands, head, and face. We also touch upon a different line of work, including brain-computer interfaces (BCI) and electromyography (EMG), as potential additions to the gesture recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.

Vision-based hand Gesture Detection and Tracking System (비전 기반의 손동작 검출 및 추적 시스템)

  • Park, Ho-Sik;Bae, Cheol-soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.12C
    • /
    • pp.1175-1180
    • /
    • 2005
  • We present a vision-based hand gesture detection and tracking system. Most conventional hand gesture recognition systems use simple hand detection methods, such as background subtraction under assumed static observation conditions, and those methods are not robust against camera motion, illumination changes, and so on. We therefore propose a statistical method that recognizes and detects hand regions in images using their geometrical structure. Our hand tracking system also employs multiple cameras to reduce occlusion problems, and non-synchronous multiple observations enhance system scalability. In this experiment, the proposed method achieved a recognition rate of 99.28%, an improvement of 3.91% over the conventional appearance-based method.