• Title/Summary/Keyword: One-hand gesture

Search Result 71

Design and Evaluation of a Hand-held Device for Recognizing Mid-air Hand Gestures (공중 손동작 인식을 위한 핸드 헬드형 기기의 설계 및 평가)

  • Seo, Kyeongeun;Cho, Hyeonjoong
    • KIPS Transactions on Software and Data Engineering / v.4 no.2 / pp.91-96 / 2015
  • We propose AirPincher, a handheld pointing device for recognizing delicate mid-air hand gestures to control a remote display. AirPincher is designed to overcome the disadvantages of the two existing kinds of hand-gesture-aware techniques, glove-based and vision-based: glove-based techniques require the cumbersome step of wearing gloves every time, and vision-based techniques make performance dependent on the distance between the user and the remote display. AirPincher lets a user hold the device in one hand and generate several delicate finger gestures, which are captured by sensors embedded in the device at close range. These features allow AirPincher to avoid the aforementioned disadvantages of the existing techniques. We experimentally determine an efficient size for the virtual input space and evaluate two types of pointing interfaces with AirPincher on a remote display. Our experiments suggest appropriate configurations for using the proposed device.

A Study on Hand-signal Recognition System in 3-Dimensional Space (3차원 공간상의 수신호 인식 시스템에 대한 연구)

  • 장효영;김대진;김정배;변증남
    • Proceedings of the IEEK Conference / 2002.06c / pp.215-218 / 2002
  • Gesture recognition is needed for various applications and is gaining importance as a way to enable natural, intuitive human-machine communication. In this paper, we propose a real-time hand-signal recognition system in 3-dimensional space that performs robust, real-time tracking under varying illumination. Compared with existing methods based on classical pattern matching, the system is efficient in terms of speed and also offers a more systematic way of defining hand signals and developing a hand-signal recognition system. To verify the proposed method, we developed a virtual driving system operated by hand signals.

  • PDF

An Application of AdaBoost Learning Algorithm and Kalman Filter to Hand Detection and Tracking (AdaBoost 학습 알고리즘과 칼만 필터를 이용한 손 영역 탐지 및 추적)

  • Kim, Byeong-Man;Kim, Jun-Woo;Lee, Kwang-Ho
    • Journal of the Korea Society of Computer and Information / v.10 no.4 s.36 / pp.47-56 / 2005
  • With the development of wearable (ubiquitous) computers, traditional human-computer interfaces are gradually becoming uncomfortable to use, which leads directly to a need for new ones. In this paper, we study an interface in which a computer recognizes human hand gestures through a digital camera. Because camera-based hand gesture recognition is affected by the surrounding environment, such as lighting, the detector must be robust to such variations. Recently, Viola's detector has shown favorable results in face detection; it uses the AdaBoost learning algorithm with Haar features computed from the integral image. We apply this method to hand-area detection and carry out comparative experiments against the classic skin-color method. Experimental results show that Viola's detector is more robust than skin-color detection in environments where degradation may occur due to surroundings such as lighting.

  • PDF
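The integral-image and Haar-feature machinery this abstract refers to can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation; the function names and the toy image are invented:

```python
# Core of a Viola-Jones-style detector: an integral image makes any
# rectangular pixel sum O(1), and a Haar-like feature is a difference
# of two such sums.

def integral_image(img):
    """Cumulative sum table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y][x] = row + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum of pixels in the inclusive rectangle (x0,y0)-(x1,y1)."""
    total = ii[y1][x1]
    if x0 > 0:
        total -= ii[y1][x0 - 1]
    if y0 > 0:
        total -= ii[y0 - 1][x1]
    if x0 > 0 and y0 > 0:
        total += ii[y0 - 1][x0 - 1]
    return total

def haar_two_rect(ii, x, y, w, h):
    """Two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    left = rect_sum(ii, x, y, x + half - 1, y + h - 1)
    right = rect_sum(ii, x + half, y, x + w - 1, y + h - 1)
    return left - right

# Toy 4x4 image: bright left half, dark right half.
img = [[1, 1, 0, 0],
       [1, 1, 0, 0],
       [1, 1, 0, 0],
       [1, 1, 0, 0]]
ii = integral_image(img)
print(rect_sum(ii, 0, 0, 3, 3))     # 8: total of all pixels
print(haar_two_rect(ii, 0, 0, 4, 4))  # 8: strong left/right contrast
```

AdaBoost then selects and weights thousands of such features; the sums themselves stay constant-time regardless of window size, which is what makes the detector fast.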

A Gesture Interface based on Hologram and Haptics Environments for Interactive and Immersive Experiences (상호작용과 몰입 향상을 위한 홀로그램과 햅틱 환경 기반의 동작 인터페이스)

  • Pyun, Hae-Gul;An, Haeng-A;Yuk, Seongmin;Park, Jinho
    • Journal of Korea Game Society / v.15 no.1 / pp.27-34 / 2015
  • This paper proposes a user interface that enhances immersiveness and usability by combining a hologram and a haptic device with the common Leap Motion. While the Leap Motion delivers the physical motion of the user's hand to control the virtual environment, it is limited to manipulating virtual hands on a screen, interacting with the virtual environment in only one direction. In our system, a hologram is coupled with the Leap Motion to improve immersiveness by placing the real and virtual hands in the same location. Moreover, we provide a prototype for tactile interaction by designing a haptic device that conveys touch sensations from the virtual environment to the user's hand.

Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

  • Kim, Jinwoo;Ryu, Jae Hong;Han, Tae Man
    • ETRI Journal / v.37 no.4 / pp.793-803 / 2015
  • We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
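The "four directions and one selection motion" scheme described above lends itself to a small sketch: every modality is normalized to the same five primitive commands, which then drive the UI. This is a hypothetical illustration; the mappings, class names, and grid menu are invented, not taken from the paper:

```python
# Hypothetical sketch: speech, gesture, and device events are each
# normalized to five primitives (UP/DOWN/LEFT/RIGHT/SELECT), so the HMI
# UI framework only ever handles this small command set.

SPEECH_MAP = {"next": "RIGHT", "previous": "LEFT", "ok": "SELECT"}
GESTURE_MAP = {"swipe_up": "UP", "swipe_down": "DOWN", "pinch": "SELECT"}

def normalize(modality, event):
    """Translate a raw modality event into one of the five primitives."""
    table = {"speech": SPEECH_MAP, "gesture": GESTURE_MAP}.get(modality, {})
    return table.get(event)  # None if the event maps to no command

class MenuGrid:
    """Minimal UI model driven only by the five primitives."""
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.row = self.col = 0
        self.selected = None

    def handle(self, command):
        if command == "UP":
            self.row = max(0, self.row - 1)
        elif command == "DOWN":
            self.row = min(self.rows - 1, self.row + 1)
        elif command == "LEFT":
            self.col = max(0, self.col - 1)
        elif command == "RIGHT":
            self.col = min(self.cols - 1, self.col + 1)
        elif command == "SELECT":
            self.selected = (self.row, self.col)

menu = MenuGrid(2, 3)
for modality, event in [("speech", "next"), ("gesture", "swipe_down"),
                        ("gesture", "pinch")]:
    cmd = normalize(modality, event)
    if cmd:
        menu.handle(cmd)
print(menu.selected)  # (1, 1)
```

Restricting every modality to the same five commands is what keeps a multimodal interface predictable for a distracted driver: each input channel is interchangeable.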

Presentation Control System using Gesture Recognition and Sensor (제스처 인식과 센서를 이용한 프레젠테이션 제어 시스템)

  • Chang, Moon-Soo;Kwak, Sun-Dong;Kang, Sun-Mee
    • Journal of the Korean Institute of Intelligent Systems / v.21 no.4 / pp.481-486 / 2011
  • Recently, most presentations have been given on screen using a computer. This paper proposes that the computer be controlled by gesture recognition, without the help of any assistant or tool. Using image information alone would require a high-resolution camera and a computer able to process high-resolution images; instead, this paper presents a solution that works with a low-resolution camera on stage. It uses an ultrasonic sensor to trace the presenter's location and a low-resolution camera to capture only the small area needed. Gestures are defined by the number of fingers and the hand positions, which are recognized using erosion/dilation and subtraction algorithms. In comparative tests, the proposed system improved accuracy by 13% over a system using image data alone, and the gesture recognition tests showed a 98% success rate.
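The erosion/dilation-and-subtraction step mentioned in the abstract can be illustrated on a toy binary mask: morphological opening (erode, then dilate) keeps the thick palm, and subtracting it from the original leaves only the thin fingers to count. A pure-Python sketch; the 3x3 structuring element, toy mask, and counting rule are illustrative assumptions:

```python
# Binary morphology with a 3x3 structuring element.

def erode(mask):
    """A pixel survives only if its entire 3x3 neighborhood is set."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = int(all(mask[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)))
    return out

def dilate(mask):
    """A pixel is set if any pixel in its 3x3 neighborhood is set."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(any(mask[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                                if 0 <= y + dy < h and 0 <= x + dx < w))
    return out

def subtract(a, b):
    """Pixelwise a AND NOT b."""
    return [[int(av and not bv) for av, bv in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def count_runs(row):
    """Number of contiguous runs of 1s in a row (finger cross-sections)."""
    runs, prev = 0, 0
    for v in row:
        if v and not prev:
            runs += 1
        prev = v
    return runs

# Toy hand mask: a 4x8 palm block with two one-pixel-wide fingers.
mask = [[0] * 10 for _ in range(8)]
for y in range(4):
    mask[y][2] = mask[y][6] = 1          # fingers
for y in range(4, 8):
    for x in range(1, 9):
        mask[y][x] = 1                   # palm

# Opening keeps the palm; subtraction leaves only the fingers.
fingers = subtract(mask, dilate(erode(mask)))
print(count_runs(fingers[1]))  # 2 fingers
```

The thin fingers vanish under erosion while the palm survives, which is exactly why opening separates the two.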

A Study on the Interaction with Virtual Objects through XR Hands (XR Hands를 통한 가상 객체들과의 상호 작용에 관한 연구)

  • BeomJun Jo;SeongKi Kim
    • Journal of the Korea Computer Graphics Society / v.30 no.3 / pp.43-49 / 2024
  • With the release of extended reality (XR) devices in which hand tracking is the main means of manipulation, hand tracking is currently one of the most promising technologies in XR. It offers advantages in immersion and realism and is therefore employed in a range of fields, including education, business, and medical care. Archery requires using both hands simultaneously, demands precision to hit the target, and has long held cultural and sporting significance. This study aimed to implement the archery motion. This paper therefore used the XR Hands package provided by Unity to recognize hand movements, explored the underlying OpenXR, and finally implemented the archery motion and tested it on a Meta Quest 2.

Augmented Reality Game Interface Using Hand Gestures Tracking (사용자 손동작 추적에 기반한 증강현실 게임 인터페이스)

  • Yoon, Jong-Hyun;Park, Jong-Seung
    • Journal of Korea Game Society / v.6 no.2 / pp.3-12 / 2006
  • Recently, many 3D augmented reality games that provide a stronger sense of immersion have appeared in the 3D game environment. In this article, we describe a barehanded interaction method based on human hand gestures for augmented reality games. First, feature points are extracted from the input video streams. The point features are tracked, and the motion of moving objects is computed. The shape of a motion trajectory is used to determine whether the motion is an intended gesture: a long, smooth trajectory toward one of the virtual objects or menus is classified as an intended gesture, and the corresponding action is invoked. To prove the validity of the proposed method, we implemented two simple augmented reality applications: a gesture-based music player and a virtual basketball game. In the music player, several menu icons are displayed at the top of the screen, and a user can activate a menu by hand gestures. In the virtual basketball game, a virtual ball bounces in a virtual cube space with the real video stream shown in the background, and a user can hit the virtual ball with hand gestures. Experiments with three untrained users show that the accuracy of menu activation by intended gestures is 94% for normal-speed gestures and 84% for fast, abrupt gestures.

  • PDF
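The "long smooth trajectory toward a virtual object" test described above can be sketched as a small classifier: accept a motion only if its path is long enough, its direction changes stay small, and it points at a target. The thresholds and helper names below are invented assumptions, not values from the paper:

```python
import math

def trajectory_gesture(points, targets,
                       min_length=5.0, max_turn=math.pi / 6):
    """Return the index of the target the trajectory points at, or None."""
    if len(points) < 3:
        return None
    # Total path length: reject short, accidental motions.
    length = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    if length < min_length:
        return None

    def heading(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    # Smoothness: every turn between consecutive segments must be small.
    headings = [heading(a, b) for a, b in zip(points, points[1:])]
    for h0, h1 in zip(headings, headings[1:]):
        turn = abs((h1 - h0 + math.pi) % (2 * math.pi) - math.pi)
        if turn > max_turn:
            return None

    # Direction: pick the target best aligned with the overall motion.
    overall = heading(points[0], points[-1])
    best, best_err = None, max_turn
    for i, t in enumerate(targets):
        err = abs((heading(points[-1], t) - overall + math.pi)
                  % (2 * math.pi) - math.pi)
        if err < best_err:
            best, best_err = i, err
    return best

targets = [(10.0, 0.0), (0.0, 10.0)]
straight = [(0.0, 0.0), (2.0, 0.1), (4.0, 0.0), (6.0, 0.1)]
print(trajectory_gesture(straight, targets))  # 0: aims at the first target
```

A zig-zag path fails the smoothness check and a tiny twitch fails the length check, so only deliberate sweeps toward a menu or object fire an action.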

Interactive VR film Storytelling in isolated space

  • Kim, Tae-Eun
    • International Journal of Advanced Smart Convergence / v.9 no.1 / pp.163-171 / 2020
  • There are many differences in narrative delivery between conventional movies and virtual reality (VR) films because of their different structures of appreciation. In VR films, scene changes by cuts have hindered audience immersion instead of supporting narrative delivery, and a range of experiments on narrative and immersion have tried to solve this issue. Floating Tent applies hand gestures and immersive effects found in game elements, and it avoids disturbing narrative delivery by setting appropriate spaces and employing a direction technique that melds the narrative into the characteristics of those spaces. Time limits apply to off-screen sound and mission performance, and devices fitting the apocalyptic spatial expression are realized through the program. Effective immersion is one response to the growing use of interactive storytelling in VR films. In narrative delivery, it is important to design spatial settings and immersion so that the audience can actively intervene in events, rather than remaining passive viewers who only watch the characters act.

Hierarchical Hand Pose Model for Hand Expression Recognition (손 표현 인식을 위한 계층적 손 자세 모델)

  • Heo, Gyeongyong;Song, Bok Deuk;Kim, Ji-Hong
    • Journal of the Korea Institute of Information and Communication Engineering / v.25 no.10 / pp.1323-1329 / 2021
  • For hand expression recognition, hand pose recognition based on the static shape of the hand and hand gesture recognition based on dynamic hand movement are used together. In this paper, we propose a hierarchical hand pose model based on finger position and shape for hand expression recognition. For hand pose recognition, a finger model representing the finger state and a hand pose model using the finger states are constructed hierarchically on top of the open-source MediaPipe. The finger model is itself hierarchical, built from the bending of a single finger and the touch of two fingers. The proposed model can be used for various applications that transmit information through the hands, and its usefulness was verified by applying it to number recognition in sign language. Beyond sign language recognition, the model is expected to find various applications in computer user interfaces.
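The finger-state / fingertip-touch / hand-pose hierarchy described in the abstract can be sketched over MediaPipe-style landmarks. The 21-point indices follow MediaPipe's hand model, but the thresholds, sample coordinates, and function names below are invented for illustration:

```python
import math

WRIST = 0
FINGERS = {            # (PIP-like joint, tip) per finger, MediaPipe numbering
    "thumb": (2, 4),
    "index": (6, 8),
    "middle": (10, 12),
    "ring": (14, 16),
    "pinky": (18, 20),
}

def finger_state(landmarks, finger):
    """Bottom level: 'straight' if the tip is farther from the wrist
    than the mid-finger joint, else 'bent'."""
    joint, tip = FINGERS[finger]
    d_tip = math.dist(landmarks[WRIST], landmarks[tip])
    d_joint = math.dist(landmarks[WRIST], landmarks[joint])
    return "straight" if d_tip > d_joint else "bent"

def touching(landmarks, finger_a, finger_b, eps=0.05):
    """Middle level: are two fingertips close together?"""
    return math.dist(landmarks[FINGERS[finger_a][1]],
                     landmarks[FINGERS[finger_b][1]]) < eps

def hand_pose(landmarks):
    """Top level: the tuple of per-finger states defines the pose."""
    return tuple(finger_state(landmarks, f) for f in FINGERS)

# Toy landmark set: index and middle extended, the rest curled.
landmarks = {WRIST: (0.5, 0.9)}
for name, (joint, tip) in FINGERS.items():
    extended = name in ("index", "middle")
    landmarks[joint] = (0.5, 0.6)
    landmarks[tip] = (0.5, 0.2) if extended else (0.5, 0.7)

straight_count = hand_pose(landmarks).count("straight")
print(straight_count)  # 2: a sign-language-style "two"
```

Stacking the levels this way means each layer stays simple: number recognition, for instance, reduces to counting "straight" entries in the pose tuple.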