• Title/Summary/Keyword: Hand Gesture

404 search results (processing time: 0.033 seconds)

User-Defined Hand Gestures for Small Cylindrical Displays (소형 원통형 디스플레이를 위한 사용자 정의 핸드 제스처)

  • Kim, Hyoyoung;Kim, Heesun;Lee, Dongeon;Park, Ji-hyung
    • The Journal of the Korea Contents Association / v.17 no.3 / pp.74-87 / 2017
  • This paper aims to elicit user-defined hand gestures for small cylindrical displays built with flexible displays, a form factor that has not yet appeared as a product. We first defined the size and functions of a small cylindrical display and derived the tasks for operating those functions. We then built an experiment environment close to real cylindrical-display usage by developing both a virtual cylindrical display interface and a physical object for operating it. We showed participants the result of each task on the virtual cylindrical display so that they could define the hand gestures they considered suitable for that task. We selected a representative gesture for each task by choosing the largest group of identical gestures, and we also calculated an agreement score for each task. Finally, based on analysis of the gestures and participant interviews, we characterized the mental models the participants applied when defining the gestures.
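
The agreement score mentioned above is typically computed per task from the sizes of the groups of identical proposed gestures; a minimal sketch in Python, assuming the common formulation in which each group contributes the square of its share of all proposals (the paper's exact formula is not given here):

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one task: sum over identical-gesture
    groups of (group size / total proposals) squared."""
    total = len(proposals)
    return sum((c / total) ** 2 for c in Counter(proposals).values())

# Hypothetical example: 10 participants propose gestures for one task.
proposals = ["twist"] * 6 + ["swipe"] * 3 + ["tap"]
score = agreement_score(proposals)              # 0.36 + 0.09 + 0.01 = 0.46
representative = Counter(proposals).most_common(1)[0][0]  # largest group: "twist"
```

The representative gesture for a task is then simply the gesture of the largest group, as the abstract describes.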

Intelligent interface using hand gestures recognition based on artificial intelligence (인공지능 기반 손 체스처 인식 정보를 활용한 지능형 인터페이스)

  • Hangjun Cho;Junwoo Yoo;Eun Soo Kim;Young Jae Lee
    • Journal of Platform Technology / v.11 no.1 / pp.38-51 / 2023
  • We propose an intelligent interface algorithm that uses AI-based hand gesture recognition. The interface tracks and recognizes user hand gestures quickly and intelligently by combining MediaPipe with artificial-intelligence techniques such as KNN, LSTM, and CNN. To evaluate the algorithm's performance, we applied it to a self-made 2D top-view racing game and to robot control. In the game, the algorithm allowed detailed and robust control of the virtual object's various movements; applied to a robot in the real world, it controlled movement, stopping, left turns, and right turns. In addition, by controlling the game's main character and the real-world robot at the same time, the optimized motions served as an intelligent interface for a space where the virtual and real worlds coexist. Because the algorithm exploits natural, intuitive body movements and fine finger motion, it enables sophisticated control and can be mastered in a short time, so it can serve as basic data for developing intelligent user interfaces.
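
A landmark-based KNN classifier of the kind the abstract mentions can be sketched as follows; the training data, the 2-D feature layout (standing in for flattened MediaPipe hand landmarks), and `k` are illustrative assumptions, not the paper's actual pipeline (which also uses LSTM and CNN models):

```python
import numpy as np

def knn_classify(train_X, train_y, sample, k=3):
    """Majority vote among the k nearest training vectors
    (Euclidean distance over flattened landmark coordinates)."""
    dists = np.linalg.norm(train_X - sample, axis=1)
    nearest = np.argsort(dists)[:k]
    labels = [train_y[i] for i in nearest]
    return max(set(labels), key=labels.count)

# Toy training set: two gesture classes in a 2-D feature space.
train_X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                    [1.0, 1.0], [0.9, 1.0], [1.0, 0.9]])
train_y = ["fist", "fist", "fist", "palm", "palm", "palm"]

gesture = knn_classify(train_X, train_y, np.array([0.05, 0.05]))  # "fist"
```

In practice the feature vectors would come from MediaPipe's per-frame hand landmarks rather than hand-written coordinates.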


Interface of Interactive Contents using Vision-based Body Gesture Recognition (비전 기반 신체 제스처 인식을 이용한 상호작용 콘텐츠 인터페이스)

  • Park, Jae Wan;Song, Dae Hyun;Lee, Chil Woo
    • Smart Media Journal / v.1 no.2 / pp.40-46 / 2012
  • In this paper, we describe interactive content that uses vision-based body gesture recognition as its input interface. Because the content takes the imp, a figure common to Asian folk culture, as its subject, players can enjoy it with cultural familiarity; and since players fight the imp with their own gestures, they are naturally absorbed in the game. Users can also choose among multiple endings at the end of the scenario. For gesture recognition, KINECT is used to obtain the three-dimensional coordinates of each limb joint and capture the static poses that make up an action. Vision-based 3D human pose recognition is a standard method for conveying human gestures in HCI (Human-Computer Interaction): 2D pose-model methods recognize only simple poses in constrained environments, whereas a 3D pose model, which describes the 3D skeletal structure of the human body, can exploit joint angles and the shape of body parts to recognize more complex poses. Because a gesture can be represented as a sequence of static poses, we recognize gestures composed of such poses using an HMM. With this gesture-recognition result as the input interface, users can control the content naturally using only their gestures, and the real-time interaction with the imp is intended to improve immersion and interest.
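
Recognizing a gesture as a sequence of quantized static poses with an HMM, as described above, amounts to scoring the pose-symbol sequence under each gesture's model and picking the best; a toy sketch using the forward algorithm (the two-state models and the two-symbol pose alphabet are invented for illustration):

```python
import numpy as np

def sequence_likelihood(obs, start, trans, emit):
    """Forward algorithm: likelihood of an observation sequence
    (quantized pose symbols) under one gesture's HMM."""
    alpha = start * emit[:, obs[0]]          # initialize with first symbol
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]  # propagate, then weight by emission
    return float(alpha.sum())

start = np.array([1.0, 0.0])
trans = np.array([[0.8, 0.2], [0.2, 0.8]])
emit_wave = np.array([[0.9, 0.1], [0.1, 0.9]])  # "wave" gesture model
emit_push = np.array([[0.1, 0.9], [0.9, 0.1]])  # "push" gesture model

obs = [0, 0, 1, 1]  # observed pose symbols over four frames
best = max(["wave", "push"],
           key=lambda g: sequence_likelihood(
               obs, start, trans, emit_wave if g == "wave" else emit_push))
```

A real system would train one HMM per gesture class and quantize KINECT joint poses into the symbol alphabet.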


Augmented Reality Game Interface Using Hand Gestures Tracking (사용자 손동작 추적에 기반한 증강현실 게임 인터페이스)

  • Yoon, Jong-Hyun;Park, Jong-Seung
    • Journal of Korea Game Society / v.6 no.2 / pp.3-12 / 2006
  • Recently, many 3D augmented reality games offering stronger immersion have appeared. In this article, we describe a barehanded interaction method based on human hand gestures for augmented reality games. First, feature points are extracted from the input video stream; the points are tracked and the motion of moving objects is computed. The shape of a motion trajectory determines whether the motion is an intended gesture: a long, smooth trajectory toward one of the virtual objects or menus is classified as intended, and the corresponding action is invoked. To validate the proposed method, we implemented two simple augmented reality applications: a gesture-based music player and a virtual basketball game. In the music player, several menu icons are displayed at the top of the screen and the user activates a menu with hand gestures. In the basketball game, a virtual ball bounces in a virtual cube space with the real video stream shown in the background, and the user hits the ball with hand gestures. In experiments with three untrained users, the accuracy of menu activation according to the intended gesture was 94% for normal-speed gestures and 84% for fast, abrupt gestures.
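
The trajectory test described above, where a long, smooth trajectory toward a target counts as an intended gesture, can be approximated by comparing net displacement to total path length; a sketch under assumed thresholds (the paper's actual criteria are not specified here):

```python
import math

def is_intended_gesture(traj, min_length=50.0, straightness=0.9):
    """A trajectory counts as an intended gesture when it is long
    enough and nearly straight: the ratio of net displacement to
    total path length is close to 1 for a smooth, direct motion."""
    path = sum(math.dist(a, b) for a, b in zip(traj, traj[1:]))
    if path < min_length:
        return False                      # too short to be deliberate
    net = math.dist(traj[0], traj[-1])
    return net / path >= straightness     # jittery motion scores low

straight = [(i * 10.0, 0.0) for i in range(11)]          # direct sweep
zigzag = [(i * 10.0, 10.0 * (i % 2)) for i in range(9)]  # jittery motion
```

A full system would additionally check that the trajectory endpoint lies on a virtual object or menu icon before invoking its action.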


Three-Dimensional Direction Code Patterns for Hand Gesture Recognition (손동작인식을 위한 3차원 방향 코드 패턴)

  • Park, Jung-Hoo;Kim, Young-Ju
    • Proceedings of the Korean Society of Computer Information Conference / 2013.07a / pp.21-22 / 2013
  • This paper proposes a method of detecting feature patterns in which the feature values needed for gesture recognition are implemented as three-dimensional direction codes. Straight lines are formed between the detected data coordinates, and feature inflection points are extracted using the sum of the angles between those lines. Lines are then generated between the extracted inflection points and mapped to 24-direction codes that merge the 8-direction code with depth values. The mapped direction codes are composed into a single pattern. To remove directional noise unnecessary for recognition, rule-based filtering is applied to the generated pattern to extract a filtered pattern. Compared with the '8-direction pattern using banner codes', the proposed method was confirmed to extract a more effective pattern.
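
The 24-direction code described above (8 planar directions merged with depth) can be sketched as follows; the exact angle quantization, depth threshold, and code layout are assumptions for illustration:

```python
import math

def direction_code_24(p0, p1, depth_threshold=5.0):
    """Map a 3D segment to a 24-direction code: one of 8 planar
    directions combined with one of 3 depth levels (a hypothetical
    encoding of the paper's 8-direction + depth merge)."""
    dx, dy, dz = (b - a for a, b in zip(p0, p1))
    angle = math.atan2(dy, dx) % (2 * math.pi)
    planar = int((angle + math.pi / 8) // (math.pi / 4)) % 8  # 45-degree bins
    if dz > depth_threshold:
        depth = 1          # moving away from the camera
    elif dz < -depth_threshold:
        depth = 2          # moving toward the camera
    else:
        depth = 0          # roughly constant depth
    return depth * 8 + planar

# A gesture pattern is then the sequence of codes between inflection points.
codes = [direction_code_24(a, b) for a, b in
         [((0, 0, 0), (10, 0, 0)), ((10, 0, 0), (10, 10, 20))]]
```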


Proposal of Image Noise Improvement Algorithm for Implementing Hand Gestures

  • Moon, Yu-Sung;Choi, Ung-Se;Kim, Jung-Won
    • Journal of IKEEE / v.23 no.4 / pp.1465-1468 / 2019
  • The image-noise improvement algorithm proposed in this paper extracts boundary lines using a window over the binarized image in order to detect gesture motion. It prevents boundary-line blurring by reducing the Gaussian noise generated during video output. To improve gesture recognition in low-light environments, the algorithm is designed to produce an output image close to the base image. Analysis of the experimental results shows an improvement of almost 10% over the existing Median filter.
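
The Median filter used as the comparison baseline replaces each pixel with the median of its neighborhood; a minimal 3x3 version in Python (the paper's own windowed boundary-extraction algorithm is not reproduced here):

```python
def median_filter3(img):
    """3x3 median filter over a 2D grayscale image (list of lists),
    the baseline the paper's algorithm is compared against.
    Border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]   # middle of 9 sorted values
    return out

# A single impulse-noise pixel is removed by the median.
noisy = [[0, 0, 0], [0, 255, 0], [0, 0, 0]]
clean = median_filter3(noisy)
```

Median filtering removes salt-and-pepper impulses well but tends to blur fine boundaries, which is the weakness the proposed algorithm targets.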

A Design and Implementation of Gesture Recognition System (제스쳐 인식 시스템 설계 및 구현)

  • Kim, Kun-Woo;Kim, Chang-Hyun;Jeon, Chang-Ho;Lee, Won-Joo
    • Proceedings of the Korean Society of Computer Information Conference / 2008.06a / pp.231-235 / 2008
  • As the performance of computers and peripherals has advanced, interest in image processing has grown, and research on extracting desired information from images is being actively pursued. In such research, the processes that extract information, such as motion tracking, extraction of specific objects, and video retrieval, demand substantial system resources, which makes multitasking difficult. This paper therefore designs and implements a gesture recognition system that minimizes system resource usage. The system removes noise using dynamic table masking and recognizes hand gestures by applying a guideline recognition method; it also recognizes eye and lip gestures using a facial-ratio segmentation method and a shading measurement method.


Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

  • Kim, Jinwoo;Ryu, Jae Hong;Han, Tae Man
    • ETRI Journal / v.37 no.4 / pp.793-803 / 2015
  • We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
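
The "four directions and one selection motion" manipulation model above implies that every modality (speech, manipulating device, hand gesture) ultimately emits one of five commands; a hypothetical sketch of such a mapping layer (all event names are invented for illustration):

```python
from enum import Enum

class Command(Enum):
    UP, DOWN, LEFT, RIGHT, SELECT = range(5)

# Hypothetical per-modality event tables: each modality is reduced
# to the same five commands, so the UI framework handles only
# four directions plus one selection motion.
SPEECH_MAP = {"up": Command.UP, "down": Command.DOWN,
              "left": Command.LEFT, "right": Command.RIGHT,
              "ok": Command.SELECT}
GESTURE_MAP = {"swipe_up": Command.UP, "swipe_down": Command.DOWN,
               "swipe_left": Command.LEFT, "swipe_right": Command.RIGHT,
               "grab": Command.SELECT}

def to_command(modality, event):
    """Normalize a modality-specific event into one of the five
    framework commands; returns None for unrecognized events."""
    table = SPEECH_MAP if modality == "speech" else GESTURE_MAP
    return table.get(event)
```

Collapsing all modalities onto one small command set is what lets the driver switch input methods without learning a new interaction model.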

On-line Korean Sign Language (KSL) Recognition Using Fuzzy Min-Max Neural Network and Feature Analysis

  • Bien, Zeungnam;Kim, Jong-Sung
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 1995.10b / pp.85-91 / 1995
  • This paper presents a system that recognizes Korean Sign Language (KSL) and translates it into spoken Korean. A sign language is a method of communication for the deaf that uses gestures, especially of the hands and fingers. Since human hands and fingers differ in physical dimensions, the same gesture produced by two signers may not yield the same numerical values when captured through electronic sensors. In this paper, we propose a dynamic gesture recognition method based on feature analysis, for efficient classification of hand motions, and on a fuzzy min-max neural network for on-line pattern recognition.
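
A fuzzy min-max network classifies a sample by its membership in labeled hyperboxes, which tolerates exactly the signer-to-signer variation the abstract describes; a sketch of the standard Simpson-style membership function (the paper's exact variant and sensitivity parameter are assumptions):

```python
def hyperbox_membership(x, v, w, gamma=4.0):
    """Fuzzy min-max membership of sample x in the hyperbox with
    min point v and max point w: 1.0 inside the box, decreasing
    with distance outside it (gamma controls the falloff)."""
    def ramp(d):
        return max(0.0, min(1.0, d * gamma))
    n = len(x)
    return sum(1.0 - ramp(xi - wi) - ramp(vi - xi)
               for xi, vi, wi in zip(x, v, w)) / n

# Classification picks the class whose hyperbox gives the highest
# membership; here a sensor reading slightly outside the box still
# scores well, absorbing variation between signers' hand sizes.
inside = hyperbox_membership([0.5], [0.4], [0.6])    # 1.0
nearby = hyperbox_membership([0.7], [0.4], [0.6])    # below 1.0, above 0
```

Learning in the full algorithm expands or contracts hyperboxes as labeled samples arrive, which is what makes on-line recognition possible.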


Emergency Signal Detection based on Arm Gesture by Motion Vector Tracking in Face Area

  • Fayyaz, Rabia;Park, Dae Jun;Rhee, Eun Joo
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.12 no.1 / pp.22-28 / 2019
  • This paper presents a method for detecting an emergency signal expressed by arm gestures, based on motion segmentation and face-area detection in a surveillance system. Arm gestures and voice are important indicators of an emergency; we define the emergency signal as 'Help Me' arm gestures performed within a rectangle around the face. The 'Help Me' gestures are detected by tracking changes in the direction of the horizontal motion vectors of the left and right arms. Experimental results show that the proposed method successfully detects the 'Help Me' emergency signal for a single person and distinguishes it from similar arm gestures such as waving 'Bye' and stretching. The proposed method can be used effectively in situations where people cannot speak or have a speech or voice disability.
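
Tracking changes in the direction of the horizontal motion vectors, as described above, reduces to counting sign reversals in an arm's horizontal displacement over time; a minimal sketch with assumed thresholds (the paper's actual detection rules are not reproduced here):

```python
def count_direction_changes(dx_values, threshold=1.0):
    """Count sign reversals in a sequence of horizontal motion
    vectors; repeated reversals suggest a waving 'Help Me' arm."""
    changes, prev = 0, 0
    for dx in dx_values:
        if abs(dx) < threshold:
            continue                  # ignore near-zero motion
        sign = 1 if dx > 0 else -1
        if prev and sign != prev:
            changes += 1
        prev = sign
    return changes

def is_help_signal(dx_values, min_changes=4):
    """Enough back-and-forth reversals within the face rectangle
    is treated as the emergency signal."""
    return count_direction_changes(dx_values) >= min_changes
```

A one-way stretch or a brief 'Bye' wave produces few reversals, which is how the method separates those gestures from the sustained 'Help Me' motion.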