• Title/Summary/Keyword: Gesture based interface


MPEG-U-based Advanced User Interaction Interface Using Hand Posture Recognition

  • Han, Gukhee; Choi, Haechul
    • IEIE Transactions on Smart Processing and Computing, v.5 no.4, pp.267-273, 2016
  • Hand posture recognition is an important technique for enabling a natural and familiar interface in the human-computer interaction (HCI) field. This paper introduces a hand posture recognition method using a depth camera. Moreover, the method is incorporated with the Moving Picture Experts Group Rich Media User Interface (MPEG-U) Advanced User Interaction (AUI) Interface (MPEG-U part 2), which can provide a natural interface on a variety of devices. The proposed method first detects the positions and lengths of all extended fingers, and then recognizes the hand posture from the pose of one or both hands and the number of folded fingers when a user presents a gesture corresponding to a pattern in the AUI data format specified in MPEG-U part 2. The AUI interface represents the user's hand posture in the compliant MPEG-U schema structure. Experimental results demonstrate the performance of the hand posture recognition system and verify that the AUI interface is compatible with the MPEG-U standard.
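
A minimal sketch of one common way to implement the finger-detection step this abstract describes: counting extended fingers from a segmented hand silhouette via convexity defects. OpenCV is assumed, the binary hand mask is assumed to come from depth thresholding, and the thresholds are illustrative, not the authors' MPEG-U pipeline.

```python
# Counting extended fingers from a binary hand mask (assumed to be a
# uint8 silhouette already segmented from the depth image).
import cv2
import numpy as np

def count_extended_fingers(hand_mask: np.ndarray) -> int:
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    contour = max(contours, key=cv2.contourArea)
    hull_idx = cv2.convexHull(contour, returnPoints=False)
    defects = cv2.convexityDefects(contour, hull_idx)
    if defects is None:
        return 0
    valleys = 0
    for start, end, far, depth in defects[:, 0]:
        s, e, f = contour[start][0], contour[end][0], contour[far][0]
        # Small angles at deep defect points correspond to the valleys
        # between two extended fingers.
        a, b = s - f, e - f
        cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-6)
        if np.degrees(np.arccos(np.clip(cos_angle, -1, 1))) < 90 and depth > 2000:
            valleys += 1
    # n valleys between fingers -> n + 1 extended fingers (if any valley found).
    return valleys + 1 if valleys > 0 else 0
```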

Hand Gesture Sequence Recognition using Morphological Chain Code Edge Vector (형태론적 체인코드 에지벡터를 이용한 핸드 제스처 시퀀스 인식)

  • Lee Kang-Ho; Choi Jong-Ho
    • Journal of the Korea Society of Computer and Information, v.9 no.4 s.32, pp.85-91, 2004
  • The use of gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction, which has motivated a very active research area concerned with computer vision-based analysis and interpretation of hand gestures. The most important issues in gesture recognition are simplifying the algorithm and reducing processing time. Mathematical morphology, which is based on geometrical set theory, is well suited to this kind of processing. The key idea of the proposed algorithm is to track the trajectory of center points in the primitive elements extracted by morphological shape decomposition. The trajectory of morphological center points carries information on shape orientation. Based on this characteristic, we propose a morphological gesture sequence recognition algorithm using feature vectors computed from the trajectory of morphological center points. Experiments demonstrate the efficiency of the proposed algorithm.
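
A minimal sketch of the general idea of turning a center-point trajectory into a chain-code feature sequence. The "center" here is the peak of a distance transform, used as a simple stand-in for the paper's morphological shape decomposition; OpenCV and uint8 masks are assumed.

```python
# Turn a sequence of binary hand silhouettes into an 8-direction Freeman
# chain code describing how a morphological centre point moves.
import cv2
import numpy as np

def morphological_center(mask: np.ndarray) -> tuple[int, int]:
    # mask: uint8 silhouette (0 background, 255 hand)
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    y, x = np.unravel_index(np.argmax(dist), dist.shape)
    return int(x), int(y)

def chain_code(centers: list[tuple[int, int]]) -> list[int]:
    # 8-direction chain code: 0 = east, counted counter-clockwise.
    codes = []
    for (x0, y0), (x1, y1) in zip(centers, centers[1:]):
        angle = np.arctan2(-(y1 - y0), x1 - x0)   # image y grows downward
        codes.append(int(np.round(angle / (np.pi / 4))) % 8)
    return codes

# Usage: masks is a list of per-frame hand silhouettes.
# codes = chain_code([morphological_center(m) for m in masks])
```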


Intelligent interface using hand gestures recognition based on artificial intelligence (인공지능 기반 손 체스처 인식 정보를 활용한 지능형 인터페이스)

  • Hangjun Cho; Junwoo Yoo; Eun Soo Kim; Young Jae Lee
    • Journal of Platform Technology, v.11 no.1, pp.38-51, 2023
  • We propose an intelligent interface algorithm that uses hand gesture recognition information based on artificial intelligence. The method is an interface that recognizes various motions quickly and intelligently by using MediaPipe together with artificial intelligence techniques such as KNN, LSTM, and CNN to track and recognize user hand gestures. To evaluate its performance, the proposed algorithm was applied to a self-made 2D top-view racing game and to robot control. In the game, it allowed various movements of the virtual object to be controlled in detail and robustly; applied to a robot in the real world, it could command moving, stopping, turning left, and turning right. In addition, by controlling the main character of the game and the real-world robot at the same time, the optimized motion was implemented as an intelligent interface for controlling a coexistence space of the virtual and real worlds. The proposed algorithm enables sophisticated control based on natural, intuitive use of the body and fine finger movements, and users can become proficient with it in a short time, so it can serve as basic data for developing intelligent user interfaces.
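
A minimal sketch of the MediaPipe + KNN recipe the abstract names: hand landmarks are extracted per frame and a k-nearest-neighbor classifier maps them to commands. The feature choice (wrist-relative, palm-normalized landmark coordinates), the label names, and the classifier settings are illustrative assumptions, not the authors' exact configuration.

```python
# MediaPipe hand landmarks -> feature vector -> k-NN gesture label.
import cv2
import numpy as np
import mediapipe as mp
from sklearn.neighbors import KNeighborsClassifier

hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1)

def landmark_features(bgr_frame: np.ndarray):
    """Return a normalized 63-D landmark vector, or None if no hand is found."""
    result = hands.process(cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    lm = result.multi_hand_landmarks[0].landmark
    pts = np.array([[p.x, p.y, p.z] for p in lm])   # 21 x 3
    pts -= pts[0]                                   # wrist-relative
    scale = np.linalg.norm(pts[9]) + 1e-6           # normalize by palm size
    return (pts / scale).ravel()

# Training (collected beforehand): X is a list of feature vectors,
# y the matching command labels, e.g. "forward", "stop", "left", "right".
# knn = KNeighborsClassifier(n_neighbors=5).fit(np.stack(X), y)
# command = knn.predict(landmark_features(frame)[None, :])[0]
```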


Comparative Study on the Interface and Interaction for Manipulating 3D Virtual Objects in a Virtual Reality Environment (가상현실 환경에서 3D 가상객체 조작을 위한 인터페이스와 인터랙션 비교 연구)

  • Park, Kyeong-Beom; Lee, Jae Yeol
    • Korean Journal of Computational Design and Engineering, v.21 no.1, pp.20-30, 2016
  • Recently, immersive virtual reality (VR) has become popular thanks to advances in I/O interfaces and the related software for effectively constructing VR environments. In particular, natural and intuitive manipulation of 3D virtual objects is still considered one of the most important user interaction issues. This paper presents a comparative study on the manipulation of 3D virtual objects using different interfaces and interactions in three VR environments, covering both quantitative and qualitative aspects. The three experimental setups are 1) a typical desktop-based VR using mouse and keyboard, 2) a hand-gesture-supported desktop VR using a Leap Motion sensor, and 3) an immersive VR in which the user wears an HMD and interacts by hand gestures via a Leap Motion sensor. In the desktop VR with hand gestures, the Leap Motion sensor is placed on the desk; in the immersive VR, the sensor is mounted on the HMD so that the user can manipulate virtual objects in front of the HMD. For the quantitative analysis, task completion time and success rate were measured; the tasks require complex 3D transformations such as simultaneous 3D translation and 3D rotation. For the qualitative analysis, factors relating to user experience such as ease of use, naturalness of interaction, and stressfulness were evaluated. The qualitative and quantitative analyses show that the immersive VR with natural hand gestures provides more intuitive and natural interaction and supports fast and effective task completion, but it is also more stressful.
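
A minimal sketch of the kind of per-condition aggregation behind the study's quantitative comparison (mean task completion time and success rate per setup). The trial values below are placeholders, not the study's measurements; pandas is assumed.

```python
# Aggregate completion time and success rate per experimental condition.
import pandas as pd

trials = pd.DataFrame({
    "condition": ["desktop", "desktop", "leap_desktop",
                  "leap_desktop", "hmd_leap", "hmd_leap"],
    "completion_time_s": [42.1, 38.7, 35.2, 33.9, 29.8, 31.0],  # placeholder data
    "success": [1, 0, 1, 1, 1, 1],
})

summary = trials.groupby("condition").agg(
    mean_time_s=("completion_time_s", "mean"),
    success_rate=("success", "mean"),
)
print(summary)
```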

Gesture Recognition using Global and Partial Feature Information (전역 및 부분 특징 정보를 이용한 제스처 인식)

  • Lee, Yong-Jae; Lee, Chil-Woo
    • Journal of KIISE: Software and Applications, v.32 no.8, pp.759-768, 2005
  • This paper describes an algorithm that recognizes gestures by constructing subspace gesture symbols from hybrid feature information. Previous popular methods based on geometric features and appearance have produced ambiguous output when distinguishing similar gestures, because they use only the position information of the hands and feet or bodily shape features. The proposed method can not only recognize motion but also distinguish similar gestures by combining partial feature information, which indicates which parts of the body move, with global feature information describing the 2-dimensional bodily motion. This simple and robust recognition algorithm can be applied in various applications such as surveillance systems and intelligent interface systems.
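
A minimal sketch of the general global/partial feature-fusion idea: a global motion descriptor and a per-body-part activity descriptor are normalized, concatenated, and classified in a reduced subspace. This is a generic stand-in for the paper's subspace gesture symbols; the feature definitions and classifier are assumptions.

```python
# Fuse a global motion descriptor with per-part activity and classify
# in a PCA-reduced subspace.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestCentroid
from sklearn.pipeline import make_pipeline

def fuse_features(global_motion: np.ndarray, part_activity: np.ndarray) -> np.ndarray:
    # global_motion: e.g. a flattened 2-D motion-history descriptor
    # part_activity: e.g. motion energies per body part (head, arms, legs, ...)
    return np.concatenate([
        global_motion / (np.linalg.norm(global_motion) + 1e-6),
        part_activity / (np.linalg.norm(part_activity) + 1e-6),
    ])

# X: stacked fused vectors for training clips, y: gesture labels.
# classifier = make_pipeline(PCA(n_components=16), NearestCentroid()).fit(X, y)
# label = classifier.predict(fuse_features(g, p)[None, :])[0]
```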

Android Platform based Gesture Recognition using Smart Phone Sensor Data (안드로이드 플랫폼기반 스마트폰 센서 정보를 활용한 모션 제스처 인식)

  • Lee, Yong Cheol; Lee, Chil Woo
    • Smart Media Journal, v.1 no.4, pp.18-26, 2012
  • The growing number of smartphone applications has reinforced the importance of new user interfaces and raised interest in research on the fusion of multiple sensors. In this paper, we propose a method that fuses the acceleration, magnetic, and gyro sensors to recognize gestures from the motion of a user's smartphone. The proposed method first obtains the 3D orientation of the smartphone and then recognizes the hand-motion gesture using a Hidden Markov Model (HMM). The 3D orientation is represented in spherical coordinates and quantized so that the representation is more sensitive to the rotation axis. Experimental results show a recognition success rate of 93%.
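
A minimal sketch of quantizing a fused orientation estimate on a spherical-coordinate grid, the kind of discrete symbol stream a per-gesture HMM can consume. The input is assumed to be a unit direction vector already produced by sensor fusion (accelerometer + magnetometer + gyroscope), and the bin counts are illustrative.

```python
# Map an orientation direction vector to a discrete azimuth/elevation bin.
import numpy as np

AZIMUTH_BINS = 8      # around the vertical axis
ELEVATION_BINS = 4    # from pointing down to pointing up

def orientation_symbol(direction: np.ndarray) -> int:
    x, y, z = direction / (np.linalg.norm(direction) + 1e-9)
    azimuth = (np.arctan2(y, x) + 2 * np.pi) % (2 * np.pi)   # 0 .. 2*pi
    elevation = np.arccos(np.clip(z, -1.0, 1.0))             # 0 .. pi
    a = min(int(azimuth / (2 * np.pi / AZIMUTH_BINS)), AZIMUTH_BINS - 1)
    e = min(int(elevation / (np.pi / ELEVATION_BINS)), ELEVATION_BINS - 1)
    return e * AZIMUTH_BINS + a

# A gesture becomes [orientation_symbol(d) for d in directions]; each gesture
# class then scores the symbol sequence with its own discrete HMM and the
# highest-likelihood model wins.
```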


An Outlook for Interaction Experience in Next-generation Television

  • Kim, Sung-Woo
    • Journal of the Ergonomics Society of Korea, v.31 no.4, pp.557-565, 2012
  • Objective: This paper focuses on the new trend of applying NUI (natural user interface) techniques such as gesture interaction to television, and investigates the design improvements needed for their application. The intention is to find a better design direction for NUI in the television context, which will help make the new features and behavioral changes of next-generation television practically usable and meaningful experience elements. Background: Traditional television is rapidly evolving into next-generation television under the influence of "smartness" from the mobile domain. A number of new features and behavioral changes arising from this evolution are on their way to being characterized as the new experience elements of next-generation television. Method: A series of expert reviews by television UX professionals, based on the AHP (Analytic Hierarchy Process), was conducted to assess the "relative appropriateness" of applying gesture interaction to a number of selected television user-experience scenarios. Conclusion: It is critical not to apply new interaction techniques like gesture to television indiscriminately; doing so may be effective for demonstrating new technology but generally results in a poor user experience. Consistent validation of practical appropriateness in real contexts is imperative. Application: This research will be helpful in applying gesture interaction to next-generation television so as to bring about an optimal user experience.
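
A minimal worked example of the standard AHP computation behind this kind of expert review: priority weights from the principal eigenvector of a pairwise comparison matrix plus Saaty's consistency ratio. The example matrix is a placeholder, not the study's expert judgements.

```python
# AHP priority weights and consistency ratio from a pairwise matrix.
import numpy as np

# Saaty's random-index values for n = 1..9.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise: np.ndarray):
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)                   # principal eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    cr = ci / RI[n] if RI[n] > 0 else 0.0         # consistency ratio (< 0.1 is acceptable)
    return weights, cr

# Example: three TV scenarios compared pairwise for gesture appropriateness
# (placeholder judgements).
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
weights, cr = ahp_weights(pairwise)
print(weights, cr)
```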

Human-Object Interaction Framework Using RGB-D Camera (RGB-D 카메라를 사용한 사용자-실사물 상호작용 프레임워크)

  • Baeka, Yong-Hwan; Lim, Changmin; Park, Jong-Il
    • Journal of Broadcast Engineering, v.21 no.1, pp.11-23, 2016
  • These days, touch is the most widely used interaction interface for communicating with digital devices. Because of its usability, touch technology is applied almost everywhere, from watches to advertising boards, and its use keeps growing. However, the technology has a critical weakness: a touch input device normally needs a contact surface with touch sensors embedded in it, so touch interaction through ordinary objects like books or documents is still unavailable. In this paper, a human-object interaction framework based on an RGB-D camera is proposed to overcome this limitation. The proposed framework can deal with occluded situations, such as a hand hovering on top of an object, and with objects being moved by hand, situations in which object recognition and hand gesture recognition algorithms may otherwise fail. The framework handles such complicated circumstances without performance loss: it determines the status of each region with a fast and robust object recognition algorithm to decide whether it is an object or a human hand, and the hand gesture recognition algorithm then controls the context of each object by gestures almost simultaneously.
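
A minimal sketch of one simple way to separate "hand above the surface" from "object on the surface" in an RGB-D frame: depth change against a captured background yields the foreground, and a skin-color test in YCrCb decides whether each foreground blob is a hand. This is a simplified stand-in for the paper's framework; the thresholds are assumptions.

```python
# Classify foreground blobs in an RGB-D frame as "hand" or "object".
import cv2
import numpy as np

SKIN_LO = np.array([0, 133, 77], dtype=np.uint8)     # Y, Cr, Cb lower bound
SKIN_HI = np.array([255, 173, 127], dtype=np.uint8)  # Y, Cr, Cb upper bound

def classify_foreground(depth: np.ndarray, background_depth: np.ndarray,
                        bgr: np.ndarray, min_height_mm: float = 15.0):
    # Anything sufficiently closer to the camera than the empty table is foreground.
    foreground = (background_depth.astype(np.float32)
                  - depth.astype(np.float32)) > min_height_mm
    mask = foreground.astype(np.uint8) * 255
    skin = cv2.inRange(cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb), SKIN_LO, SKIN_HI)
    label_count, labels = cv2.connectedComponents(mask)
    results = []
    for label in range(1, label_count):
        blob = labels == label
        if blob.sum() < 500:                         # ignore small noise blobs
            continue
        skin_ratio = (skin[blob] > 0).mean()
        results.append(("hand" if skin_ratio > 0.4 else "object", blob))
    return results
```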

Implementation of DID interface using gesture recognition (제스쳐 인식을 이용한 DID 인터페이스 구현)

  • Lee, Sang-Hun; Kim, Dae-Jin; Choi, Hong-Sub
    • Journal of Digital Contents Society, v.13 no.3, pp.343-352, 2012
  • In this paper, we implemented a touchless interface for a DID (Digital Information Display) system using gesture recognition techniques that include both hand motion and hand shape recognition. This touchless interface requires no extra attachments and gives the user easier operation and spatial convenience. For hand motion recognition, two motion parameters, slope and velocity, are measured in a direction-based recognition scheme. For hand shape recognition, the hand area is extracted using the YCbCr color model and several image processing methods. These recognition results are combined to generate commands such as next-page, previous-page, screen-up, screen-down, and mouse-click in order to control the DID system. Experimental results showed a command recognition rate of 93%, which is sufficient to confirm the possibility of application to commercial products.
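
A minimal sketch of the direction-based motion part of such an interface: the hand centroid track (assumed here to come from a YCbCr skin mask, as in the abstract) is reduced to a slope and a velocity, and those two values are mapped to DID commands. The command names mirror the abstract, but the thresholds are assumptions.

```python
# Map a hand-centroid track to a DID command from its slope and speed.
import numpy as np

def motion_command(track: list, fps: float = 30.0,
                   min_speed_px_s: float = 300.0):
    """track: list of (x, y) hand centroids in pixel coordinates, one per frame."""
    if len(track) < 2:
        return None
    (x0, y0), (x1, y1) = track[0], track[-1]
    dx, dy = x1 - x0, y0 - y1                       # image y grows downward
    speed = np.hypot(dx, dy) * fps / (len(track) - 1)
    if speed < min_speed_px_s:
        return None                                 # too slow: not a deliberate swipe
    if abs(dx) >= abs(dy):                          # slope decides the axis
        return "next-page" if dx > 0 else "previous-page"
    return "screen-up" if dy > 0 else "screen-down"
```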

Motion-Understanding Cell Phones for Intelligent User Interaction and Entertainment (지능형 UI와 Entertainment를 위한 동작 이해 휴대기기)

  • Cho, Sung-Jung; Choi, Eun-Seok; Bang, Won-Chul; Yang, Jing; Cho, Joon-Kee; Ki, Eun-Kwang; Sohn, Jun-Il; Kim, Dong-Yoon; Kim, Sang-Ryong
    • Proceedings of the Korean HCI Society Conference, 2006.02a, pp.684-691, 2006
  • As more functionalities such as cameras and MP3 players are converged into mobile phones, more intuitive and interesting interaction methods become essential. In this paper, we present applications and their enabling technologies for gesture-interactive cell phones. They employ a gesture recognition algorithm and a real-time shake detection algorithm to support motion-based user interfaces and entertainment applications, respectively. The gesture recognition algorithm classifies the user's movement into one of the predefined gestures by modeling basic components of the acceleration signals and their relationships; recognition performance is further enhanced by discriminating frequently confused classes with support vector machines. The shake detection algorithm detects in real time the exact moment when the phone is shaken significantly, by utilizing the variance and mean of the acceleration signals. The gesture interaction algorithms show performance reliable enough for commercialization: with 100 novice users, the average recognition rate was 96.9% on 11 gestures (digits 1-9, O, X), and users' movements were detected in real time. We have applied these motion-understanding technologies to Samsung cell phones in the Korean, American, Chinese, and European markets since May 2005.
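
A minimal sketch of shake detection from a sliding window of accelerometer samples, following the abstract's idea of thresholding the window's mean and variance. The window length and thresholds are illustrative assumptions.

```python
# Detect shake onsets from 3-axis acceleration using windowed mean/variance.
import numpy as np

def detect_shakes(accel: np.ndarray, window: int = 20,
                  var_threshold: float = 4.0, mean_threshold: float = 11.0):
    """accel: (N, 3) acceleration samples in m/s^2.
    Returns sample indices at which a shake is considered to start."""
    magnitude = np.linalg.norm(accel, axis=1)
    shakes, in_shake = [], False
    for i in range(window, len(magnitude)):
        w = magnitude[i - window:i]
        active = w.var() > var_threshold and w.mean() > mean_threshold
        if active and not in_shake:
            shakes.append(i)                        # shake onset
        in_shake = active
    return shakes
```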
