• Title/Summary/Keyword: gesture tracking


Interactive visual knowledge acquisition for hand-gesture recognition (손 제스쳐 인식을 위한 상호작용 시각정보 추출)

  • Yang, Seon-Ok;Choi, Hyung-Il
    • Journal of the Korean Institute of Telematics and Electronics B / v.33B no.9 / pp.88-96 / 1996
  • Computer vision-based gesture recognition systems consist of image segmentation, object tracking, and decision. However, it is difficult to segment an object from an image for gesture recognition because of varying illuminations and backgrounds. In this paper, we describe a method to learn features for segmentation, which improves the performance of computer vision-based hand-gesture recognition systems. The system interacts with a user to acquire exact training data and segmentation information according to a predefined plan: it presents models to the user, takes pictures of the user's response, and then analyzes the pictures using the models and prior knowledge. The system sends messages to the user and runs a learning module to extract information from the analyzed result.
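The interactive acquisition loop the abstract outlines can be sketched as follows. This is a hedged illustration only: the function names, the dictionary-based "models", and the stand-in pixel analysis are all invented for the example, not taken from the paper.

```python
# Hypothetical sketch of an interactive training-data acquisition loop:
# the system presents a model pose, captures the user's response, and
# updates its segmentation knowledge from the analysed sample.

def analyse(picture, model):
    """Stand-in analysis: count pixels matching the model's colour cue."""
    return sum(1 for px in picture if px == model["colour"])

def interactive_acquisition(models, capture):
    """Run the predefined plan: one capture-and-analyse step per model."""
    knowledge = {}
    for model in models:
        picture = capture(model)          # user responds to the shown model
        score = analyse(picture, model)
        knowledge[model["name"]] = score  # learned segmentation cue
    return knowledge

models = [{"name": "open_hand", "colour": "skin"},
          {"name": "fist", "colour": "skin"}]
fake_capture = lambda m: ["skin", "skin", "bg", "skin"]  # toy "camera"
print(interactive_acquisition(models, fake_capture))
```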


Gesture based Natural User Interface for e-Training

  • Lim, C.J.;Lee, Nam-Hee;Jeong, Yun-Guen;Heo, Seung-Il
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.577-583 / 2012
  • Objective: This paper describes the process and results of developing a gesture recognition-based natural user interface (NUI) for a vehicle-maintenance e-Training system. Background: E-Training refers to education and training that builds and improves the capabilities needed to perform tasks by using information and communication technology (simulation, 3D virtual reality, and augmented reality), devices (PC, tablet, smartphone, and HMD), and environments (wired/wireless internet and cloud computing). Method: Palm movement captured by a depth camera is used as a pointing device, and finger movement, extracted using the OpenCV library, serves as the selection protocol. Results: The proposed NUI allows trainees to control objects, such as cars and engines, on a large screen through gesture recognition. In addition, it includes a learning environment for understanding the procedure of assembling or disassembling certain parts. Conclusion: Future work concerns the implementation of gesture recognition technology for multiple trainees. Application: The results of this interface can be applied not only to e-Training systems, but also to other systems, such as digital signage, tangible games, and 3D content control.
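The palm-as-pointer idea in the Method section can be illustrated with a minimal coordinate mapping. This is a sketch under assumed resolutions, not the paper's implementation; the function names and the "closed hand means select" rule are hypothetical.

```python
# Minimal sketch: map a tracked palm position from depth-camera space to
# screen space so the palm acts as a pointing device, and treat a closed
# hand (few extended fingers) as a selection. Names are illustrative.

def palm_to_screen(palm_xy, cam_res=(640, 480), screen_res=(1920, 1080)):
    """Linearly map palm coordinates to screen coordinates."""
    x, y = palm_xy
    sx = x / cam_res[0] * screen_res[0]
    sy = y / cam_res[1] * screen_res[1]
    return (sx, sy)

def is_select(finger_count, threshold=1):
    """Selection fires when at most `threshold` fingers are extended."""
    return finger_count <= threshold

cursor = palm_to_screen((320, 240))
print(cursor)        # (960.0, 540.0): centre of the screen
print(is_select(0))  # True: closed fist acts as a click
```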

A Study on the Gesture Recognition Using the Particle Filter Algorithm (Particle Filter를 이용한 제스처 인식 연구)

  • Lee, Yang-Weon;Kim, Chul-Won
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.11 / pp.2032-2038 / 2006
  • The recognition of human gestures in image sequences is an important and challenging problem that enables a host of human-computer interaction applications. This paper describes a gesture recognition algorithm based on the particle filter, namely CONDENSATION. The particle filter is more efficient than other tracking algorithms because its tracking mechanism follows the Bayesian estimation rule of conditional probability propagation. We used two models for the evaluation of the particle filter and applied MATLAB for the preprocessing of the image sequences, but implemented the particle filter itself in C++ for high-speed processing. The experimental results demonstrate that the proposed algorithm is robust in cluttered environments.

A Study on the Gesture Recognition Based on the Particle Filter Using CONDENSATION Algorithm (CONDENSATION 알고리즘을 이용한 입자필터 기반 동작 인식 연구)

  • Lee, Yang-Weon
    • Journal of the Korea Institute of Information and Communication Engineering / v.11 no.3 / pp.584-591 / 2007
  • The recognition of human gestures in image sequences is an important and challenging problem that enables a host of human-computer interaction applications. This paper describes a gesture recognition algorithm based on the particle filter, namely CONDENSATION. The particle filter is more efficient than other tracking algorithms because its tracking mechanism follows the Bayesian estimation rule of conditional probability propagation. We used two models for the evaluation of the particle filter and applied MATLAB for the preprocessing of the image sequences, but implemented the particle filter itself in C++ for high-speed processing. The experimental results demonstrate that the proposed algorithm is robust in cluttered environments.
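The CONDENSATION predict-weight-resample cycle described in these two abstracts can be sketched in one dimension. This is an illustrative toy, not the authors' C++ code: the diffusion model, noise levels, and Gaussian observation likelihood are assumptions chosen for the example.

```python
import random
import math

# A 1-D CONDENSATION-style step: diffuse particles (predict), weight
# them by an observation likelihood, then resample in proportion to the
# weights (the conditional probability propagation the abstract cites).

def condensation_step(particles, observation, noise=1.0, obs_std=2.0, rng=None):
    rng = rng or random.Random(0)  # fixed seed keeps the sketch deterministic
    # 1. Predict: diffuse each particle with Gaussian process noise.
    predicted = [p + rng.gauss(0.0, noise) for p in particles]
    # 2. Weight: Gaussian likelihood of the observation given the particle.
    weights = [math.exp(-((p - observation) ** 2) / (2 * obs_std ** 2))
               for p in predicted]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Resample: draw a new particle set proportional to the weights.
    return rng.choices(predicted, weights=weights, k=len(predicted))

particles = [0.0] * 100
for obs in [1.0, 2.0, 3.0]:   # the target drifts to the right
    particles = condensation_step(particles, obs)
estimate = sum(particles) / len(particles)
print(round(estimate, 2))     # the particle mean moves toward the observations
```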

Real-time hand tracking and recognition based on structured template matching (구조적 템플렛 매칭에 기반을 둔 실시간 손 추적 및 인식)

  • Kim, Song-Gook;Bae, Ki-Tae;Lee, Chil-Woo
    • Proceedings of the HCI Society of Korea Conference / 2006.02a / pp.1037-1043 / 2006
  • In this paper, we propose a system that allows applications on a large screen to be controlled easily through hand gestures, the most intuitive HCI modality, in a ubiquitous-computing office environment. The gestures needed to control the system are predefined using hand-region information, changes in the position of the hand's center point, and finger shape. First, to acquire the hand region efficiently, consecutive images are captured with an infrared camera. From the acquired image frames, the centroid of the hand and the fingertips are detected using a structured template matching method. In the recognition stage, the Euclidean distance between both hands and the finger-shape information are compared against the predefined gestures. The vision-based hand-gesture control system proposed in this paper can offer many benefits for understanding human-computer interaction. Experimental results demonstrate the effectiveness of the proposed method.
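The centroid and fingertip detection the abstract describes can be sketched geometrically: take the centroid of the hand-contour points as the hand center, and the contour point farthest from it as a fingertip candidate. This is a hedged illustration of the idea, not the paper's structured template matching itself.

```python
# Illustrative sketch: hand centre as the contour centroid, fingertip as
# the contour point farthest from that centroid. Data are invented.

def centroid(points):
    """Arithmetic mean of the contour points."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def fingertip(points):
    """Contour point with the largest squared distance from the centroid."""
    cx, cy = centroid(points)
    return max(points, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)

contour = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 6)]  # one point juts out like a finger
print(centroid(contour))   # (1.0, 2.0)
print(fingertip(contour))  # (1, 6): the extended point
```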


AdaBoost-Based Gesture Recognition Using Time Interval Trajectory Features (시간 간격 특징 벡터를 이용한 AdaBoost 기반 제스처 인식)

  • Hwang, Seung-Jun;Ahn, Gwang-Pyo;Park, Seung-Je;Baek, Joong-Hwan
    • Journal of Advanced Navigation Technology / v.17 no.2 / pp.247-254 / 2013
  • The task of 3D gesture recognition for controlling equipment is highly challenging, and has gained importance with the recent proliferation of 3D smart TVs. In this paper, the AdaBoost algorithm is applied to 3D gesture recognition using a Kinect sensor. By tracking the time-interval trajectories of the hand, wrist, and arm with Kinect, the AdaBoost algorithm is used to train and classify 3D gestures. Experimental results demonstrate that the proposed method can successfully extract trained gestures from continuous hand, wrist, and arm motion in real time.
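The boosting step can be sketched with one-dimensional decision stumps. This is a toy under invented assumptions: the scalar "net horizontal displacement" feature and the swipe labels are illustrative stand-ins for the paper's time-interval trajectory features.

```python
import math

# Toy AdaBoost with 1-D decision stumps. Each sample is a scalar
# trajectory feature (e.g. net horizontal hand displacement); labels
# are +1 (swipe right) / -1 (swipe left). Data are invented.

def train_adaboost(xs, ys, rounds=5):
    n = len(xs)
    w = [1.0 / n] * n
    stumps = []  # (threshold, polarity, alpha)
    for _ in range(rounds):
        best = None
        # Exhaustively pick the stump with the lowest weighted error.
        for t in set(xs):
            for pol in (1, -1):
                preds = [pol if x > t else -pol for x in xs]
                err = sum(wi for wi, p, y in zip(w, preds, ys) if p != y)
                if best is None or err < best[0]:
                    best = (err, t, pol, preds)
        err, t, pol, preds = best
        err = max(err, 1e-10)                      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)    # stump's vote weight
        stumps.append((t, pol, alpha))
        # Reweight: misclassified samples gain weight for the next round.
        w = [wi * math.exp(-alpha * y * p) for wi, y, p in zip(w, ys, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return stumps

def predict(stumps, x):
    """Sign of the alpha-weighted vote of all stumps."""
    score = sum(alpha * (pol if x > t else -pol) for t, pol, alpha in stumps)
    return 1 if score > 0 else -1

xs = [-3.0, -2.0, -1.5, 1.0, 2.5, 3.0]  # displacement features
ys = [-1, -1, -1, 1, 1, 1]
model = train_adaboost(xs, ys)
print(predict(model, 2.0))    # 1: classified as swipe right
print(predict(model, -2.0))   # -1: classified as swipe left
```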

Investigating Key User Experience Factors for Virtual Reality Interactions

  • Ahn, Junyoung;Choi, Seungho;Lee, Minjae;Kim, Kyungdoh
    • Journal of the Ergonomics Society of Korea / v.36 no.4 / pp.267-280 / 2017
  • Objective: The aim of this study is to investigate key user experience factors of interactions for Head Mounted Display (HMD) devices in the Virtual Reality Environment (VRE). Background: Virtual reality interaction research has been conducted steadily as interaction methods and virtual reality devices have improved. Recently, virtual reality devices have all been head mounted display based, and HMD-based interaction types include Remote Controller, Head Tracking, and Hand Gesture. However, there are few studies on the usability evaluation of virtual reality; in particular, the usability of HMD-based virtual reality has not been investigated. It is therefore necessary to study the usability of HMD-based virtual reality. Method: HMD-based VR devices released recently offer only three interaction types: 'Remote Controller', 'Head Tracking', and 'Hand Gesture'. We surveyed 113 studies to identify the user experience factors or evaluation scales by interaction type, then summarized the key factors and relevant scales according to how frequently they appeared. Results: Key user experience factors vary by interaction type. The Remote Controller's key factors are 'Ease of learning', 'Ease of use', 'Satisfaction', 'Effectiveness', and 'Efficiency'. Head Tracking's key factors are 'Sickness', 'Immersion', 'Intuitiveness', 'Stress', 'Fatigue', and 'Ease of learning'. Finally, Hand Gesture's key factors are 'Ease of learning', 'Ease of use', 'Feedback', 'Consistent', 'Simple', 'Natural', 'Efficiency', 'Responsiveness', 'Usefulness', 'Intuitiveness', and 'Adaptability'. Conclusion: We identified key user experience factors for each interaction type through a literature review. However, we did not consider objective measures, because each study adopted different performance factors. Application: The results of this study can be used when evaluating HMD-based interactions in virtual reality in terms of usability.

Verification Process for Stable Human Detection and Tracking (안정적 사람 검출 및 추적을 위한 검증 프로세스)

  • Ahn, Jung-Ho;Choi, Jong-Ho
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.4 no.3 / pp.202-208 / 2011
  • Recently, technologies that control computer systems through human-computer interaction (HCI) have been widely studied. Their applications usually locate the user's position via face detection and recognize the user's gestures, but face detection performance is often not good enough. When an application cannot locate the user's position stably, user-interface performance, such as gesture recognition, decreases significantly. In this paper we propose a new stable face detection algorithm using skin color detection and the cumulative distribution of face detection results, whose effectiveness was verified by experiments. The proposed algorithm is applicable to human tracking based on correspondence matrix analysis.
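The two ingredients the abstract combines can be sketched separately: a per-pixel skin-color test and a cumulative check over recent frames. Both are hedged stand-ins: the RGB rule is a common rule of thumb, not the authors' detector, and the frame-history threshold is an invented simplification of their cumulative distribution.

```python
# Hedged sketch: a rule-of-thumb RGB skin test plus a cumulative count
# of detections over recent frames, accepting a face location as stable
# only after repeated detections. Thresholds are illustrative.

def is_skin(r, g, b):
    """Approximate RGB skin-colour rule (a common heuristic)."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b and (r - min(g, b)) > 15)

def stable_detection(history, min_hits=3):
    """Accept a detection only if it appeared in enough recent frames."""
    return sum(history) >= min_hits

print(is_skin(200, 120, 90))           # True: plausible skin tone
print(is_skin(40, 60, 200))            # False: blue pixel
print(stable_detection([1, 1, 0, 1]))  # True: detected in 3 of 4 frames
```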

A Prototype Design for a Real-time VR Game with Hand Tracking Using Affordance Elements

  • Yu-Won Jeong
    • Journal of the Korea Society of Computer and Information / v.29 no.5 / pp.47-53 / 2024
  • In this paper, we propose applying interactive technology in virtual environments to enhance interaction and immersion by inducing more natural movements in the gesture recognition process through the concept of affordance. A technique is proposed to recognize gestures most similar to actual hand movements by applying a line segment recognition algorithm that incorporates sampling and normalization steps into the gesture recognition process. This line segment recognition was applied to the drawing of magic circles in the <VR Spell> game implemented in this paper. The experiments verified the recognition rates of four line segment recognition actions. This paper proposes a VR game that pursues greater immersion and fun for the user through real-time hand tracking using affordance elements, applied to immersive content in virtual environments such as VR games.
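The sampling and normalization steps the abstract mentions can be sketched in the style of unistroke recognizers: resample the stroke to a fixed number of equally spaced points, then translate it to a canonical position. The function names and parameters below are illustrative assumptions, not taken from the paper.

```python
import math

# Sketch of stroke preprocessing for line-segment gesture recognition:
# resample to n equally spaced points, then centre on the origin.

def resample(points, n=8):
    """Resample a stroke to n points equally spaced along its length."""
    lengths = [math.dist(a, b) for a, b in zip(points, points[1:])]
    step = sum(lengths) / (n - 1)
    out = [points[0]]
    d = 0.0
    pts = list(points)
    i = 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if d + seg >= step:
            # Interpolate a new point exactly `step` along the stroke.
            t = (step - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)
            d = 0.0
        else:
            d += seg
        i += 1
    if len(out) < n:          # guard against floating-point shortfall
        out.append(pts[-1])
    return out

def normalize(points):
    """Translate the stroke so its centroid sits at the origin."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return [(x - cx, y - cy) for x, y in points]

stroke = [(0, 0), (10, 0)]            # a horizontal line segment
pts = normalize(resample(stroke, 5))
print(pts)                            # 5 points centred on the origin
```

A recognizer would then compare the normalized points against stored templates (e.g. by average point-to-point distance) to pick the closest line-segment gesture.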

RealBook: A Tangible Electronic Book Based on the Interface of TouchFace-V (RealBook: TouchFace-V 인터페이스 기반 실감형 전자책)

  • Song, Dae-Hyeon;Bae, Ki-Tae;Lee, Chil-Woo
    • The Journal of the Korea Contents Association / v.13 no.12 / pp.551-559 / 2013
  • In this paper, we propose a tangible RealBook based on the TouchFace-V interface, which can recognize multi-touch and hand gestures. TouchFace-V applies projection technology to a flat surface, such as a table, without spatial constraints. The system's configuration addresses the installation, calibration, and portability issues of most existing front-projected vision-based tabletop displays. It provides hand touch and gesture input through computer-vision tracking technology, without sensors or traditional input devices. The RealBook combines the analog sensibility of printed text with the multimedia effects of an e-book, and provides digitally created stories whose experiences and environments differ according to the user's choices on the book's interface. We propose a new concept of electronic book, named RealBook, distinct from existing e-books, together with the TouchFace-V interface, which enables more direct viewing and natural, intuitive interaction through hand touch and gesture.