• Title/Summary/Keyword: Gesture Recognition.

Search Results: 558

MRF Particle filter-based Multi-Touch Tracking and Gesture Likelihood Estimation (MRF 입자필터 멀티터치 추적 및 제스처 우도 측정)

  • Oh, Chi-Min;Shin, Bok-Suk;Klette, Reinhard;Lee, Chil-Woo
    • Smart Media Journal
    • /
    • v.4 no.1
    • /
    • pp.16-24
    • /
    • 2015
  • In this paper, we propose a method for multi-touch tracking using MRF-based particle filters and for gesture likelihood estimation. Each touch (of one finger) is considered to be one object. A frequently occurring issue is the hijacking problem, in which an object tracker is hijacked by a neighboring object. If a predicted particle is close to an adjacent object, the particle's weight should be lowered by analyzing the influence of neighboring objects, thereby avoiding the hijacking problem. We define a penalty function to lower the weights of such particles. An MRF is a graph representation in which a node is the location of a target object and an edge describes the adjacency relation between target objects, so it is straightforward to use an MRF as the data structure for adjacent objects. Moreover, since the MRF graph representation is helpful for analyzing multi-touch gestures, we describe how to define gesture likelihoods based on the MRF. The experimental results show that the proposed method avoids hijacking problems and estimates gesture likelihoods with high accuracy.
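The penalty idea in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian repulsion form, the `sigma` bandwidth, and all function names are assumptions.

```python
import numpy as np

def penalty(particle_pos, neighbor_positions, sigma=15.0):
    """Penalty in [0, 1]: near 1 far from all neighboring targets,
    dropping toward 0 as the particle approaches a neighbor."""
    if len(neighbor_positions) == 0:
        return 1.0
    d = np.linalg.norm(np.asarray(neighbor_positions) - particle_pos, axis=1)
    # Gaussian repulsion from the closest neighboring target (assumed form)
    return float(1.0 - np.exp(-d.min() ** 2 / (2 * sigma ** 2)))

def reweight(particles, weights, likelihoods, neighbor_positions):
    """Multiply each particle's observation likelihood by the penalty,
    so particles drifting toward a neighboring touch lose weight."""
    w = np.array([lik * penalty(p, neighbor_positions)
                  for p, lik in zip(particles, likelihoods)]) * weights
    return w / w.sum()
```

A particle sitting on top of a neighboring touch ends up with near-zero weight after normalization, which is exactly the hijacking-avoidance behavior the abstract describes.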

A Hand Gesture Recognition System using 3D Tracking Volume Restriction Technique (3차원 추적영역 제한 기법을 이용한 손 동작 인식 시스템)

  • Kim, Kyung-Ho;Jung, Da-Un;Lee, Seok-Han;Choi, Jong-Soo
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.50 no.6
    • /
    • pp.201-211
    • /
    • 2013
  • In this paper, we propose a hand tracking and gesture recognition system. Our system employs a depth capture device to obtain 3D geometric information about the user's bare hand. In particular, we build a flexible tracking volume and restrict the hand tracking area, so that we can avoid the diverse problems caused by conventional object detection/tracking systems. The proposed system computes a running average of the hand position, and the tracking volume is actively adjusted according to statistical information computed from the uncertainty of the user's hand motion in 3D space. Once the position of the user's hand is obtained, the system attempts to detect stretched fingers to recognize finger gestures. To test the proposed framework, we built an NUI system using the proposed technique and verified that it performs very stably even when multiple objects exist simultaneously in a crowded environment, as well as when the scene is temporarily occluded. We also verified that our system maintains a running speed of 24-30 frames per second throughout the experiments.
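The running-average-plus-adaptive-volume idea might look like the sketch below. The abstract does not specify the exact statistics, so the exponential running average, the spherical volume, and the `alpha`/`k`/`min_radius` parameters are all assumptions.

```python
import numpy as np

class TrackingVolume:
    """Spherical tracking volume centered on a running average of the
    hand position; the radius grows with motion uncertainty (variance)."""
    def __init__(self, alpha=0.2, k=3.0, min_radius=50.0):
        self.alpha, self.k, self.min_radius = alpha, k, min_radius
        self.mean = None
        self.var = np.zeros(3)

    def update(self, pos):
        pos = np.asarray(pos, dtype=float)
        if self.mean is None:
            self.mean = pos
        else:
            diff = pos - self.mean
            self.mean += self.alpha * diff                      # exponential running average
            self.var = (1 - self.alpha) * (self.var + self.alpha * diff ** 2)
        return self.mean, self.radius()

    def radius(self):
        # k standard deviations of recent motion, never below a floor (in mm)
        return max(self.min_radius, self.k * float(np.sqrt(self.var.sum())))

    def contains(self, pos):
        return np.linalg.norm(np.asarray(pos, dtype=float) - self.mean) <= self.radius()
```

Candidate detections outside `contains(...)` would be rejected, which restricts tracking to the neighborhood of the hand and sidesteps distractor objects elsewhere in the scene.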

A Compensation Algorithm for the Position of User Hands Based on Moving Mean-Shift for Gesture Recognition in HRI System (HRI 시스템에서 제스처 인식을 위한 Moving Mean-Shift 기반 사용자 손 위치 보정 알고리즘)

  • Kim, Tae-Wan;Kwon, Soon-Ryang;Lee, Dong Myung
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.40 no.5
    • /
    • pp.863-870
    • /
    • 2015
  • A Compensation Algorithm for the Position of the User's Hands based on the Moving Mean-Shift ($CAPUH_{MMS}$) for a Human Robot Interface (HRI) system using the Kinect sensor is proposed in this paper in order to improve gesture recognition performance. The average error improvement ratio of the trajectories ($AEIR_{TJ}$) in left-right hand movements for the $CAPUH_{MMS}$ is compared with that of other compensation algorithms, namely the Compensation Algorithm based on the Kalman Filter ($CA_{KF}$) and the Compensation Algorithm based on the Least-Squares Method ($CA_{LSM}$), using the developed real-time performance simulator. As a result, the $AEIR_{TJ}$ in up-down hand movements of the $CAPUH_{MMS}$ is measured as 19.35%, which is higher than that of the $CA_{KF}$ (13.88%) and the $CA_{LSM}$ (16.68%).
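The abstract does not define the moving mean-shift formulation, so as a stand-in for the general idea of compensating noisy Kinect hand positions, here is a generic sliding-window moving-mean smoother; the class name and window size are hypothetical.

```python
from collections import deque

class MovingMeanCompensator:
    """Sliding-window moving mean over recent hand positions: a generic
    stand-in for trajectory compensation (the window size is an assumption)."""
    def __init__(self, window=5):
        self.buf = deque(maxlen=window)   # oldest sample drops out automatically

    def compensate(self, pos):
        self.buf.append(pos)
        n = len(self.buf)
        # Average each coordinate over the samples currently in the window
        return tuple(sum(p[i] for p in self.buf) / n for i in range(len(pos)))
```

A Kalman filter or least-squares fit (the paper's comparison baselines) would replace `compensate` with a model-based prediction step rather than a windowed average.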

Hand Gesture Recognition Using Shape Decomposition (형상 분해를 이용한 손동작 인식)

  • Choi, Junyeong;Park, Jong-Il
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2010.11a
    • /
    • pp.223-224
    • /
    • 2010
  • In this paper, we propose a hand gesture recognition method using shape decomposition. By applying shape decomposition to hand gesture recognition, flexible recognition of various gestures becomes possible, and by efficiently adapting the existing shape decomposition method to hand shapes, real-time computation is achieved.

Emotion Recognition of Facial Expression using the Hybrid Feature Extraction (혼합형 특징점 추출을 이용한 얼굴 표정의 감성 인식)

  • Byun, Kwang-Sub;Park, Chang-Hyun;Sim, Kwee-Bo
    • Proceedings of the KIEE Conference
    • /
    • 2004.05a
    • /
    • pp.132-134
    • /
    • 2004
  • Emotion recognition between humans is performed using a combination of various features: the face, voice, gestures, and so on. Among these, the face reveals emotional expression most clearly. Humans express and recognize emotions using the complex and varied features of the face. This paper proposes hybrid feature extraction for emotion recognition from facial expressions. Hybrid feature extraction imitates the human emotion recognition system by combining geometric-feature-based extraction with color-distribution histograms. That is, it can perform emotion recognition robustly by extracting many features of a facial expression.
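The "hybrid" combination described above amounts to concatenating two feature vectors. A minimal sketch, assuming 2D landmark coordinates and a grayscale facial patch (the specific landmarks, bin count, and function names are not from the paper):

```python
import numpy as np

def geometric_features(landmarks):
    """Pairwise distances between facial landmarks (e.g. eye corners,
    brow points, mouth corners) as the geometric feature vector."""
    pts = np.asarray(landmarks, dtype=float)
    i, j = np.triu_indices(len(pts), k=1)
    return np.linalg.norm(pts[i] - pts[j], axis=1)

def color_histogram(patch, bins=8):
    """Normalized intensity histogram of a facial region (grayscale patch)."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def hybrid_features(landmarks, patch):
    """Concatenate both cues into one vector for an emotion classifier."""
    return np.concatenate([geometric_features(landmarks), color_histogram(patch)])
```

With 4 landmarks this yields 6 pairwise distances plus 8 histogram bins, i.e. a 14-dimensional vector that a downstream classifier would consume.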

Implementation of Hand-Gesture-Based Augmented Reality Interface on Mobile Phone (휴대폰 상에서의 손동작 기반 증강현실 인터페이스 구현)

  • Choi, Jun-Yeong;Park, Han-Hoon;Park, Jung-Sik;Park, Jong-Il
    • Journal of Broadcast Engineering
    • /
    • v.16 no.6
    • /
    • pp.941-950
    • /
    • 2011
  • With the recent advances in the performance of mobile phones, many effective interfaces for them have been proposed. This paper implements a hand-gesture-and-vision-based interface on a mobile phone. It assumes a natural interaction scenario in which the user holds the phone in one hand and views the other hand's palm through the phone's camera. A virtual object is then rendered on the palm and reacts to hand and finger movements. Since the implemented interface is based on the hand, which is familiar to humans, and does not require any additional sensors or markers, the user can freely interact with the virtual object anytime and anywhere without any training. The implemented interface runs at 5 fps on a mobile phone (a Galaxy S2 with a dual-core processor).

Design and Development of Virtual Reality Exergame using Smart mat and Camera Sensor (스마트매트와 카메라 센서를 이용한 가상현실 체험형 운동게임 시스템 설계 및 구현)

  • Seo, Duck Hee;Park, Kyung Shin;Kim, Dong Keun
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.20 no.12
    • /
    • pp.2297-2304
    • /
    • 2016
  • In this study, we designed and developed a virtual reality exergame using a smart mat and a camera sensor for exercising in indoor environments. To detect the gestures of the user's upper body, a Kinect-camera-based gesture recognition algorithm using the angles between the user's joints was adopted, and a smart mat system including LED equipment and a Bluetooth communication module was developed to capture the user's stepping data during exercises that require both gestures and steps. Finally, the integrated virtual reality exergame system was implemented with the Unity 3D engine and various virtual avatar characters, along with entertainment game content such as a displayed gesture guideline and a scoring function. The designed system should therefore be useful for elderly people who need to improve cognitive ability and sense of balance, or for general users who want to improve their fitness, in indoor settings such as homes or wellness centers.

On-line Motion Control of Avatar Using Hand Gesture Recognition (손 제스터 인식을 이용한 실시간 아바타 자세 제어)

  • Kim, Jong-Sung;Kim, Jung-Bae;Song, Kyung-Joon;Min, Byung-Eui;Bien, Zeung-Nam
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.36C no.6
    • /
    • pp.52-62
    • /
    • 1999
  • This paper presents a system that recognizes dynamic hand gestures on-line for controlling the motion of a human avatar in a virtual environment (VE). A dynamic hand gesture is a method of communication between a computer and a human being who uses gestures, especially both hands and fingers. The human avatar has 32 degrees of freedom (DOF) for natural motion in the VE and navigates by means of 8 pre-defined dynamic hand gestures. Inverse kinematics and dynamic kinematics are applied for real-time motion control of the avatar. In this paper, we apply a fuzzy min-max neural network and a feature analysis method using fuzzy logic for on-line dynamic hand gesture recognition.
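A fuzzy min-max network classifies by hyperbox membership: a feature vector gets full membership inside a class hyperbox and decaying membership outside it. The sketch below illustrates that core idea only; the linear decay form, `gamma`, and the two-class example are assumptions, not the paper's configuration.

```python
import numpy as np

def hyperbox_membership(x, vmin, vmax, gamma=4.0):
    """Fuzzy min-max style membership: 1.0 inside the hyperbox
    [vmin, vmax], decreasing with distance outside it
    (gamma controls how fast membership falls off)."""
    x, vmin, vmax = (np.asarray(a, dtype=float) for a in (x, vmin, vmax))
    below = np.maximum(0.0, vmin - x)   # per-dimension distance below the lower bound
    above = np.maximum(0.0, x - vmax)   # per-dimension distance above the upper bound
    per_dim = 1.0 - np.minimum(1.0, gamma * (below + above))
    return float(per_dim.mean())

def classify(x, hyperboxes):
    """Pick the gesture class whose hyperbox gives the highest membership.
    `hyperboxes` maps a class label to its (vmin, vmax) pair."""
    return max(hyperboxes, key=lambda label: hyperbox_membership(x, *hyperboxes[label]))
```

In the full algorithm the hyperboxes themselves are learned on-line by expanding a box to cover each new training sample and contracting overlapping boxes of different classes; only the recall step is shown here.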
