• Title/Abstract/Keyword: Gesture Control

Search results: 187 (processing time: 0.026 s)

The Natural Way of Gestures for Interacting with Smart TV

  • Choi, Jin-Hae;Hong, Ji-Young
    • 대한인간공학회지
    • /
    • Vol. 31, No. 4
    • /
    • pp.567-575
    • /
    • 2012
  • Objective: The aim of this study is to derive an optimal mental model by investigating users' natural behavior when controlling a smart TV with mid-air gestures, and to identify which factor most strongly shapes that controlling behavior. Background: Many TV companies are trying to find a simple control method for increasingly complex smart TVs. Although plenty of gesture studies propose possible alternatives to resolve this pain point, no gesture work is yet well fitted to the smart TV market, so optimal gestures for it still need to be found. Method: (1) Elicit the core control scenes through an in-house study. (2) Observe and analyze 20 users' natural behavior by type of hand-held device and control scene; we also built taxonomies for the observed gestures. Results: Users perform more manipulative gestures than symbolic gestures when they attempt continuous control. Conclusion: The most natural way to control a smart TV remotely with gestures is to give the user a mental model of grabbing and manipulating virtual objects in mid-air. Application: The results of this work might help establish gesture interaction guidelines for smart TVs.

손 제스처 기반의 애완용 로봇 제어 (Hand gesture based a pet robot control)

  • 박세현;김태의;권경수
    • 한국산업정보학회논문지
    • /
    • Vol. 13, No. 4
    • /
    • pp.145-154
    • /
    • 2008
  • This paper proposes a system that controls a pet robot by recognizing the user's hand gestures in an image sequence acquired from a camera mounted on the robot. The proposed system consists of four stages: hand detection, feature extraction, gesture recognition, and robot control. First, the hand region is detected in the input image using a skin-color model defined in the HSI color space together with connected-component analysis. Next, features are extracted from the shape and motion of the hand region over the image sequence; hand shape is taken into account to distinguish meaningful gestures. Hand gestures are then recognized with a hidden Markov model whose inputs are symbols quantized from the hand motion. Finally, the pet robot performs the command corresponding to the recognized gesture. Gestures such as "sit", "stand up", "lie down", and "shake hands" were defined as commands for controlling the pet robot. Experimental results show that a user can control the pet robot with gestures using the proposed system.

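The detection stage described in this abstract (skin-color segmentation in a hue-based color space followed by connected-component analysis) can be illustrated with a short OpenCV sketch. This is a minimal illustration, not the authors' code: it uses OpenCV's HSV conversion as a stand-in for the HSI space mentioned in the paper, and the threshold values are assumptions that would need tuning.

```python
# Minimal sketch of the hand-detection stage: skin-color segmentation
# followed by connected-component analysis. HSV is used as a stand-in for
# the paper's HSI color space; threshold values are illustrative assumptions.
import cv2
import numpy as np

def detect_hand_region(bgr_frame):
    """Return ((x, y, w, h), (cx, cy)) of the largest skin-colored blob, or None."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)

    # Rough skin-color range in HSV (assumed values; tune per camera and lighting).
    lower = np.array([0, 40, 60], dtype=np.uint8)
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)

    # Clean up the mask before labeling connected components.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # Connected-component analysis: keep the largest blob, assumed to be the hand.
    n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    if n_labels <= 1:
        return None  # no skin-colored region found
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    x, y, w, h, _area = stats[largest]
    return (x, y, w, h), tuple(centroids[largest])
```

The per-frame centroid returned here could then be quantized into motion symbols and fed to an HMM recognizer, which is the role the abstract assigns to the later stages of the pipeline.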

지능형 로봇 구동을 위한 제스처 인식 기술 동향 (Survey: Gesture Recognition Techniques for Intelligent Robot)

  • 오재용;이칠우
    • 제어로봇시스템학회논문지
    • /
    • Vol. 10, No. 9
    • /
    • pp.771-778
    • /
    • 2004
  • Recently, various applications of robot systems have become more popular in accordance with the rapid development of computer hardware and software, artificial intelligence, and automatic control technology. Formerly, robots were mainly used in industrial fields; nowadays, however, they are expected to play an important role in home service applications. To make robots more useful, further research is required on natural communication methods between humans and robot systems and on autonomous behavior generation. Gesture recognition is one of the most convenient methods for natural human-robot interaction, so it must be addressed for the implementation of intelligent robot systems. In this paper, we describe the state of the art of advanced gesture recognition technologies for intelligent robots according to four approaches: sensor-based, feature-based, appearance-based, and 3D model-based methods. We also discuss open problems and real applications in this research field.

Leap Motion 시스템을 이용한 손동작 인식기반 제어 인터페이스 기술 연구 (A new study on hand gesture recognition algorithm using leap motion system)

  • 남재현;양승훈;허웅;김병규
    • 한국멀티미디어학회논문지
    • /
    • Vol. 17, No. 11
    • /
    • pp.1263-1269
    • /
    • 2014
  • With the rapid development of new hardware control interface technology, new concepts are continually being proposed. In this paper, a new approach based on the Leap Motion system is proposed: using position information from the sensor, hand gestures are recognized against pre-defined patterns. To do this, we design a recognition algorithm based on hand gesture and finger patterns. As a representative application, we apply the proposed scheme to a software tool for controlling and editing 3-dimensional avatars to create animation in cyberspace. The proposed algorithm can also be used to control computer systems in medical treatment, games, education, and various other areas.
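As a rough illustration of matching a Leap-Motion-style hand frame against pre-defined patterns, the sketch below counts extended fingers from palm and fingertip positions and looks the count up in a gesture table. The pattern table, distance threshold, and frame layout are all assumptions made for illustration; the paper's actual patterns and the Leap SDK calls that would supply the positions are not shown.

```python
# Illustrative pattern matching on a hand frame (palm + fingertip positions),
# assumed to have already been read from the sensor into plain arrays.
import numpy as np

# Hypothetical pattern table: number of extended fingers -> command.
PATTERNS = {0: "grab", 1: "point/select", 2: "rotate", 5: "release"}

def count_extended_fingers(palm_xyz, fingertip_xyz, threshold_mm=60.0):
    """Count fingertips farther than `threshold_mm` from the palm center."""
    palm = np.asarray(palm_xyz, dtype=float)
    tips = np.asarray(fingertip_xyz, dtype=float)   # shape (5, 3), in mm
    distances = np.linalg.norm(tips - palm, axis=1)
    return int(np.sum(distances > threshold_mm))

def classify_frame(palm_xyz, fingertip_xyz):
    n = count_extended_fingers(palm_xyz, fingertip_xyz)
    return PATTERNS.get(n, "no-op")

# Example frame: palm at the origin of the gesture, two fingers clearly extended.
palm = [0.0, 200.0, 0.0]
tips = [[10, 210, 5], [80, 260, 10], [85, 255, -5], [15, 205, 0], [12, 208, 3]]
print(classify_frame(palm, tips))   # -> "rotate"
```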

손의 추적과 제스쳐 인식에 의한 슬라이드 제어 (Controlling Slides using Hand tracking and Gesture Recognition)

  • ;이은주
    • 한국정보처리학회:학술대회논문집
    • /
    • 한국정보처리학회 2012년도 춘계학술발표대회
    • /
    • pp.436-439
    • /
    • 2012
  • This work controls a desktop computer based on hand gesture recognition. The paper tracks the hand and recognizes hand gestures in real time, controlling presentation slides according to hand direction, such as right and left, using a real-time camera.
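A minimal sketch of the left/right slide-control logic the abstract describes might look like the following. The window size, travel threshold, and the idea of forwarding the result as arrow-key presses are assumptions, not details from the paper.

```python
# Assumed logic, not the authors' implementation: map horizontal hand movement
# to slide commands, consuming the hand centroid x-coordinate per frame
# (e.g. from a skin-color tracker like the one sketched earlier).
from collections import deque

class SlideGestureController:
    def __init__(self, window=10, min_travel_px=120):
        self.xs = deque(maxlen=window)   # recent centroid x positions
        self.min_travel_px = min_travel_px

    def update(self, centroid_x):
        """Feed one frame's hand x position; return 'next', 'previous', or None."""
        self.xs.append(centroid_x)
        if len(self.xs) < self.xs.maxlen:
            return None
        travel = self.xs[-1] - self.xs[0]
        if travel > self.min_travel_px:      # swipe to the right
            self.xs.clear()
            return "next"                    # e.g. forwarded as a right-arrow key press
        if travel < -self.min_travel_px:     # swipe to the left
            self.xs.clear()
            return "previous"
        return None
```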

The Effects of Visual Stimulation and Body Gesture on Language Learning Achievement and Course Interest

  • CHOI, Dongyeon;KIM, Minjeong
    • Educational Technology International
    • /
    • Vol. 16, No. 2
    • /
    • pp.141-166
    • /
    • 2015
  • The purpose of this study was to examine the effects of using visual stimulation and gesture, namely embodied language learning, on learning achievement and learners' course interest in the EFL classroom. To investigate this, thirty-two third-grade elementary school students participated and were assigned to four English learning class conditions (i.e., animated graphics with gestures, animated graphics only, still pictures with gestures, and a control condition). The research questions for this study were: (1) What differences are there in post and delayed learning achievement between the gesture-imitating and non-imitating groups and between the animated-graphic and still-picture groups? (2) What differences are there in course interest between the gesture-imitating and non-imitating groups and between the animated-graphic and still-picture groups? The embodiment-based English learning system for this study was designed using Microsoft's Kinect sensing devices. The results revealed that students in the gesture-imitating group memorized and retained words and sentence structures better than those in the other groups. As for the course interest measurement, the gesture-imitating group showed highly positive responses in attention, relevance, and satisfaction with the curriculum, and using animated graphics influenced satisfaction as well. This finding can be attributed to embodied cognition, which proposes that the body and the mind are inseparable in the constitution of cognition; thus students using visual stimulation and imitating related gestures regard the embodied language learning approach as more satisfactory and acceptable than conventional ones.

Study on Gesture and Voice-based Interaction in Perspective of a Presentation Support Tool

  • Ha, Sang-Ho;Park, So-Young;Hong, Hye-Soo;Kim, Nam-Hun
    • 대한인간공학회지
    • /
    • Vol. 31, No. 4
    • /
    • pp.593-599
    • /
    • 2012
  • Objective: This study aims to implement a non-contact gesture-based interface for presentation purposes and to analyze the effect of the proposed interface as an information-transfer assistive device. Background: With rapid technological growth in the UI/UX area and the appearance of smart service products that require a new human-machine interface, research on control devices using gesture recognition or speech recognition is being actively conducted. However, relatively few quantitative studies on the practical effects of the new interface type have been done, while work on system implementation is very popular. Method: The system presented in this study is implemented with the KINECT® sensor offered by Microsoft Corporation. To investigate whether the proposed system is effective as a presentation support tool, we conducted experiments by giving several lectures to 40 participants in both a traditional lecture room (keyboard-based presentation control) and a non-contact gesture-based lecture room (KINECT-based presentation control), evaluating their interest and immersion based on the contents of the lecture and the lecturing method, and analyzing their understanding of the lecture contents. Result: Using ANOVA, we examine whether the gesture-based presentation system can play an effective role as a presentation support tool depending on the difficulty of the contents. Conclusion: A non-contact gesture-based interface is a meaningful supportive device when delivering easy and simple information; however, the effect can vary with the contents and the difficulty of the information provided. Application: The results presented in this paper might help design a new human-machine (computer) interface for communication support tools.
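The statistical comparison the abstract mentions can be sketched as a one-way ANOVA per difficulty level, e.g. with SciPy. The comprehension scores below are made-up placeholders purely to show the shape of the analysis, not data from the study.

```python
# Hedged sketch of the kind of ANOVA comparison the study describes:
# comprehension scores under keyboard-based vs. KINECT-based presentation
# control, split by content difficulty. The scores are placeholders.
from scipy.stats import f_oneway

comprehension = {
    ("keyboard", "easy"): [82, 79, 88, 91, 85],
    ("kinect",   "easy"): [86, 90, 84, 92, 88],
    ("keyboard", "hard"): [71, 65, 74, 69, 72],
    ("kinect",   "hard"): [63, 68, 60, 66, 64],
}

for difficulty in ("easy", "hard"):
    groups = [comprehension[("keyboard", difficulty)],
              comprehension[("kinect", difficulty)]]
    f_stat, p_value = f_oneway(*groups)
    print(f"{difficulty}: F = {f_stat:.2f}, p = {p_value:.3f}")
```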

HandButton: Gesture Recognition of Transceiver-free Object by Using Wireless Networks

  • Zhang, Dian;Zheng, Weiling
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 10, No. 2
    • /
    • pp.787-806
    • /
    • 2016
  • Traditional radio-based gesture recognition approaches usually require the target to carry a device (e.g., an EMG sensor or an accelerometer). However, this requirement cannot be satisfied in many applications. For example, in a smart home, users want to turn the light on or off with a specific hand gesture, without finding and pressing a button, especially in a dark area; they will not carry any device in this scenario. To overcome this drawback, in this paper we propose three algorithms able to recognize the target gesture (mainly the human hand gesture) without the target carrying any device, based only on the Radio Signal Strength Indicator (RSSI). Our platform uses only 6 telosB sensor nodes with a very easy deployment. Experimental results show that the successful recognition ratio can reach around 80% in our system.
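The abstract does not detail the three proposed algorithms, so the following is only a generic illustration of device-free RSSI-based gesture recognition: windowed RSSI statistics from several sensor links are fed to a simple nearest-neighbor classifier. The features, labels, and data are all assumptions.

```python
# Generic illustration (not one of the paper's algorithms): classify gestures
# from windowed RSSI readings of several sensor links with k-NN.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

N_LINKS = 6          # e.g. RSSI links formed by six telosB nodes
WINDOW = 20          # RSSI samples per link in one gesture window

def rssi_window_features(window):
    """window: array of shape (WINDOW, N_LINKS) of RSSI values in dBm."""
    w = np.asarray(window, dtype=float)
    # Per-link mean, standard deviation, and peak-to-peak variation.
    return np.concatenate([w.mean(axis=0), w.std(axis=0), np.ptp(w, axis=0)])

# Placeholder training data: one feature vector per recorded gesture window.
rng = np.random.default_rng(0)
X_train = np.stack([rssi_window_features(rng.normal(-60, 3, (WINDOW, N_LINKS)))
                    for _ in range(40)])
y_train = rng.choice(["push", "wave", "circle"], size=40)

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

new_window = rng.normal(-60, 3, (WINDOW, N_LINKS))
print(clf.predict([rssi_window_features(new_window)]))
```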

사용자 손 제스처 인식 기반 입체 영상 제어 시스템 설계 및 구현 (Design and Implementation of a Stereoscopic Image Control System based on User Hand Gesture Recognition)

  • 송복득;이승환;최홍규;김성훈
    • 한국정보통신학회논문지
    • /
    • Vol. 26, No. 3
    • /
    • pp.396-402
    • /
    • 2022
  • User interaction for visual media is being developed in various forms, and interaction using human gestures in particular is being actively studied. Among these, hand gesture recognition based on a 3D hand model is used as a human interface in the field of realistic media. An interface based on hand gesture recognition helps users access media more easily and conveniently. Such user interaction should let viewers watch video with a fast and accurate hand gesture recognition technique, free of constraints imposed by the computing environment. This paper proposes a fast and accurate user hand gesture recognition algorithm using the open-source MediaPipe framework and the machine-learning k-NN (K-Nearest Neighbor) algorithm. In addition, to minimize constraints on the computing environment, we design and implement a stereoscopic image control system based on user hand gesture recognition, using a web service environment accessible over the Internet and Docker containers as a virtual environment.
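The recognition pipeline named in this abstract (MediaPipe hand landmarks classified with k-NN) can be sketched as follows. The gesture labels and training samples are placeholders, since the paper's own label set and data are not given here.

```python
# Sketch of the described pipeline: MediaPipe Hands extracts 21 hand landmarks
# per frame, and a k-NN classifier maps the flattened landmark vector to a
# gesture label. Labels and training data below are placeholders.
import numpy as np
import mediapipe as mp
from sklearn.neighbors import KNeighborsClassifier

mp_hands = mp.solutions.hands

def landmarks_to_vector(hand_landmarks):
    """Flatten the 21 (x, y, z) landmarks into a 63-dimensional feature vector."""
    return np.array([[lm.x, lm.y, lm.z] for lm in hand_landmarks.landmark]).ravel()

def recognize(rgb_frame, knn):
    """Return a gesture label for one RGB frame, or None if no hand is found."""
    with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        results = hands.process(rgb_frame)
    if not results.multi_hand_landmarks:
        return None
    features = landmarks_to_vector(results.multi_hand_landmarks[0])
    return knn.predict([features])[0]

# Training: feature vectors collected offline for each gesture (placeholder data).
X_train = np.random.rand(60, 63)                      # 60 recorded samples
y_train = np.random.choice(["play", "pause", "zoom"], size=60)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
# recognize(frame_rgb, knn) would then return one of the labels for a captured frame.
```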

제스처 및 음성 인식을 이용한 윈도우 시스템 제어에 관한 연구 (Study about Windows System Control Using Gesture and Speech Recognition)

  • 김주홍;진성일;이남호;이용범
    • 대한전자공학회:학술대회논문집
    • /
    • 대한전자공학회 1998년도 추계종합학술대회 논문집
    • /
    • pp.1289-1292
    • /
    • 1998
  • HCI (human-computer interface) technologies have often been implemented using a mouse, keyboard, and joystick. Because the mouse and keyboard can only be used in limited situations, more natural HCI methods, such as speech-based and gesture-based methods, have recently attracted wide attention. In this paper, we present a multi-modal input system to control the Windows system for practical use of a multimedia computer. Our multi-modal input system consists of three parts. The first is a virtual-hand mouse, which replaces mouse control with a set of gestures. The second is Windows control using speech recognition. The third is Windows control using gesture recognition. We introduce neural network and HMM methods to recognize speech and gestures. The results of the three parts are interfaced directly to the CPU and through Windows.

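The abstract mentions HMM methods for recognizing gestures but gives no model details, so the sketch below only illustrates the general idea: each Windows command gets a small discrete HMM over quantized hand-motion symbols, and an observed sequence is assigned to the command whose model scores it highest via the forward algorithm. All parameters and the symbol alphabet are assumptions.

```python
# Illustrative discrete-HMM recognition of quantized hand-motion symbols.
# Model parameters and the symbol alphabet are assumed, not the paper's.
import numpy as np

SYMBOLS = {"right": 0, "left": 1, "up": 2, "down": 3}

def forward_log_likelihood(obs, start, trans, emit):
    """Log P(obs | model) for a discrete HMM via the scaled forward algorithm."""
    alpha = start * emit[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik

# Two assumed 2-state models: a "maximize window" gesture dominated by upward
# motion and a "minimize window" gesture dominated by downward motion.
MODELS = {
    "maximize": (np.array([0.9, 0.1]),
                 np.array([[0.8, 0.2], [0.3, 0.7]]),
                 np.array([[0.05, 0.05, 0.85, 0.05],    # state 0 emits mostly "up"
                           [0.25, 0.25, 0.25, 0.25]])), # state 1 is noisy
    "minimize": (np.array([0.9, 0.1]),
                 np.array([[0.8, 0.2], [0.3, 0.7]]),
                 np.array([[0.05, 0.05, 0.05, 0.85],    # state 0 emits mostly "down"
                           [0.25, 0.25, 0.25, 0.25]])),
}

def recognize(symbol_names):
    obs = [SYMBOLS[s] for s in symbol_names]
    scores = {cmd: forward_log_likelihood(obs, *params) for cmd, params in MODELS.items()}
    return max(scores, key=scores.get)

print(recognize(["up", "up", "left", "up", "up"]))   # -> "maximize"
```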