• Title/Summary/Keyword: gesture control

187 search results

A Study on Tactile and Gestural Controls of Driver Interfaces for In-Vehicle Systems (차량내 시스템에 대한 접촉 및 제스처 방식의 운전자 인터페이스에 관한 연구)

  • Shim, Ji-Sung;Lee, Sang Hun
    • Korean Journal of Computational Design and Engineering / v.21 no.1 / pp.42-50 / 2016
  • Traditional tactile controls, such as push buttons and rotary switches, can cause significant visual and biomechanical distraction when they are located away from the driver's line of sight and hand position, for example on the center console. Gestural controls, as an alternative to traditional controls, are natural and can reduce visual distraction; however, the available gesture types and their number are limited, and they provide no feedback. To overcome these problems, a driver interface combining gestures with visual feedback on a head-up display has been proposed recently. In this paper, we investigated the effect of this type of interface in terms of driving performance measures. Human-in-the-loop experiments were conducted in a driving simulator using both the traditional tactile interface and the new gesture-based interface. The experimental results showed that the new interface caused less visual distraction, better gap control between the ego and target vehicles, and better recognition of road conditions compared to the traditional one.

A Gesture Interface based on Hologram and Haptics Environments for Interactive and Immersive Experiences (상호작용과 몰입 향상을 위한 홀로그램과 햅틱 환경 기반의 동작 인터페이스)

  • Pyun, Hae-Gul;An, Haeng-A;Yuk, Seongmin;Park, Jinho
    • Journal of Korea Game Society / v.15 no.1 / pp.27-34 / 2015
  • This paper proposes a user interface that enhances immersiveness and usability by combining a hologram and a haptic device with the common Leap Motion controller. While the Leap Motion delivers the physical motion of the user's hand to control a virtual environment, it is limited to manipulating virtual hands on a screen and interacts with the virtual environment in only one direction. In our system, a hologram is coupled with the Leap Motion to improve immersiveness by placing the real and virtual hands in the same location. Moreover, we provide a prototype of touch interaction by designing a haptic device that conveys the sense of touch in the virtual environment to the user's hand.
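The coupling described in this abstract can be sketched as a simple per-frame loop: a tracked hand position drives a co-located virtual hand, and contact with a virtual object triggers a haptic pulse. This is a hypothetical illustration; the tracker sample, object position, and `haptic_queue` stand in for the Leap Motion and the authors' haptic device, whose actual APIs are not described in the abstract.

```python
# Hypothetical sketch of the hand-tracking / haptic feedback loop.
# Positions are 3D tuples in arbitrary units; the contact radius is assumed.

def check_touch(hand_pos, obj_pos, radius=0.05):
    """Return True when the tracked hand is within `radius` of the object."""
    return sum((h - o) ** 2 for h, o in zip(hand_pos, obj_pos)) <= radius ** 2

def step(tracker_sample, obj_pos, haptic_queue):
    """One frame: mirror the real hand into the scene (real and virtual
    hands co-located, as in the hologram setup) and queue a haptic pulse
    on contact, closing the feedback loop."""
    virtual_hand = tracker_sample  # 1:1 mapping, no screen offset
    if check_touch(virtual_hand, obj_pos):
        haptic_queue.append("pulse")
    return virtual_hand
```

In a real system the queue would be drained by the haptic device driver; here it only records that a pulse would be sent.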

Verification Process for Stable Human Detection and Tracking (안정적 사람 검출 및 추적을 위한 검증 프로세스)

  • Ahn, Jung-Ho;Choi, Jong-Ho
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.4 no.3 / pp.202-208 / 2011
  • Recently, technologies that control computer systems through human-computer interaction (HCI) have been widely studied. Their applications usually locate the user's position via face detection and recognize the user's gestures, but face detection performance is often not good enough. When an application cannot locate the user's position stably, user interface performance, such as gesture recognition, degrades significantly. In this paper, we propose a new stable face detection algorithm that uses skin color detection and the cumulative distribution of face detection results; its effectiveness was verified by experiments. The proposed algorithm is also applicable to human tracking based on correspondence matrix analysis.
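The verification idea of accumulating face detection results over time can be sketched as a small class: per-frame detector outcomes are pooled over a sliding window, and a face is accepted only when detections are consistent. This is a minimal illustration of the cumulative-evidence step; the window size and acceptance threshold are assumptions, not the paper's values.

```python
from collections import deque

class FaceVerifier:
    """Accumulate per-frame face-detection outcomes and report a stable
    detection only when recent frames agree often enough. A simplified
    sketch; window and threshold values are illustrative assumptions."""

    def __init__(self, window=10, threshold=0.6):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def update(self, detected: bool) -> bool:
        """Record one frame's detector outcome; return True when the
        cumulative detection rate over the window passes the threshold."""
        self.history.append(1 if detected else 0)
        return sum(self.history) / len(self.history) >= self.threshold
```

In the full pipeline, a skin-color mask would first gate where the face detector runs; the verifier then suppresses the detector's frame-to-frame instability.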

Implementation of communication system using signals originating from facial muscle constructions

  • Kim, EungSoo;Eum, TaeWan
    • International Journal of Fuzzy Logic and Intelligent Systems / v.4 no.2 / pp.217-222 / 2004
  • People communicate with each other using language. Disabled people, however, may be unable to communicate their ideas through writing or gesture. We implemented a communication system based on EEG signals so that disabled people can communicate. After feature extraction from EEG signals that include facial muscle activity, the facial muscle signals are converted into control signals, which allow the user to select characters and communicate ideas.
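One common way to turn a single muscle-derived control signal into character selection is a scanning interface: a cursor steps through the character set, and a supra-threshold muscle burst selects the current character. The sketch below illustrates that pattern; the feature (mean absolute amplitude), the threshold, and the character set are placeholders, not the paper's actual method.

```python
# Scanning speller sketch driven by a one-dimensional muscle-signal feature.
# All numeric values are illustrative assumptions.
CHARSET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def extract_feature(samples):
    """Placeholder feature: mean absolute amplitude of one signal window."""
    return sum(abs(s) for s in samples) / len(samples)

def scan_select(windows, threshold=50.0):
    """Advance a cursor over CHARSET once per signal window; a window
    whose feature exceeds the threshold selects the current character."""
    text, cursor = [], 0
    for w in windows:
        if extract_feature(w) >= threshold:
            text.append(CHARSET[cursor])
        cursor = (cursor + 1) % len(CHARSET)
    return "".join(text)
```

A real system would add debouncing and artifact rejection before the threshold test; this sketch only shows the signal-to-character mapping.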

An Experimental Research on the Usability of Indirect Control using Finger Gesture Interaction in Three Dimensional Space (3차원 공간에서 손가락 제스쳐 인터랙션을 이용한 간접제어의 사용성에 관한 실험연구)

  • Ham, Kyung Sun;Lee, Dahye;Hong, Hee Jung;Park, Sungjae;Kim, Jinwoo
    • The Journal of the Korea Contents Association / v.14 no.11 / pp.519-532 / 2014
  • Emerging technologies for natural computer interaction can give manufacturers new opportunities for product innovation. This paper studies a method of human communication based on finger gesture interaction. Since technological advances have been rapid over the last few decades, products and services utilizing such interaction will soon be widespread. The questions addressed by this experiment are as follows: what is the usefulness of gesture interaction, and what is its cognitive impact on users? The finger gesture interactions studied consist of poking, picking, and grasping. By measuring the usability of each in 2D and 3D space, this study shows the effect of finger gesture interaction. The 2D and 3D experimental tools were developed using Leap Motion technology. The experiments, involving 48 subjects, show no difference in usability between the gestures in 2D space, but a meaningful difference was found in 3D space. In addition, all gestures showed better usability in 2D space than in 3D space. Notably, using a single finger worked better than using multiple fingers.

Development of Hand Recognition Interface for Interactive Digital Signage (인터렉티브 디지털 사이니지를 위한 손 인식 인터페이스 개발)

  • Lee, Jung-Wun;Cha, Kyung-Ae;Ryu, Jeong-Tak
    • Journal of Korea Society of Industrial Information Systems / v.22 no.3 / pp.1-11 / 2017
  • There is growing interest in motion recognition for recognizing human motion in camera images. As a result, research is being actively conducted on controlling digital devices with gestures at a distance. Gesture-based interfaces can be used effectively in the digital signage industry, where advertisements are expected to reach the public in various places. Since digital signage content can be controlled easily through non-contact hand motions, it is possible to provide advertising information of interest to a large number of people, thereby creating opportunities that lead to sales. Therefore, we propose a digital signage content control system based on hand movement at a certain distance, which can be used effectively in the development of interactive advertising media.

Design and Evaluation of a Hand-held Device for Recognizing Mid-air Hand Gestures (공중 손동작 인식을 위한 핸드 헬드형 기기의 설계 및 평가)

  • Seo, Kyeongeun;Cho, Hyeonjoong
    • KIPS Transactions on Software and Data Engineering / v.4 no.2 / pp.91-96 / 2015
  • We propose AirPincher, a handheld pointing device that recognizes delicate mid-air hand gestures to control a remote display. AirPincher is designed to overcome the disadvantages of the two existing kinds of hand-gesture-aware techniques: glove-based and vision-based. Glove-based techniques burden the user with wearing gloves every time, while vision-based techniques make performance dependent on the distance between the user and the remote display. AirPincher allows a user to hold the device in one hand and generate several delicate finger gestures, which are captured by several sensors embedded in the device at close range. These features help AirPincher avoid the aforementioned disadvantages of the existing techniques. We experimentally determine an efficient size for the virtual input space and evaluate two types of pointing interfaces with AirPincher for a remote display. Our experiments suggest appropriate configurations for using the proposed device.
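The notion of a bounded virtual input space for remote pointing can be sketched as relative sensor motion mapped into a clamped 2D region. This is a hypothetical illustration only; the space dimensions, gain, and sensor interface are invented for the example and are not AirPincher's actual parameters.

```python
# Hypothetical relative-pointing sketch: sensor displacements move a cursor
# inside a fixed virtual input space. All constants are assumptions.

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

class VirtualPointer:
    def __init__(self, width=640, height=360, gain=1.5):
        self.w, self.h, self.gain = width, height, gain
        self.x, self.y = width / 2, height / 2  # start at the center

    def move(self, dx, dy):
        """Apply one relative sensor displacement, keeping the cursor
        inside the virtual input space."""
        self.x = clamp(self.x + self.gain * dx, 0, self.w)
        self.y = clamp(self.y + self.gain * dy, 0, self.h)
        return self.x, self.y
```

The size of this space matters because a small space makes pointing fast but coarse, while a large space makes it precise but slow, which is the trade-off the paper's experiment quantifies.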

Autonomous Mobile Robot Control using the Wearable Devices Based on EMG Signal for detecting fire (EMG 신호 기반의 웨어러블 기기를 통한 화재감지 자율 주행 로봇 제어)

  • Kim, Jin-Woo;Lee, Woo-Young;Yu, Je-Hun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.26 no.3 / pp.176-181 / 2016
  • In this paper, an autonomous mobile robot control system for detecting fire is proposed, using a wearable device based on EMG (electromyogram) signals. A Myo armband is used to detect the user's EMG signal. Gestures are classified after the EMG data are sent to a computer over Bluetooth. In our experiment, the robot, named 'uBrain', was implemented to move according to the data received over Bluetooth. 'Move front', 'Turn right', 'Turn left', and 'Stop' are the controllable commands for the robot. If the robot cannot receive the Bluetooth signal from the user, or if the user wants to switch from manual to autonomous mode, the robot enters autonomous mode. The robot flashes an LED when its IR sensor detects fire while moving.
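The control logic described above reduces to a small decision function: a classified gesture label maps to one of the four robot commands, and loss of the Bluetooth link (or a user request) falls back to autonomous mode. The command strings come from the abstract; the gesture label names are assumptions, since the abstract does not list the Myo gestures used.

```python
# Gesture-to-command mapping sketch. Command strings are from the paper's
# abstract; the gesture labels are assumed, not the paper's exact classes.
GESTURE_TO_COMMAND = {
    "fist": "Stop",
    "wave_left": "Turn left",
    "wave_right": "Turn right",
    "fingers_spread": "Move front",
}

def decide_command(gesture, link_alive=True, manual=True):
    """Return the robot command for a classified gesture, or switch to
    autonomous mode when the Bluetooth link drops or the user releases
    manual control. Unknown gestures default to a safe 'Stop'."""
    if not link_alive or not manual:
        return "Autonomous"
    return GESTURE_TO_COMMAND.get(gesture, "Stop")
```

Defaulting unknown gestures to 'Stop' is a design choice for safety; a real controller might instead keep the last valid command.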

Visual Touchless User Interface for Window Manipulation (윈도우 제어를 위한 시각적 비접촉 사용자 인터페이스)

  • Kim, Jin-Woo;Jung, Kyung-Boo;Jeong, Seung-Do;Choi, Byung-Uk
    • Journal of KIISE: Software and Applications / v.36 no.6 / pp.471-478 / 2009
  • Recently, user interface research has progressed remarkably due to the explosive growth of 3-dimensional content and applications and the widening base of computer users. This paper proposes a novel method to manipulate windows efficiently using only intuitive hand motions. Previous methods have drawbacks such as the burden of expensive devices, the high complexity of gesture recognition, and the need for additional marker information. To address these shortcomings, we propose a novel visual touchless interface. First, we detect the hand region using the hue channel of the HSV color space. The distance transform is applied to find the centroid of the hand, and the curvature of the hand contour is used to locate the fingertips. Finally, using the hand motion information, we recognize the hand gesture as one of seven predefined motions; the recognized gesture becomes a command to control a window. Because the method adopts a stereo camera, the user can manipulate windows with a sense of depth in the real environment. Intuitive manipulation is also possible because the method supports visual touch of the virtual object the user wants to manipulate, using only simple hand motions. Finally, the efficiency of the proposed method is verified via an application based on the proposed interface.
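The palm-centroid step above can be illustrated with a tiny, pure-Python distance transform: among hand pixels in a binary mask, the one farthest from any background pixel is taken as the palm center. This brute-force version is only for clarity; a real implementation would use an optimized routine such as OpenCV's `distanceTransform`, and the fingertip step would then examine contour curvature.

```python
# Brute-force distance-transform palm-center sketch over a binary mask.
# O(n^2) in pixel count; illustrative only, not production code.

def palm_center(mask):
    """mask: 2D list of 0/1 (1 = hand). Return the (row, col) of the hand
    pixel with the greatest squared distance to the nearest background
    pixel, i.e. the deepest interior point, used as the palm centroid."""
    h, w = len(mask), len(mask[0])
    background = [(r, c) for r in range(h) for c in range(w) if mask[r][c] == 0]
    best, best_d = None, -1.0
    for r in range(h):
        for c in range(w):
            if mask[r][c] == 0:
                continue
            d = min((r - br) ** 2 + (c - bc) ** 2 for br, bc in background)
            if d > best_d:
                best_d, best = d, (r, c)
    return best
```

The deepest interior point is preferred over the mask's mean position because the mean is pulled toward the fingers and forearm, while the distance-transform maximum stays inside the palm.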

AdaBoost-based Gesture Recognition Using Time Interval Window Applied Global and Local Feature Vectors with Mono Camera (모노 카메라 영상기반 시간 간격 윈도우를 이용한 광역 및 지역 특징 벡터 적용 AdaBoost기반 제스처 인식)

  • Hwang, Seung-Jun;Ko, Ha-Yoon;Baek, Joong-Hwan
    • Journal of the Korea Institute of Information and Communication Engineering / v.22 no.3 / pp.471-479 / 2018
  • Recently, smart TVs and Android/iOS-based set-top boxes have become common. This paper proposes a new approach to controlling a TV using gestures, moving away from the era of remote controls. The AdaBoost algorithm is applied to gesture recognition using a mono camera. First, we use Camshift-based body tracking and an estimation algorithm based on Gaussian background removal for body coordinate extraction. Using global and local feature vectors, we recognize gestures with speed changes. By tracking the time-interval trajectories of the hand and wrist, the AdaBoost algorithm with the CART algorithm is used to train and classify gestures. The principal feature vectors with high classification success rates are selected using the CART algorithm. As a result, 24 optimal feature vectors were found, which showed a lower error rate (3.73%) and higher accuracy (95.17%) than the existing algorithm.
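The AdaBoost-with-CART training loop can be sketched from scratch for the binary case, using depth-1 CART trees (decision stumps) as weak learners: each round fits the best stump under the current sample weights, then re-weights misclassified samples upward. The toy feature vectors in the test are illustrative; they are not the paper's 24 selected trajectory features.

```python
import math

# From-scratch binary AdaBoost with decision stumps (depth-1 CART trees).
# X: list of feature vectors, y: labels in {-1, +1}. Illustrative sketch.

def train_stump(X, y, w):
    """Exhaustively pick the (feature, threshold, polarity) stump that
    minimizes the weighted classification error under weights w."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (pol if xi[j] >= t else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, j, t, pol)
    return best

def adaboost(X, y, rounds=10):
    """Train an ensemble of weighted stumps via the AdaBoost re-weighting loop."""
    n = len(X)
    w = [1.0 / n] * n
    model = []
    for _ in range(rounds):
        err, j, t, pol = train_stump(X, y, w)
        err = max(err, 1e-10)  # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        model.append((alpha, j, t, pol))
        # Increase weights of misclassified samples, decrease the rest.
        w = [wi * math.exp(-alpha * yi * (pol if xi[j] >= t else -pol))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return model

def predict(model, x):
    """Sign of the alpha-weighted vote of all stumps."""
    score = sum(a * (pol if x[j] >= t else -pol) for a, j, t, pol in model)
    return 1 if score >= 0 else -1
```

The paper's multi-class gesture setting would wrap such binary learners (e.g. one-vs-rest) and use deeper CART trees for feature selection; this sketch shows only the core boosting mechanics.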