• Title/Summary/Keyword: Gesture Control


Hand Shape Classification using Contour Distribution (윤곽 분포를 이용한 이미지 기반의 손모양 인식 기술)

  • Lee, Changmin;Kim, DaeEun
    • Journal of Institute of Control, Robotics and Systems, v.20 no.6, pp.593-598, 2014
  • Vision-based hand gesture recognition is a challenging task in human-robot interaction. The sign language of finger-spelling alphabets has been tested as a kind of hand gesture. In this paper, we test hand gesture recognition by detecting the contour shape and orientation of the hand in a visual image. The method has three stages: the first stage separates the hand component from the background image, the second stage extracts the contour feature over the hand component, and the last stage compares the feature with the reference features in the database. Finger-spelling alphabets are used to verify the performance of our system, and our method discriminates the finger alphabets well.
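The three-stage pipeline above can be sketched with a hypothetical centroid-distance contour signature; the function names, the 32-bin resampling, and the nearest-neighbour matching are illustrative assumptions, not the paper's actual feature set:

```python
import numpy as np

def contour_signature(points, n_bins=32):
    """Centroid-distance signature: distances from the contour centroid,
    resampled to a fixed length and normalized for scale invariance."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    d = np.linalg.norm(pts - centroid, axis=1)
    # Resample to n_bins by linear interpolation over the contour index.
    idx = np.linspace(0, len(d) - 1, n_bins)
    sig = np.interp(idx, np.arange(len(d)), d)
    return sig / (sig.max() + 1e-9)

def classify(sig, references):
    """Return the label of the nearest reference signature (Euclidean)."""
    return min(references, key=lambda lbl: np.linalg.norm(sig - references[lbl]))
```

A noisy circular contour, for instance, should still match a stored "circle" reference more closely than a "square" one.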

Gesture Input as an Out-of-band Channel

  • Chagnaadorj, Oyuntungalag;Tanaka, Jiro
    • Journal of Information Processing Systems, v.10 no.1, pp.92-102, 2014
  • In recent years, there has been growing interest in secure pairing, which refers to the establishment of a secure communication channel between two mobile devices. A number of out-of-band (OOB) channels, through which authentication data can be transferred under a user's control and involvement, have been described. However, none has become widely used, due to a lack of adaptability to the variety of mobile devices. In this paper, we introduce a new OOB channel that uses accelerometer-based gesture input. The gesture-based OOB channel is suitable for all kinds of mobile devices, including input/output-constrained devices, as the accelerometer is small and incurs only a small computational overhead. We implemented and evaluated the channel using an Apple iPhone handset. The results demonstrate that the channel is viable, with completion times and error rates comparable to those of other OOB channels.

Human Gesture Recognition Technology Based on User Experience for Multimedia Contents Control (멀티미디어 콘텐츠 제어를 위한 사용자 경험 기반 동작 인식 기술)

  • Kim, Yun-Sik;Park, Sang-Yun;Ok, Soo-Yol;Lee, Suk-Hwan;Lee, Eung-Joo
    • Journal of Korea Multimedia Society, v.15 no.10, pp.1196-1204, 2012
  • In this paper, a series of algorithms is proposed for controlling various kinds of multimedia contents and realizing interaction between human and computer using a single input device. We first present human gesture recognition based on an NUI. Since the raw camera image is not well suited for further processing, we transform it to the YCbCr color space, and a morphological processing algorithm is then used to remove noise. Boundary energy and depth information are extracted for hand detection. After the hand is detected, a PCA algorithm is used to recognize the hand posture, and the difference image and moment method are used to detect the hand centroid and extract the trajectory of the hand movement. Eight direction codes are defined to quantize the gesture trajectory into symbol values, and an HMM is then used for hand gesture recognition based on those symbols. With this series of methods, multimedia contents can be controlled through human gesture recognition. In extensive experiments, the proposed algorithms show satisfactory performance: the hand detection rate reaches 94.25%, the gesture recognition rate exceeds 92.6%, the hand posture recognition rate reaches 85.86%, and the face detection rate reaches 89.58%. With these results, many kinds of multimedia contents on a computer, such as a video player, MP3 player, and e-book, can be controlled effectively.
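The skin-segmentation step in YCbCr space can be sketched as follows; the BT.601 RGB-to-YCbCr conversion is standard, but the chrominance thresholds (Cb in [77, 127], Cr in [133, 173]) are commonly cited illustrative values, not necessarily those used in the paper:

```python
import numpy as np

def skin_mask(rgb):
    """Hypothetical skin segmentation: convert an RGB image (H x W x 3,
    uint8) to YCbCr per ITU-R BT.601 and threshold the chrominance."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    # Luminance is deliberately ignored, which gives some robustness
    # to lighting changes; only Cb/Cr are thresholded.
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)
```

Morphological opening/closing would then be applied to the boolean mask to remove the noise mentioned in the abstract.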

The Development of a Real-Time Hand Gestures Recognition System Using Infrared Images (적외선 영상을 이용한 실시간 손동작 인식 장치 개발)

  • Ji, Seong Cheol;Kang, Sun Woo;Kim, Joon Seek;Joo, Hyonam
    • Journal of Institute of Control, Robotics and Systems, v.21 no.12, pp.1100-1108, 2015
  • A camera-based real-time hand posture and gesture recognition system is proposed for controlling various devices inside automobiles. It uses an imaging system composed of a camera with a proper filter and an infrared lighting device to acquire images of hand-motion sequences. Several pre-processing steps are applied, followed by a background normalization process, before segmenting the hand from the background. The hand posture is determined by first separating the fingers from the main body of the hand and then finding the relative positions of the fingers with respect to the center of the hand. The beginning and end of the hand motion are detected from the sequence of acquired images using pre-defined motion rules, which triggers the gesture recognition. A set of carefully designed features is computed from the raw sequence and fed into a decision-tree-like decision rule to determine the hand gesture. Many experiments were performed to verify the system. In this paper, we report results on 550 hand-motion image sequences collected from five individuals, chosen to cover the variations among users of the system in a real-time environment. Of these, 539 sequences were correctly recognized, a recognition rate of 98%.
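The finger-separation and decision-tree-like posture rule can be sketched as below; the 1.5x palm-radius test for an "extended" finger and the posture labels are illustrative assumptions, not the paper's actual rules:

```python
def classify_posture(finger_dists, palm_radius):
    """Rule-based posture decision resembling a small decision tree:
    a finger counts as extended if its tip lies well outside the palm.

    finger_dists: distances of fingertip candidates from the palm center.
    palm_radius:  radius of the segmented palm region.
    """
    extended = sum(d > 1.5 * palm_radius for d in finger_dists)
    if extended == 0:
        return "fist"
    if extended == 1:
        return "point"
    if extended >= 4:
        return "open_hand"
    return "other"
```

Thresholding relative to the palm radius rather than in pixels keeps the rule roughly invariant to the hand's distance from the camera.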

Intuitive Controller based on G-Sensor for Flying Drone (비행 드론을 위한 G-센서 기반의 직관적 제어기)

  • Shin, Pan-Seop;Kim, Sun-Kyung;Kim, Jung-Min
    • Journal of Digital Convergence, v.12 no.1, pp.319-324, 2014
  • In recent years, high-performance flying drones have attracted the attention of many people. In particular, multi-rotor drones are expanding their range of applications to video imaging, aerial rescue, logistics, monitoring, measurement, the military field, and more. However, the control functions of their controllers are very simple. In this study, we implement an enhanced controller that uses a G-sensor mounted on a mobile device to control a flying drone through intuitive user gestures. The implemented controller improves gesture recognition performance using a neural network algorithm.
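The mapping from G-sensor readings to flight commands, before any neural-network refinement, might be sketched as a tilt-to-command rule; the axis convention, the dead zone, and the command names are assumptions for illustration:

```python
import math

def tilt_to_command(ax, ay, az, dead_zone_deg=10.0):
    """Map a G-sensor (accelerometer) reading in m/s^2 to a drone command.
    Assumes the phone lies roughly flat, so gravity dominates az."""
    roll = math.degrees(math.atan2(ax, az))    # tilt left/right
    pitch = math.degrees(math.atan2(ay, az))   # tilt forward/backward
    # Small tilts inside the dead zone are treated as "hold position".
    if abs(roll) < dead_zone_deg and abs(pitch) < dead_zone_deg:
        return "hover"
    if abs(roll) > abs(pitch):
        return "right" if roll > 0 else "left"
    return "forward" if pitch > 0 else "backward"
```

A neural network, as in the paper, would replace such hand-set thresholds with boundaries learned from recorded user gestures.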

Kinect-based Motion Recognition Model for the 3D Contents Control (3D 콘텐츠 제어를 위한 키넥트 기반의 동작 인식 모델)

  • Choi, Han Suk
    • The Journal of the Korea Contents Association, v.14 no.1, pp.24-29, 2014
  • This paper proposes a Kinect-based human motion recognition model for 3D contents control that tracks the body gesture through the infrared Kinect camera. The proposed human motion model computes the distance variation of body movements from the shoulder to the left and right hands, wrists, arms, and elbows. The motions are classified into movement commands such as left, right, up, down, enlarge, shrink, and select. The proposed Kinect-based motion recognition model is very natural and low-cost compared with contact-type gesture recognition technologies and device-based gesture technologies that require expensive hardware.
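The classification of a tracked hand displacement into the movement commands above can be sketched as a simple rule; the 0.2 m threshold and the camera-space coordinate convention (x right, y up, in metres) are assumptions, not values from the paper:

```python
def classify_motion(hand_start, hand_end, threshold=0.2):
    """Classify a hand displacement between two tracked positions
    into a directional command; small displacements mean 'select'."""
    dx = hand_end[0] - hand_start[0]
    dy = hand_end[1] - hand_start[1]
    if max(abs(dx), abs(dy)) < threshold:
        return "select"
    # The dominant axis of the displacement decides the direction.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```

Enlarge/shrink would be detected analogously from the change in distance between the two hands rather than from a single hand's displacement.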

Combining Object Detection and Hand Gesture Recognition for Automatic Lighting System Control

  • Pham, Giao N.;Nguyen, Phong H.;Kwon, Ki-Ryong
    • Journal of Multimedia Information System, v.6 no.4, pp.329-332, 2019
  • Recently, smart lighting systems have combined sensors and lights. These systems turn lights on/off and adjust their brightness based on the motion of objects and the brightness of the environment, and are often applied in places such as buildings, rooms, garages, and parking lots. However, such lighting systems rely on separate lighting sensors and motion sensors for measuring illumination and detecting motion. In this paper, we propose an automatic lighting control system that uses a single camera for buildings, rooms, and garages. The proposed system integrates the results of digital image processing, namely motion detection and hand gesture detection, to switch and dim the lighting. The experimental results show that the proposed system works well and could be applied to the automatic lighting of such spaces.
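The motion-detection half of the pipeline could be sketched as frame differencing on the single camera's grayscale frames; the pixel and area thresholds are illustrative assumptions:

```python
import numpy as np

def motion_detected(prev, curr, pixel_thresh=25, area_frac=0.01):
    """Frame differencing on two grayscale frames (uint8 arrays):
    motion is present when the fraction of pixels whose intensity
    changed by more than pixel_thresh exceeds area_frac."""
    diff = np.abs(curr.astype(int) - prev.astype(int))
    changed = (diff > pixel_thresh).mean()
    return bool(changed > area_frac)
```

The same camera feed would then be passed to the hand-gesture detector to decide whether to switch or dim the lights.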

A Controlled Study of Interactive Exhibit based on Gesture Image Recognition (제스처 영상 인식기반의 인터렉티브 전시용 제어기술 연구)

  • Cha, Jaesang;Kang, Joonsang;Rho, Jung-Kyu;Choi, Jungwon;Koo, Eunja
    • Journal of Satellite, Information and Communications, v.9 no.1, pp.1-5, 2014
  • Recently, buildings have rapidly become more intelligent with the development of industry, and people seek comfort, efficiency, and convenience in office and living environments. A wide variety of devices is now available; smart TVs and smartphones are widely distributed, so interest in interaction between devices and humans has increased. Various interaction methods have been studied, but using a physical controller brings some discomfort and limitations. In this paper, a user can easily interact with and control LEDs through a Kinect and hand gestures, without a controller. We designed an interface that controls LEDs using the joint information obtained from the Kinect. With the implemented interface, a user can control individual LEDs through hand gestures. We expect the developed interface to be useful for LED control and in various other fields.
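A minimal rule on Kinect joint coordinates might look like the following; the joint names, thresholds, and command names are illustrative assumptions (coordinates in metres, y increasing upward), not the interface the authors implemented:

```python
def led_command(joints):
    """Map Kinect joint positions (dict of (x, y) tuples) to an LED
    command: hand above the head toggles, hand far to the side brightens."""
    hand = joints["right_hand"]
    if hand[1] > joints["head"][1]:
        return "toggle"
    if hand[0] - joints["right_shoulder"][0] > 0.4:
        return "brighten"
    return "none"
```

Selecting which LED to control could use the hand's horizontal position in the same way, one zone per LED.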

A Study on Smart Touch Projector System Technology Using Infrared (IR) Imaging Sensor (적외선 영상센서를 이용한 스마트 터치 프로젝터 시스템 기술 연구)

  • Lee, Kuk-Seon;Oh, Sang-Heon;Jeon, Kuk-Hui;Kang, Seong-Soo;Ryu, Dong-Hee;Kim, Byung-Gyu
    • Journal of Korea Multimedia Society, v.15 no.7, pp.870-878, 2012
  • Recently, the very rapid development of computer and sensor technologies has induced various kinds of user interface (UI) technologies based on user experience (UX). In this study, we investigate and develop a smart touch projector system technology based on an IR sensor and image processing. In the proposed system, a user can control the computer through control events derived from the gestures of an IR pen used as an input device. From the IR image, we extract the movement (gesture) of the devised pen and track it to recognize gesture patterns. Also, to correct the error between the coordinates of the input image sensor and the display device (projector), we propose a coordinate-correction algorithm that improves the accuracy of operation. With this system technology, as a next-generation human-computer interaction, the events of the equipped computer can be controlled on the projected screen without manipulating the computer directly.
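The camera-to-projector coordinate correction is commonly posed as a homography fit from a few calibration point pairs; the paper does not state its algorithm, so the direct linear transform (DLT) below is a standard sketch, not the authors' method:

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 homography mapping camera coordinates to
    projector coordinates from >= 4 point correspondences (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (last row of V in the SVD).
    _, _, V = np.linalg.svd(np.asarray(A, dtype=float))
    return V[-1].reshape(3, 3)

def apply_h(H, pt):
    """Map one point through the homography (perspective division)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[0] / p[2], p[1] / p[2]
```

In practice the four correspondences come from the user touching projected calibration markers with the IR pen.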

Remote Control System using Face and Gesture Recognition based on Deep Learning (딥러닝 기반의 얼굴과 제스처 인식을 활용한 원격 제어)

  • Hwang, Kitae;Lee, Jae-Moon;Jung, Inhwan
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.20 no.6, pp.115-121, 2020
  • With the spread of IoT technology, various IoT applications using facial recognition are emerging. This paper describes the design and implementation of a remote control system using deep-learning-based face recognition and hand gesture recognition. In general, an application system using face recognition consists of a part that captures images in real time from a camera, a part that recognizes faces in the images, and a part that uses the recognized result. A Raspberry Pi, a single-board computer that can be mounted anywhere, was used to capture images in real time; the face recognition software was developed using TensorFlow's FaceNet model on a server computer, and the hand gesture recognition software was developed using OpenCV. We classified users into three groups, known users, danger users, and unknown users, and designed and implemented an application that opens an automatic door lock only for known users who pass both face recognition and the hand gesture check.
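The three-group decision described above can be sketched as a nearest-embedding rule over enrolled users; the 2-D embeddings and the 0.8 distance threshold are toy assumptions (FaceNet embeddings are typically 128-D, with a tuned threshold):

```python
import numpy as np

def classify_user(embedding, known, danger, thresh=0.8):
    """Assign a face embedding to 'known', 'danger', or 'unknown' by the
    nearest enrolled embedding within a distance threshold."""
    def nearest(db):
        return min((np.linalg.norm(embedding - v) for v in db.values()),
                   default=float("inf"))
    if nearest(known) < thresh:
        return "known"
    if nearest(danger) < thresh:
        return "danger"
    return "unknown"
```

Only an embedding classified as "known", combined with a recognized hand gesture, would trigger the door lock to open.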