• Title/Summary/Keyword: Gesture based interface

Motion-based Controlling 4D Special Effect Devices to Activate Immersive Contents (실감형 콘텐츠 작동을 위한 모션 기반 4D 특수효과 장치 제어)

  • Kim, Kwang Jin;Lee, Chil Woo
    • Smart Media Journal / v.8 no.1 / pp.51-58 / 2019
  • This paper describes a gesture application for controlling the special-effect devices of 4D content using the PWM (Pulse Width Modulation) method. User operations recognized by an infrared sensor are interpreted as commands for 3D content control, several of which drive the devices that generate special effects, delivering physical stimuli to the user. Because the content is controlled through an NUI (Natural User Interface) technique, the user is drawn directly into an immersive experience, which provides a higher degree of interest and attention. To measure the efficiency of the proposed method, we implemented a PWM-based real-time linear control system that manages the parameters of the motion-recognition and animation controllers using the infrared sensor and transmits the events.
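
None of the abstracts in this listing include code; the following is a minimal Python sketch, not the authors' implementation, of the core idea above: mapping a normalized gesture intensity from the motion sensor onto a PWM duty cycle that drives a special-effect device. The `EffectChannel` class and the hardware stub are hypothetical.

```python
# Minimal sketch (not the paper's implementation): mapping gesture-driven
# intensity values to a PWM duty cycle for a special-effect device such as a
# fan or vibration motor. The hardware write is stubbed out; on a real board
# it would go through a GPIO/PWM driver.

class EffectChannel:
    """One PWM-controlled special-effect device (e.g., a wind fan)."""

    def __init__(self, name: str, freq_hz: int = 1000):
        self.name = name
        self.freq_hz = freq_hz
        self.duty = 0.0  # percent, 0..100

    def apply_intensity(self, intensity: float) -> None:
        """Linearly map a normalized gesture intensity (0..1) to duty cycle."""
        self.duty = max(0.0, min(1.0, intensity)) * 100.0
        self._write_hw()

    def _write_hw(self) -> None:
        # Stub: replace with a real PWM call on actual hardware.
        print(f"[{self.name}] PWM {self.freq_hz} Hz, duty {self.duty:.1f}%")

# Example: a swipe gesture whose speed modulates the wind effect in real time.
wind = EffectChannel("wind_fan")
for swipe_speed in (0.2, 0.5, 0.9):  # normalized speeds from the motion sensor
    wind.apply_intensity(swipe_speed)
```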

A Research for Interface Based on EMG Pattern Combinations of Commercial Gesture Controller (상용 제스처 컨트롤러의 근전도 패턴 조합에 따른 인터페이스 연구)

  • Kim, Ki-Chang;Kang, Min-Sung;Ji, Chang-Uk;Ha, Ji-Woo;Sun, Dong-Ik;Xue, Gang;Shin, Kyoo-Sik
    • Journal of Engineering Education Research / v.19 no.1 / pp.31-36 / 2016
  • These days, ICT-related products are pouring out due to the development of mobile technology and the spread of smartphones. Among these products, wearable devices are in the spotlight with the advent of the hyper-connected society. In this paper, a body-attached wearable device using EMG (electromyography) sensors is studied. Research on EMG sensors divides into two areas: the medical field and the control-device field. This study belongs to the latter, i.e., transmitting the user's manipulation intent to robots, games, or computers through EMG measurement. We used the commercial MYO device developed by Thalmic Labs in Canada and matched the EMG of arm muscles to the gesture controller. In the experiments, we first defined various arm motions for controlling devices, then identified several distinguishable motions through analysis of the EMG signals and used them to substitute for a joystick.
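
As a rough illustration of the EMG-matching step, here is a minimal Python sketch (assumed, not the paper's method) that classifies an 8-channel EMG window, MYO-style, against predefined gesture templates and emits the joystick command each motion substitutes for. The template values, gesture names, and command names are invented for the example.

```python
# Minimal sketch: nearest-template classification of 8-channel EMG features
# and translation of the recognized gesture into a joystick command.

import numpy as np

# Hypothetical per-channel RMS templates for three arm motions.
TEMPLATES = {
    "fist":     np.array([0.9, 0.8, 0.2, 0.1, 0.1, 0.2, 0.7, 0.8]),
    "wave_in":  np.array([0.1, 0.2, 0.8, 0.9, 0.3, 0.1, 0.1, 0.2]),
    "wave_out": np.array([0.2, 0.1, 0.1, 0.2, 0.8, 0.9, 0.3, 0.1]),
}
COMMANDS = {"fist": "BUTTON_A", "wave_in": "STICK_LEFT", "wave_out": "STICK_RIGHT"}

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square per channel over a window shaped (samples, channels)."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(window: np.ndarray) -> str:
    feats = rms_features(window)
    # Nearest template by Euclidean distance.
    return min(TEMPLATES, key=lambda g: np.linalg.norm(feats - TEMPLATES[g]))

# Example: a synthetic 200-sample window resembling the "fist" template.
window = np.random.default_rng(0).normal(0, 0.05, (200, 8)) + TEMPLATES["fist"]
gesture = classify(window)
print(gesture, "->", COMMANDS[gesture])
```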

A Study on Smart Touch Projector System Technology Using Infrared (IR) Imaging Sensor (적외선 영상센서를 이용한 스마트 터치 프로젝터 시스템 기술 연구)

  • Lee, Kuk-Seon;Oh, Sang-Heon;Jeon, Kuk-Hui;Kang, Seong-Soo;Ryu, Dong-Hee;Kim, Byung-Gyu
    • Journal of Korea Multimedia Society / v.15 no.7 / pp.870-878 / 2012
  • Recently, the rapid development of computer and sensor technologies has induced various kinds of user interface (UI) technologies based on user experience (UX). In this study, we investigate and develop a smart touch projector system based on an IR sensor and image processing. In the proposed system, a user can control the computer through control events derived from the gestures of an IR pen used as an input device. From the IR image, we extract the movement (or gesture) of the devised pen and track it to recognize the gesture pattern. Also, to correct the error between the coordinates of the input image sensor and the display device (projector), we propose a coordinate-correction algorithm that improves the accuracy of operation. Through this next-generation human-computer interaction technology, the events of the equipped computer can be controlled on the projected screen without manipulating the computer directly.
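
The paper's coordinate-correction algorithm is not reproduced in the abstract; the sketch below shows one standard way to correct camera-to-projector coordinates, a planar homography fitted from four calibration touches, which may or may not match the authors' approach. The corner values are illustrative.

```python
# Minimal sketch of homography-based coordinate correction: map IR-camera
# pen coordinates onto projector pixels using four point correspondences.

import numpy as np

def fit_homography(src_pts, dst_pts):
    """Solve the 3x3 homography H with dst ~ H @ src from 4 correspondences (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.array(A))
    return vt[-1].reshape(3, 3)  # null-space vector of A

def camera_to_projector(H, x, y):
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Calibration: the pen touches the four projected corner markers.
camera_corners    = [(102, 87), (538, 91), (545, 410), (95, 402)]
projector_corners = [(0, 0), (1280, 0), (1280, 800), (0, 800)]
H = fit_homography(camera_corners, projector_corners)
print(camera_to_projector(H, 320, 250))  # pen position -> screen coordinate
```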

Vision-based Hand Gesture Detection and Tracking System (비전 기반의 손동작 검출 및 추적 시스템)

  • Park, Ho-Sik;Bae, Cheol-Soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.12C / pp.1175-1180 / 2005
  • We present a vision-based hand gesture detection and tracking system. Most conventional hand gesture recognition systems rely on simple hand-detection methods such as background subtraction under assumed static observation conditions, which are not robust against camera motion, illumination changes, and so on. We therefore propose a statistical method that recognizes and detects hand regions in images using their geometrical structure. Our hand tracking system also employs multiple cameras to reduce occlusion problems, and non-synchronous multiple observations enhance system scalability. In our experiments, the proposed method achieved a recognition rate of 99.28%, an improvement of 3.91% over the conventional appearance-based method.
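
As a stand-in for the statistical detection idea, the following Python sketch fits a Gaussian skin-color model in illumination-normalized (r, g) space and thresholds the Mahalanobis distance, which is more robust to lighting changes than raw-RGB background subtraction. The paper's actual model and its geometrical-structure step are not reproduced; the training data here is synthetic.

```python
# Minimal sketch: Gaussian skin-color classifier in normalized r-g space.

import numpy as np

def normalized_rg(rgb: np.ndarray) -> np.ndarray:
    """Convert HxWx3 RGB to HxWx2 illumination-normalized (r, g)."""
    s = rgb.sum(axis=2, keepdims=True).clip(min=1)
    return rgb[..., :2] / s

def fit_skin_model(skin_pixels_rg: np.ndarray):
    """Mean and inverse covariance of (r, g) values from labeled hand pixels."""
    mean = skin_pixels_rg.mean(axis=0)
    cov = np.cov(skin_pixels_rg, rowvar=False)
    return mean, np.linalg.inv(cov)

def hand_mask(rgb, mean, inv_cov, thresh=9.0):
    """Pixels whose squared Mahalanobis distance to the skin model is small."""
    d = normalized_rg(rgb.astype(float)) - mean
    m2 = np.einsum("...i,ij,...j->...", d, inv_cov, d)
    return m2 < thresh

# Example with synthetic data standing in for labeled training pixels.
rng = np.random.default_rng(1)
train = rng.normal([0.45, 0.31], 0.02, (500, 2))  # assumed skin (r, g) samples
mean, inv_cov = fit_skin_model(train)
frame = rng.integers(0, 255, (120, 160, 3)).astype(np.uint8)
mask = hand_mask(frame, mean, inv_cov)
print("candidate hand pixels:", int(mask.sum()))
```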

A Study on Tactile and Gestural Controls of Driver Interfaces for In-Vehicle Systems (차량내 시스템에 대한 접촉 및 제스처 방식의 운전자 인터페이스에 관한 연구)

  • Shim, Ji-Sung;Lee, Sang Hun
    • Korean Journal of Computational Design and Engineering / v.21 no.1 / pp.42-50 / 2016
  • Traditional tactile controls, including push buttons and rotary switches, may cause significant visual and biomechanical distraction if they are located away from the driver's line of sight and hand position, for example, on the central console. Gestural controls, as an alternative, are natural and can reduce visual distraction; however, their types and numbers are limited and they provide no feedback. To overcome these problems, a driver interface combining gestures with visual feedback on a head-up display has recently been proposed. In this paper, we investigated the effect of this type of interface in terms of driving performance measures. Human-in-the-loop experiments were conducted using a driving simulator with the traditional tactile interface and the new gesture-based interface. The experimental results showed that the new interface caused less visual distraction, better gap control between the ego and target vehicles, and better recognition of road conditions compared to the traditional one.

HAND GESTURE INTERFACE FOR WEARABLE PC

  • Nishihara, Isao;Nakano, Shizuo
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.664-667 / 2009
  • There is strong demand for wearable PC systems that can support the user outdoors. When we are outdoors, our movement makes it impossible to use traditional input devices such as keyboards and mice. We propose a hand gesture interface based on image processing to operate wearable PCs. A semi-transparent PC screen is displayed on the head-mounted display (HMD), and the user makes hand gestures to select icons on the screen. The user's hand is extracted from images captured by a color camera mounted above the HMD. Since skin color can vary widely under outdoor lighting, a key problem is accurately discriminating the hand from the background. The proposed method does not assume any fixed skin-color space. First, the image is divided into blocks, and blocks with similar average color are linked. Contiguous regions are then subjected to hand recognition. Blocks on the edges of the hand region are subdivided for more accurate finger discrimination. A change in hand shape is recognized as hand movement. Our current input interface associates a hand grasp with a mouse click. Tests on a prototype system confirm that the proposed method recognizes hand gestures accurately at high speed. We intend to develop a wider range of recognizable gestures.
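
The block-linking stage described above can be sketched directly. The following Python code (details assumed; the edge-block subdivision step is omitted) averages each block's color and merges neighboring blocks with similar averages, without any fixed skin-color space.

```python
# Minimal sketch: block-average color segmentation by region growing.

import numpy as np

def block_means(img: np.ndarray, bs: int) -> np.ndarray:
    """Average color of each bs x bs block; img is HxWx3 with H, W % bs == 0."""
    h, w, _ = img.shape
    return img.reshape(h // bs, bs, w // bs, bs, 3).mean(axis=(1, 3))

def link_blocks(means: np.ndarray, tol: float) -> np.ndarray:
    """Label contiguous regions of blocks with similar mean color (BFS)."""
    rows, cols, _ = means.shape
    labels = -np.ones((rows, cols), dtype=int)
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if labels[r, c] != -1:
                continue
            stack, labels[r, c] = [(r, c)], next_label
            while stack:
                y, x = stack.pop()
                for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                    if 0 <= ny < rows and 0 <= nx < cols \
                       and labels[ny, nx] == -1 \
                       and np.linalg.norm(means[ny, nx] - means[y, x]) < tol:
                        labels[ny, nx] = next_label
                        stack.append((ny, nx))
            next_label += 1
    return labels

frame = np.random.default_rng(2).integers(0, 255, (96, 128, 3)).astype(float)
regions = link_blocks(block_means(frame, 8), tol=40.0)
print("candidate regions:", regions.max() + 1)  # each is tested for hand shape
```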

A Study of Hand Gesture Recognition for Human Computer Interface (컴퓨터 인터페이스를 위한 Hand Gesture 인식에 관한 연구)

  • Chang, Ho-Jung;Baek, Han-Wook;Chung, Chin-Hyun
    • Proceedings of the KIEE Conference / 2000.07d / pp.3041-3043 / 2000
  • The GUI (graphical user interface) has been the dominant platform for HCI (human-computer interaction). The GUI-based style of interaction has made computers simpler and easier to use. However, GUIs will not easily support the range of interaction necessary to meet users' needs for natural, intuitive, and adaptive interfaces. In this paper, we study an approach that tracks a hand in an image sequence and recognizes it in each video frame, replacing the mouse as a pointing device for virtual reality. An algorithm for real-time processing is proposed that estimates the position of the hand and segments it, considering the orientation of motion and the color distribution of the hand region.
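
As a toy illustration of combining the motion and color cues mentioned above (an assumption, not the paper's algorithm), the sketch below intersects a frame-difference mask with a coarse color test and takes the centroid of the result as the per-frame hand position driving the virtual mouse.

```python
# Minimal sketch: per-frame hand localization from motion + color cues.

import numpy as np

def hand_centroid(prev: np.ndarray, curr: np.ndarray,
                  motion_thresh=25, color_lo=(90, 40, 20)):
    """prev/curr are HxWx3 uint8 RGB frames; returns (row, col) or None."""
    motion = np.abs(curr.astype(int) - prev.astype(int)).max(axis=2) > motion_thresh
    color = np.all(curr >= np.array(color_lo, dtype=np.uint8), axis=2)
    mask = motion & color
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

rng = np.random.default_rng(3)
prev = rng.integers(0, 255, (120, 160, 3)).astype(np.uint8)
curr = prev.copy()
curr[40:60, 70:90] = (200, 120, 60)  # a moving skin-toned patch
print(hand_centroid(prev, curr))     # centroid replaces the mouse pointer
```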

Gesture Interaction Design based on User Preference for the Elastic Handheld Device

  • Yoo, Hoon Sik;Ju, Da Young
    • Journal of the Ergonomics Society of Korea / v.35 no.6 / pp.519-533 / 2016
  • Objective: This study aims to define relevant operation methods and functions by researching the value created when a handheld smart device is made of soft, flexible, jelly-like material. Background: New technologies and materials transform interface types and operation methods. Recently, the importance of research on the Organic User Interface (OUI), which studies the value of new input and output methods adopting soft and flexible materials in various instruments, has increased. Method: Based on existing studies, 27 gestures usable on a handheld device were defined. A quantitative survey of adult men and women in their 20s and 30s was conducted, and the functions that could be linked to the gestures with the highest satisfaction were analyzed. To analyze users' needs and hurdles regarding the defined gestures, focus group interviews were conducted with early adopters and ordinary users. Results: Users attach great value to usability and fun in an elastic device, and the preferred gestures and their linkable functions were analyzed. Conclusion: The most significant contribution of this study is that it sheds new light on the values of a device made of elastic material. Beyond finding and defining the gestures and functions that can be applied to a handheld elastic device, the study identified the value elements users desire from such a device: 'usability' and 'fun'. Application: The data this study produced through preference and satisfaction tests of the gestures and associated functions will help commercialize elastic devices in the future.

A Compensation Algorithm for the Position of User Hands Based on Moving Mean-Shift for Gesture Recognition in HRI System (HRI 시스템에서 제스처 인식을 위한 Moving Mean-Shift 기반 사용자 손 위치 보정 알고리즘)

  • Kim, Tae-Wan;Kwon, Soon-Ryang;Lee, Dong Myung
    • The Journal of Korean Institute of Communications and Information Sciences / v.40 no.5 / pp.863-870 / 2015
  • A Compensation Algorithm for the Position of the User's Hands based on the Moving Mean-Shift ($CAPUH_{MMS}$) is proposed for a Human Robot Interface (HRI) system running the Kinect sensor, in order to improve gesture recognition performance. Using the developed real-time performance simulator, the average error improvement ratio of the trajectories ($AEIR_{TJ}$) in left-right hand movements for $CAPUH_{MMS}$ is compared with that of other compensation algorithms, namely the Compensation Algorithm based on the Kalman Filter ($CA_{KF}$) and the Compensation Algorithm based on the Least-Squares Method ($CA_{LSM}$). As a result, the $AEIR_{TJ}$ in up-down hand movements of $CAPUH_{MMS}$ is measured as 19.35%, a higher value than those of $CA_{KF}$ and $CA_{LSM}$, 13.88% and 16.68% respectively.
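
The exact $CAPUH_{MMS}$ algorithm is not given in this listing; as one plausible reading of a moving mean-based correction, the sketch below smooths a jittery Kinect hand trajectory with a moving-window mean. The window size and sample data are illustrative only.

```python
# Minimal sketch: moving-window mean correction of a noisy hand trajectory.
# Kinect joint positions jitter frame to frame; averaging the last k samples
# trades a little lag for a much smoother trajectory.

from collections import deque

class MovingMeanFilter:
    """Replace each incoming (x, y) sample with the mean of the last k samples."""

    def __init__(self, k: int = 5):
        self.window = deque(maxlen=k)

    def update(self, x: float, y: float) -> tuple[float, float]:
        self.window.append((x, y))
        n = len(self.window)
        return (sum(p[0] for p in self.window) / n,
                sum(p[1] for p in self.window) / n)

# Example: a left-right hand sweep with simulated sensor jitter.
f = MovingMeanFilter(k=5)
raw = [(i * 10 + (-3 if i % 2 else 3), 100.0) for i in range(8)]
for x, y in raw:
    print((x, y), "->", f.update(x, y))
```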

Kinect-based Motion Recognition Model for the 3D Contents Control (3D 콘텐츠 제어를 위한 키넥트 기반의 동작 인식 모델)

  • Choi, Han Suk
    • The Journal of the Korea Contents Association / v.14 no.1 / pp.24-29 / 2014
  • This paper proposes a Kinect-based human motion recognition model for 3D content control that tracks human body gestures through the infrared Kinect camera. The proposed model computes the variation in distance of body movements from the shoulder to the left and right hands, wrists, arms, and elbows. Motions are classified into movement commands such as move left, move right, up, down, enlarge, shrink, and select. The proposed Kinect-based human motion recognition model is very natural and low-cost compared with contact-type gesture recognition technologies and device-based gesture technologies that require expensive hardware.
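
One way to read the distance-variation model above is as a per-frame classifier on shoulder-relative hand displacement. The Python sketch below is such a reading; the thresholds, axis conventions, and command names are assumptions, not the paper's values.

```python
# Minimal sketch: classify a 3D content-control command from how the hand
# moves relative to the shoulder between two Kinect skeleton frames.

import numpy as np

def classify_motion(shoulder, hand_prev, hand_curr, thresh=0.10):
    """All inputs are (x, y, z) joint positions in meters; returns a command."""
    prev = np.array(hand_prev) - np.array(shoulder)
    curr = np.array(hand_curr) - np.array(shoulder)
    dx, dy, dz = curr - prev
    if abs(dx) >= max(abs(dy), abs(dz), thresh):
        return "move_right" if dx > 0 else "move_left"
    if abs(dy) >= max(abs(dz), thresh):
        return "up" if dy > 0 else "down"
    if abs(dz) >= thresh:
        return "enlarge" if dz < 0 else "shrink"  # pushing toward the sensor
    return "select"                               # hand held still

shoulder = (0.0, 0.0, 2.0)
print(classify_motion(shoulder, (0.30, 0.00, 1.9), (0.55, 0.02, 1.9)))  # move_right
```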