• Title/Summary/Keyword: human interface device

A Framework for Designing Closed-loop Hand Gesture Interface Incorporating Compatibility between Human and Monocular Device

  • Lee, Hyun-Soo;Kim, Sang-Ho
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.533-540 / 2012
  • Objective: This paper presents a framework for designing a hand gesture based interface. Background: While the modeling of contact-based interfaces has focused on users' ergonomic interface designs and real-time technologies, the implementation of a contactless interface requires error-free classification as an essential precondition. These trends have led many studies to concentrate on the design of feature vectors, learning models and their tests. Despite remarkable advances in this field, neglecting ergonomics and users' cognition results in several problems, including uneasy user behaviors. Method: In order to incorporate compatibility, considering users' comfortable behaviors and the device's classification abilities simultaneously, classification-oriented gestures are extracted using the suggested human-hand model and closed-loop classification procedures. From the extracted gestures, the compatibility-oriented gestures are acquired through ergonomic and cognitive experiments. The obtained hand gestures are then converted into a series of hand behaviors - Handycon - which is mapped onto several functions in a mobile device. Results: This Handycon model guarantees easy user behavior and supports fast understanding as well as a high classification rate. Conclusion and Application: The suggested framework contributes to the development of a hand gesture-based contactless interface model that considers compatibility between human and device. The suggested procedures can be applied effectively to other contactless interface designs.
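
As a rough illustration of the compatibility idea described in this abstract (keeping only gestures that both the classifier and the user accept, then mapping gesture sequences onto device functions), the sketch below uses invented gesture names, scores, and thresholds; it is not the authors' framework.

```python
# A minimal sketch (not the paper's implementation): select gestures that meet
# both a classifier-accuracy threshold and a user-comfort threshold, then map
# sequences of the surviving gestures onto hypothetical mobile-device functions.

# Hypothetical per-gesture scores: (classification rate, mean comfort rating 1-5).
CANDIDATES = {
    "open_palm":   (0.97, 4.6),
    "fist":        (0.95, 4.2),
    "two_fingers": (0.88, 4.4),
    "thumb_up":    (0.93, 3.1),   # easy to classify, awkward to hold
}

MIN_ACCURACY = 0.90   # device-side compatibility (classification ability)
MIN_COMFORT = 4.0     # human-side compatibility (ergonomic/cognitive rating)

def compatible_gestures(candidates):
    """Keep gestures acceptable to both the classifier and the user."""
    return [g for g, (acc, comfort) in candidates.items()
            if acc >= MIN_ACCURACY and comfort >= MIN_COMFORT]

# Map an ordered series of compatible gestures onto device functions
# (gesture and function names are illustrative only).
FUNCTION_MAP = {
    ("open_palm", "fist"): "unlock_screen",
    ("fist", "open_palm"): "go_home",
    ("two_fingers",):      "next_page",
}

def dispatch(sequence):
    return FUNCTION_MAP.get(tuple(sequence), "no_action")

if __name__ == "__main__":
    print("compatibility-oriented gestures:", compatible_gestures(CANDIDATES))
    print("dispatch:", dispatch(["open_palm", "fist"]))
```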

Study on Gesture and Voice-based Interaction in Perspective of a Presentation Support Tool

  • Ha, Sang-Ho;Park, So-Young;Hong, Hye-Soo;Kim, Nam-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.593-599 / 2012
  • Objective: This study aims to implement a non-contact gesture-based interface for presentation purposes and to analyze the effect of the proposed interface as an information-transfer support device. Background: Recently, research on control devices using gesture recognition or speech recognition has been conducted alongside rapid technological growth in the UI/UX area and the appearance of smart service products that require a new human-machine interface. However, relatively few quantitative studies have examined the practical effects of this new interface type, while work on system implementation is very active. Method: The system presented in this study is implemented with the KINECT® sensor offered by Microsoft Corporation. To investigate whether the proposed system is effective as a presentation support tool, we conducted experiments by giving several lectures to 40 participants in both a traditional lecture room (keyboard-based presentation control) and a non-contact gesture-based lecture room (KINECT-based presentation control), evaluating their interest and immersion with respect to the lecture contents and lecturing methods, and analyzing their understanding of the lecture contents. Result: Using ANOVA, we examine whether the gesture-based presentation system can serve as an effective presentation support tool depending on the difficulty level of the contents. Conclusion: A non-contact gesture-based interface is a meaningful supportive device when delivering easy and simple information; however, the effect can vary with the contents and the difficulty level of the information provided. Application: The results presented in this paper may help in designing new human-machine (computer) interfaces for communication support tools.
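
The abstract reports an ANOVA on comprehension across the two presentation-control conditions. As a rough illustration only, a one-way ANOVA of that kind can be run as below; the scores and group sizes are invented placeholders, not the study's data.

```python
# A minimal analysis sketch: one-way ANOVA comparing comprehension scores
# between keyboard-based and KINECT gesture-based presentation control.
from scipy.stats import f_oneway

# Hypothetical comprehension scores (0-100) for the same lecture content
# delivered with the two control methods.
keyboard_scores = [72, 80, 65, 77, 83, 70, 75, 69]
kinect_scores   = [78, 85, 74, 81, 88, 79, 76, 82]

f_stat, p_value = f_oneway(keyboard_scores, kinect_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# A small p-value would suggest the control method affected comprehension
# for this (hypothetical) content-difficulty level.
```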

Teleoperated Control of a Mobile Robot Using an Exoskeleton-Type Motion Capturing Device Through Wireless Communication (Exoskeleton 형태의 모션 캡쳐 장치를 이용한 이동로봇의 원격 제어)

  • Jeon, Poong-Woo;Jung, Seul
    • Journal of Institute of Control, Robotics and Systems / v.10 no.5 / pp.434-441 / 2004
  • In this paper, an exoskeleton-type motion capturing system is designed and implemented. The device is designed with 12 degrees of freedom in total to represent human arm motions. Forward and inverse kinematics of the device are analyzed to identify its singular positions. With the designed model parameters, simulation studies are conducted to verify that the designed motion capturing system effectively represents human motions within the workspace. As a counterpart of the exoskeleton system, a mobile robot is built to follow human motion in a restricted manner. Experimental studies of teleoperation from the exoskeleton device to control the mobile robot are carried out to show a feasible application of a wireless man-machine interface.
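
The abstract mentions forward and inverse kinematic analysis of the 12-DOF device. As a generic illustration (not the authors' model), the sketch below chains standard Denavit-Hartenberg transforms to compute an end-point pose from joint angles; the link parameters are placeholders.

```python
# A minimal forward-kinematics sketch for a serial chain using standard
# Denavit-Hartenberg (DH) parameters.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint from standard DH parameters."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params):
    """Multiply per-joint transforms; returns the 4x4 base-to-tip pose."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

if __name__ == "__main__":
    # Three placeholder links (d, a, alpha) standing in for an arm segment.
    dh_params = [(0.0, 0.30, 0.0), (0.0, 0.25, 0.0), (0.0, 0.15, 0.0)]
    q = np.radians([30.0, -20.0, 10.0])
    print(forward_kinematics(q, dh_params)[:3, 3])  # end-point position
```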

A Study on the Color Usability of Lumino Haptic Device (루미노 햅틱 디바이스의 색상 사용성 연구)

  • Lee, Sang-Jin;Kim, Byeong-Woo
    • Journal of the Korea Academia-Industrial cooperation Society / v.13 no.1 / pp.21-26 / 2012
  • A haptic device is regarded as a human-machine interface technology for easier, more accurate, and more intuitive operation. The purpose of this study is to characterize the driver's affective response to the haptic device in terms of one design factor: the color of the haptic lighting as the independent variable. The study aims to improve the cognitive ability of existing vehicle haptic devices, which rely on tactile feedback only. For the color feedback usability evaluation, a lumino haptic device is used, created by adding color feedback to the existing vehicle haptic device. The emotional factors the driver associates with the haptic device are extracted by sensibility analysis. As a result, it is possible to suggest a design direction that satisfies the driver.

Development of Interface device with EOG Signal (EOG(Electro-oculogram) 신호를 이용한 Interface 장치 개발)

  • Kim, Su-Jong;Ryu, Ho-Sun;Kim, Young-Chol
    • Proceedings of the KIEE Conference / 2006.07d / pp.1821-1823 / 2006
  • This paper presents the development of an interface device for electro-oculogram (EOG) signals and its application to a wireless mouse for a wearable PC. The interface device is composed of five bio-electrodes for detecting oculomotor motion, several band-pass filters, an instrumentation amplifier, and a microprocessor. We first analyzed the impedance characteristics between the skin and a bio-electrode. Since the impedance depends highly on the human face, its magnitude differs from person to person. The interface device was applied to develop a wireless mouse for a wearable PC as a Bio-Machine Interface (BMI), in which the pointer on the PC monitor is controlled by EOG signals alone. The system was implemented in a Head Mounted Display (HMD) unit. Experimental results show an accuracy above 90%.
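
As a rough illustration of the signal chain such a device implies (band-pass filtering followed by thresholding into pointer commands), the sketch below uses assumed cutoffs, sampling rate, and threshold values; it is not the paper's firmware.

```python
# A minimal sketch: band-pass a raw horizontal EOG channel, then threshold it
# into left / hold / right pointer commands. All numbers are illustrative.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250.0  # Hz, assumed sampling rate

def bandpass(signal, low=0.5, high=20.0, fs=FS, order=4):
    """Band-pass the raw EOG channel to remove drift and high-frequency noise."""
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

def to_pointer_commands(horizontal_eog, threshold=150e-6):
    """Map filtered horizontal EOG (volts) to -1 (left), 0 (hold), +1 (right)."""
    filtered = bandpass(horizontal_eog)
    commands = np.zeros(filtered.shape, dtype=int)
    commands[filtered > threshold] = 1
    commands[filtered < -threshold] = -1
    return commands

if __name__ == "__main__":
    t = np.arange(0.0, 2.0, 1.0 / FS)
    # Synthetic EOG: a rightward saccade-like step plus sensor noise.
    fake_eog = 300e-6 * (t > 1.0) + 20e-6 * np.random.randn(t.size)
    print(np.unique(to_pointer_commands(fake_eog)))
```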

Tension Based 7 DOF Force Feedback Device: SPIDAR-G

  • Kim, Seahak;Yasuharu Koike;Makoto Sato
    • Transactions on Control, Automation and Systems Engineering / v.4 no.1 / pp.9-16 / 2002
  • In this paper, we demonstrate a new intuitive force-feedback device for advanced VR applications. Force feedback for the device is tension based and is characterized by 7 degrees of freedom (DOF): 3 DOF for translation, 3 DOF for rotation, and 1 DOF for grasp. The SPIDAR-G (Space Interface Device for Artificial Reality with Grip) allows users to interact with virtual objects naturally by manipulating two hemispherical grips located in the center of the device frame. We show how to connect the strings between each vertex of the grip and each extremity of the frame in order to achieve force feedback. In addition, methodologies are discussed for calculating translation, orientation and grasp using the lengths of the 8 strings connected to the motors and encoders on the frame. The SPIDAR-G exhibits smooth force feedback, minimized inertia, no backlash, scalability and safety. These features are attributed to the strategic string arrangement and control, which result in stable haptic rendering. The design and control of the SPIDAR-G are described in detail, and a Space Graphic User Interface system based on the proposed SPIDAR-G is demonstrated. Experimental results validate the feasibility of the proposed device and reveal its applicability to virtual reality.
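
The abstract describes recovering translation, orientation and grasp from the lengths of 8 strings. The sketch below illustrates only the translation part with a generic least-squares trilateration over assumed frame-corner anchor positions; it is not the published SPIDAR-G algorithm.

```python
# A minimal sketch: estimate a grip position from measured string lengths by
# least-squares trilateration. Anchor coordinates and lengths are made up.
import numpy as np
from scipy.optimize import least_squares

# Eight anchor points at the corners of a 1 m cube (frame extremities).
ANCHORS = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                   dtype=float)

def residuals(p, measured_lengths):
    """Difference between predicted and measured string lengths."""
    return np.linalg.norm(ANCHORS - p, axis=1) - measured_lengths

def estimate_position(measured_lengths, initial_guess=(0.5, 0.5, 0.5)):
    result = least_squares(residuals, initial_guess, args=(measured_lengths,))
    return result.x

if __name__ == "__main__":
    true_p = np.array([0.3, 0.6, 0.4])
    lengths = np.linalg.norm(ANCHORS - true_p, axis=1)  # ideal, noise-free
    print(estimate_position(lengths))  # should be close to true_p
```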

Stereo-Vision-Based Human-Computer Interaction with Tactile Stimulation

  • Yong, Ho-Joong;Back, Jong-Won;Jang, Tae-Jeong
    • ETRI Journal / v.29 no.3 / pp.305-310 / 2007
  • If a virtual object in a virtual environment represented by a stereo vision system could be touched by a user with some tactile feeling on his/her fingertip, the sense of reality would be heightened. To create a visual impression as if the user were directly pointing to a desired point on a virtual object with his/her own finger, we need to align virtual space coordinates and physical space coordinates. Also, if there is no tactile feeling when the user touches a virtual object, the virtual object would seem to be a ghost. Therefore, a haptic interface device is required to give some tactile sensation to the user. We have constructed such a human-computer interaction system in the form of a simple virtual reality game using a stereo vision system, a vibro-tactile device module, and two position/orientation sensors.
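
Aligning virtual-space and physical-space coordinates, as described in this abstract, can be illustrated with a rigid-body fit over matched calibration points. The sketch below uses a standard Kabsch/SVD solution with synthetic points; the paper's own calibration procedure may differ.

```python
# A minimal sketch: fit a rotation R and translation t so that
# R @ physical_point + t matches the corresponding virtual point.
import numpy as np

def fit_rigid_transform(src, dst):
    """Return R (3x3) and t (3,) such that R @ src_i + t ~= dst_i."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    physical = rng.uniform(-0.5, 0.5, size=(6, 3))   # fingertip samples (m)
    true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
    virtual = physical @ true_R.T + np.array([0.1, 0.2, 0.0])
    R, t = fit_rigid_transform(physical, virtual)
    print(np.allclose(R, true_R), np.round(t, 3))
```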

A Study on Force-Reflecting Interface using Ultrasonic Motors (초음파모터를 이용한 역감장치에 관한 연구)

  • 강원찬;김대현;김영동
    • Proceedings of the KIPE Conference / 1998.07a / pp.123-128 / 1998
  • This paper describes the evaluation of a force-reflecting interface using ultrasonic motors (USMs). The force-reflecting interface allows a human to feel objects within a virtual environment. To effectively display the mechanical impedance of the human hand, a haptic device with specific characteristics is needed, such as low inertia, almost zero friction and very high stiffness. USMs have attracted considerable attention as actuators that satisfy these conditions: they combine features such as high driving torque at low rotational speed, high holding torque and fast response. We therefore studied a two-degree-of-freedom force-reflecting haptic system.
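
As a minimal illustration of the impedance display such a force-reflecting device performs, the sketch below renders a 1-DOF virtual wall with assumed stiffness and damping gains; it is not the authors' controller.

```python
# A minimal sketch of impedance display: when the handle penetrates a virtual
# wall, command a restoring force from a stiffness-plus-damping model.
def virtual_wall_force(position, velocity, wall=0.0, k=2000.0, b=5.0):
    """Return the commanded force (N) for a 1-DOF virtual wall at `wall` (m)."""
    penetration = position - wall
    if penetration <= 0.0:            # not touching the wall
        return 0.0
    return -(k * penetration + b * velocity)   # stiffness plus damping

if __name__ == "__main__":
    # Handle pushed 2 mm into the wall at 0.05 m/s.
    print(virtual_wall_force(position=0.002, velocity=0.05))  # about -4.25 N
```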

Tactile Sensation Display with Electrotactile Interface

  • Yarimaga, Oktay;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2005.06a / pp.145-150 / 2005
  • This paper presents an Electrotactile Display System (ETCS). One of the most important human sensory systems for human-computer interaction is the sense of touch, which can be conveyed to humans through tactile output devices. To realize the sense of touch, an electrotactile display produces controlled, localized touch sensations on the skin by passing small electric currents. In electrotactile stimulation, the mechanoreceptors in the skin may be stimulated individually in order to display sensations of vibration, touch, itch, tingle, pressure, etc. on the finger, palm, arm or any suitable location of the body by using appropriate electrodes and waveforms. We developed an ETCS and investigated the effectiveness of the proposed system in terms of the perception of surface roughness, by stimulating the palmar side of the hand with different waveforms, and the perception of direction and location information through the forearm. Positive and negative pulse trains were tested with different current intensities and electrode switching times on the forearm or finger of the user with an electrode-embedded armband, in order to investigate how subjects recognize displayed patterns and directions of stimulation.
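
To illustrate the kind of waveform this abstract refers to, the sketch below generates a charge-balanced biphasic pulse train; amplitude, pulse width, and repetition rate are placeholder values, not the paper's settings.

```python
# A minimal waveform sketch: a charge-balanced biphasic pulse train of the kind
# used for electrotactile stimulation.
import numpy as np

def biphasic_pulse_train(duration=0.05, rate=200.0, amplitude=2e-3,
                         pulse_width=200e-6, fs=100_000.0):
    """Return (t, i): time axis in seconds and current in amperes."""
    t = np.arange(0.0, duration, 1.0 / fs)
    i = np.zeros_like(t)
    phase_samples = int(round(pulse_width * fs))
    for start in np.arange(0.0, duration, 1.0 / rate):
        s = int(round(start * fs))
        i[s:s + phase_samples] = amplitude                       # positive phase
        i[s + phase_samples:s + 2 * phase_samples] = -amplitude  # negative phase
    return t, i

if __name__ == "__main__":
    t, i = biphasic_pulse_train()
    print(f"peak current {i.max() * 1e3:.1f} mA over {t[-1] * 1e3:.0f} ms")
```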
