• Title/Summary/Keyword: Eye-controlled Human/Computer Interface


An Efficient Camera Calibration Method for Head Pose Tracking (머리의 자세를 추적하기 위한 효율적인 카메라 보정 방법에 관한 연구)

  • Park, Gyeong-Su;Im, Chang-Ju;Lee, Gyeong-Tae
    • Journal of the Ergonomics Society of Korea / v.19 no.1 / pp.77-90 / 2000
  • The aim of this study is to develop and evaluate an efficient camera calibration method for vision-based head tracking. Tracking head movements is important in the design of an eye-controlled human/computer interface, and a vision-based head tracking system was proposed to accommodate the user's head movements in such an interface. We propose an efficient camera calibration method for accurately tracking the 3D position and orientation of the user's head, and we evaluate its performance. The experimental error analysis showed that the proposed method provides a more accurate and stable camera pose (i.e. position and orientation) than the conventional direct linear transformation (DLT) method that has been used for camera calibration; a minimal pose-recovery sketch follows this entry. The results of this study can be applied to tracking head movements for the eye-controlled human/computer interface and virtual reality technology.

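The pose-recovery step that this entry compares against the DLT method can be illustrated with a short sketch. The Python snippet below is an assumption-laden illustration, not the authors' calibration method: it recovers a camera's position and orientation from known 3D reference points and their image projections with OpenCV's solvePnP, and the marker coordinates, intrinsic matrix, and pixel measurements are placeholder values.

    # Minimal sketch: camera pose from known 3D markers and their 2D projections.
    import numpy as np
    import cv2

    # Eight reference markers fixed around a monitor (world coordinates, mm) -- placeholders.
    object_points = np.array([
        [0, 0, 0], [300, 0, 0], [300, 200, 0], [0, 200, 0],
        [0, 0, 50], [300, 0, 50], [300, 200, 50], [0, 200, 50],
    ], dtype=np.float32)

    # Their measured projections in the image (pixels) -- placeholder values.
    image_points = np.array([
        [120, 90], [480, 95], [475, 330], [118, 325],
        [135, 105], [465, 110], [460, 315], [132, 310],
    ], dtype=np.float32)

    # Assumed intrinsic parameters (focal length, principal point).
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
    R, _ = cv2.Rodrigues(rvec)               # rotation matrix (world -> camera)
    camera_position = (-R.T @ tvec).ravel()  # camera centre in world coordinates

    print("camera position (mm):", camera_position)
    print("camera orientation (camera -> world):\n", R.T)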

Eye as a Human/Computer Interface Device (눈으로 조종하는 인간/컴퓨터 인터페이스)

  • Park, Gyeong-Su;Lee, Gyeong-Tae
    • Proceedings of the ESK Conference / 1996.04a / pp.36-47 / 1996
  • By integrating eye- and head-position monitoring devices, the present authors developed an eye-controlled human/computer interface based on the line-of-sight and an intentional blink to invoke commands. An existing calibration method was also modified to reduce the visual angle between the target center and the intersection point of the derived line-of-sight. The modified calibration method allowed 108 or more command blocks to be displayed on a 14-inch monitor, viewed at a distance of 500 mm, with a target acquisition probability (hit rate) of 98%; a rough sketch of the block-selection and blink-triggering idea follows this entry. An active triggering method using an intentional blink was proposed and shown to be a feasible and efficient alternative for invoking commands, with a total triggering time of 0.8 s or less. The system could serve as a new human/computer interface for able-bodied users as well as handicapped individuals.

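As a rough illustration of the selection scheme described above, the sketch below divides the screen into a grid of command blocks, resolves which block the current line-of-sight falls on, and treats a sufficiently long blink as an intentional trigger. The grid size matches the 108 blocks mentioned in the abstract, but the screen resolution and blink-duration threshold are assumptions, not values from the paper.

    # Minimal sketch: gaze-to-block mapping plus an intentional-blink trigger.
    GRID_COLS, GRID_ROWS = 12, 9      # 108 command blocks, as in the abstract
    SCREEN_W, SCREEN_H = 1024, 768    # assumed display resolution in pixels
    INTENTIONAL_BLINK_S = 0.4         # assumed duration separating intentional
                                      # blinks from spontaneous ones

    def gaze_to_block(x, y):
        """Return the (col, row) command block under the gaze point (pixels)."""
        col = min(int(x / SCREEN_W * GRID_COLS), GRID_COLS - 1)
        row = min(int(y / SCREEN_H * GRID_ROWS), GRID_ROWS - 1)
        return col, row

    def is_intentional_blink(blink_duration_s):
        """Long eye closures count as deliberate commands; short ones are ignored."""
        return blink_duration_s >= INTENTIONAL_BLINK_S

    # Example: gaze near the screen centre followed by a 0.5 s blink.
    block = gaze_to_block(512, 380)
    if is_intentional_blink(0.5):
        print("trigger command block", block)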

A Criteria of Triggering by the Intentional Double Blinks (의도적 이중 눈 깜빡임을 이용한 명령 실행시의 기준에 관한 연구)

  • Lee, Gyeong-Tae;Ban, Yeong-Hwan;Jang, Pil-Sik;Park, Gyeong-Su
    • Journal of the Ergonomics Society of Korea / v.18 no.3 / pp.171-178 / 1999
  • Several studies of eye-slaved nonverbal communicators have been performed recently. By integrating eye- and head-position monitoring devices, the present authors had developed, in the preceding study, an eye-controlled human/computer interface based on the line-of-sight and an intentional blink to invoke commands. As a follow-up, this paper experimentally examines the characteristics of performing intentional double blinks and proposes the double blink as an alternative triggering method; a minimal detection sketch follows this entry. The applications may extend to several domains such as rehabilitation and virtual reality systems with head-mounted displays.

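A double-blink trigger criterion of the kind examined above might look like the sketch below. The duration and inter-blink-interval thresholds are illustrative assumptions, not the values determined in the study.

    # Minimal sketch: treat two deliberate blinks in quick succession as one command.
    from dataclasses import dataclass

    @dataclass
    class Blink:
        onset_s: float      # time the eye closed
        duration_s: float   # how long it stayed closed

    MIN_BLINK_S = 0.15      # shorter closures are treated as noise
    MAX_BLINK_S = 0.60      # longer closures are treated as eye rest
    MAX_GAP_S = 0.50        # maximum interval between the two blinks

    def is_double_blink(first, second):
        """True when both blinks look deliberate and follow each other closely."""
        deliberate = all(MIN_BLINK_S <= b.duration_s <= MAX_BLINK_S
                         for b in (first, second))
        gap = second.onset_s - (first.onset_s + first.duration_s)
        return deliberate and 0.0 <= gap <= MAX_GAP_S

    print(is_double_blink(Blink(0.0, 0.2), Blink(0.5, 0.25)))   # True
    print(is_double_blink(Blink(0.0, 0.2), Blink(1.5, 0.25)))   # False: gap too long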

Head tracking system using image processing (영상처리를 이용한 머리의 움직임 추적 시스템)

  • Park, Gyeong-Su;Im, Chang-Ju;Ban, Yeong-Hwan;Jang, Pil-Sik
    • Journal of the Ergonomics Society of Korea / v.16 no.3 / pp.1-10 / 1997
  • This paper is concerned with the development and evaluation of a camera calibration method for a real-time head tracking system. Tracking of head movements is important in the design of an eye-controlled human/computer interface and in virtual environments. We propose a video-based head tracking system in which a camera mounted on the subject's head views the front scene containing eight 3-dimensional reference points (passive retro-reflecting markers) fixed at known positions on the computer monitor. The reference points were captured by an image processing board and used to calculate the 3-dimensional position and orientation of the camera. A camera calibration method suitable for providing accurate extrinsic camera parameters is proposed. The method has three steps. In the first step, the image center is calibrated using the method of varying focal length. In the second step, the focal length and the scale factor are calibrated from the Direct Linear Transformation (DLT) matrix obtained from the known position and orientation of the camera. In the third step, the position and orientation of the camera are calculated from the DLT matrix using the calibrated intrinsic camera parameters; a minimal DLT sketch follows this entry. Experimental results showed that the average 3-dimensional camera position error is about 0.53 cm, the angular errors of the camera orientation are less than 0.55°, and the data acquisition rate is about 10 Hz. The results of this study can be applied to tracking head movements for the eye-controlled human/computer interface and virtual environments.

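The DLT step referred to in the three-step procedure above can be sketched as follows. This snippet uses synthetic data and an assumed ground-truth projection matrix, not the paper's marker layout or calibration values: it builds the standard 2N x 12 linear system from 3D-2D correspondences and recovers the 3x4 projection matrix by SVD.

    # Minimal sketch of the Direct Linear Transformation (DLT) estimation step.
    import numpy as np

    # Known 3D reference points (e.g. markers around a monitor), in mm.
    X = np.array([[0, 0, 0], [300, 0, 0], [300, 200, 0], [0, 200, 0],
                  [0, 0, 80], [300, 0, 80], [300, 200, 80], [0, 200, 80]], float)

    # Synthesize image points from an assumed ground-truth projection matrix.
    P_true = np.array([[800, 0, 320, 100],
                       [0, 800, 240, 200],
                       [0, 0, 1, 1000]], float)
    Xh = np.hstack([X, np.ones((len(X), 1))])   # homogeneous 3D points
    uvw = Xh @ P_true.T
    uv = uvw[:, :2] / uvw[:, 2:3]               # pixel coordinates

    # Build the 2N x 12 system A p = 0 and solve for the projection matrix by SVD.
    rows = []
    for (x, y, z, w), (u, v) in zip(Xh, uv):
        rows.append([x, y, z, w, 0, 0, 0, 0, -u*x, -u*y, -u*z, -u*w])
        rows.append([0, 0, 0, 0, x, y, z, w, -v*x, -v*y, -v*z, -v*w])
    A = np.array(rows)
    _, _, Vt = np.linalg.svd(A)
    P_est = Vt[-1].reshape(3, 4)
    P_est /= P_est[2, 3]                        # bring to the same scale as P_true

    print(np.allclose(P_est, P_true / P_true[2, 3], atol=1e-6))  # True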

Facial Feature Tracking and Head Orientation-based Gaze Tracking

  • Ko, Jong-Gook;Kim, Kyungnam;Park, Seung-Ho;Kim, Jin-Young;Kim, Ki-Jung;Kim, Jung-Nyo
    • Proceedings of the IEEK Conference / 2000.07a / pp.11-14 / 2000
  • In this paper, we propose a fast and practical head pose estimation scheme for an eye- and head-controlled human computer interface with a non-constrained background. The proposed method uses complete graph matching on thresholded images, and the two blocks showing the greatest similarity are selected as the eyes; the mouth and nostrils are then located in turn using the eye location and size information. The average computing time per image (360×240) is within 0.2 s. For head pose estimation we employ a template matching method using the angles between facial features; a rough sketch of this idea follows this entry. The method has been tested on several sequential facial images with different illumination conditions and varied head poses, and it returned quite satisfactory performance in both speed and accuracy.

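The template matching on inter-feature angles mentioned above can be illustrated roughly as follows. The feature coordinates and template signatures in this sketch are made up for illustration; the paper's actual features, angles, and templates are not reproduced here.

    # Minimal sketch: nearest-template head pose from angles between facial features.
    import numpy as np

    def angle(p, q, r):
        """Angle at vertex q (degrees) formed by the points p-q-r."""
        a, b = np.asarray(p) - q, np.asarray(r) - q
        cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    def pose_signature(left_eye, right_eye, nose, mouth):
        """Two inter-feature angles used as a crude pose descriptor."""
        return np.array([angle(left_eye, nose, right_eye),
                         angle(left_eye, mouth, right_eye)])

    # Hypothetical templates: pose label -> signature measured on reference images.
    templates = {
        "frontal":      np.array([70.0, 55.0]),
        "turned_left":  np.array([55.0, 42.0]),
        "turned_right": np.array([56.0, 43.0]),
    }

    def estimate_pose(left_eye, right_eye, nose, mouth):
        sig = pose_signature(left_eye, right_eye, nose, mouth)
        return min(templates, key=lambda k: np.linalg.norm(templates[k] - sig))

    # Example with made-up feature coordinates (pixels).
    print(estimate_pose((100, 120), (180, 118), (140, 160), (141, 200)))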

Driving with an Adaptive Cruise Control System

  • Nam, Hyoung-Kwon;Lee, Woon-Sung
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2003.10a / pp.717-722 / 2003
  • A driving simulator is a computer-controlled tool for studying the interface between a driver and vehicle response by enabling the driver to participate in judging vehicle characteristics. Using a driving simulator, human factors studies, vehicle system development, and other research can be carried out effectively under controllable, reproducible, and non-dangerous conditions. An Adaptive Cruise Control (ACC) system is generally regarded as a system that can be realized in the near future without demanding infrastructure components and technologies. ACC is an automatic vehicle-following system that requires no driver engagement in the longitudinal direction (a textbook following-control sketch is given after this entry), yet the influence of the driver remains substantial when developing such a system. Driving characteristics differ considerably with accident riskiness, gender, age, and so on. In this research, experiments were carried out using a driving simulator to investigate driving characteristics with the ACC system. The participants were 21 males and 19 females. Driving characteristics such as preferred headway time, lane keeping ability, eye direction, and head movement were observed and compared between driving with and without ACC.

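For readers unfamiliar with ACC, the vehicle-following behaviour it automates is often described by a constant time-headway policy. The sketch below shows that textbook policy, not the controller used in the simulator study; the gains and limits are illustrative assumptions.

    # Minimal sketch: constant time-headway following policy for longitudinal control.
    STANDSTILL_GAP_M = 5.0      # desired gap when stopped (m)
    HEADWAY_TIME_S = 1.5        # preferred headway time (s)
    KP, KV = 0.2, 0.6           # gains on spacing error and relative speed
    A_MIN, A_MAX = -3.0, 2.0    # acceleration limits (m/s^2)

    def acc_command(gap_m, ego_speed, lead_speed):
        """Longitudinal acceleration command (m/s^2) for the following vehicle."""
        desired_gap = STANDSTILL_GAP_M + HEADWAY_TIME_S * ego_speed
        accel = KP * (gap_m - desired_gap) + KV * (lead_speed - ego_speed)
        return max(A_MIN, min(A_MAX, accel))

    # Example: following 30 m behind a slower lead vehicle at highway speed.
    print(acc_command(gap_m=30.0, ego_speed=27.0, lead_speed=25.0))  # braking command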

Evaluation of the Head Mouse System using Gyro- and Opto-Sensors (각속도 및 광센서를 이용한 헤드 마우스의 평가)

  • Park, Min-Je;Kim, Soo-Chan
    • Journal of the HCI Society of Korea / v.5 no.2 / pp.1-6 / 2010
  • In this research, we designed a head mouse system for disabled users and gamers, a mouse controller operated by head movements and eye blinks only, and compared its performance with other mouse controller systems. The mouse pointer was moved by a gyro-sensor that measures the angular rate of head movement, and an eye blink was used as the click event. The accumulated error caused by integration, a problem of previous head mouse systems, was removed periodically and handled as a dead zone on a non-linear relative pointing curve, and direct pointer control was achieved from the computed movement distance and acceleration; a rough sketch of this mapping follows this entry. Active light sources were used to minimize the influence of ambient light changes, so the head mouse was not affected by changes in external lighting. In a comparison between the head mouse and a gaze tracking mouse (Quick Glance), the proposed method scored about 21% higher on the '20 clicks' clicking experiment, about 25% higher on the Dasher experiment, and about 37% higher on the on-screen keyboard test, indicating that the proposed head mouse performs better than the compared system.

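The pointer mapping described above (a dead zone plus a non-linear gain on the gyro's angular rate) can be sketched as follows; the thresholds and gains are illustrative assumptions, not the values used in the paper.

    # Minimal sketch: one axis of a gyro-driven relative pointer with a dead zone.
    DEAD_ZONE_DPS = 2.0      # rates below this (deg/s) are treated as drift/noise
    BASE_GAIN = 3.0          # pixels per degree of head rotation
    ACCEL_GAIN = 0.15        # extra gain for fast head movements

    def pointer_delta(rate_dps, dt_s):
        """Convert one axis of gyro angular rate into a relative pointer move."""
        if abs(rate_dps) < DEAD_ZONE_DPS:
            return 0.0                      # suppress drift inside the dead zone
        angle_deg = rate_dps * dt_s         # integrate rate over the sample period
        gain = BASE_GAIN + ACCEL_GAIN * abs(rate_dps)  # faster motion, larger step
        return gain * angle_deg

    # Example at 100 Hz: slow drift is ignored, a quick turn moves the cursor.
    print(pointer_delta(rate_dps=1.0, dt_s=0.01))    # 0.0 (inside dead zone)
    print(pointer_delta(rate_dps=40.0, dt_s=0.01))   # ~3.6 pixels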

A Study on Manipulating Method of 3D Game in HMD Environment by using Eye Tracking (HMD(Head Mounted Display)에서 시선 추적을 통한 3차원 게임 조작 방법 연구)

  • Park, Kang-Ryoung;Lee, Eui-Chul
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.2 / pp.49-64 / 2008
  • Recently, much research has been done in human computer interfaces on building more comfortable input devices based on gaze detection technology. However, system cost becomes high due to complicated hardware, and complicated user calibration procedures make such gaze detection systems difficult to use. In this paper, we propose a new gaze detection method based on 2D analysis and a simple user calibration; a minimal mapping sketch follows this entry. Our method uses a small USB (Universal Serial Bus) camera attached to an HMD (Head-Mounted Display), a hot mirror, and an IR (Infra-Red) illuminator. Because the HMD moves with the user's face, the performance of the gaze detection system is not affected by facial movement. In addition, we apply the gaze detection system to a 3D first-person shooting game: the gaze direction of the game character is controlled by our gaze detection method, allowing the player to target and shoot the enemy character, which can increase the immersion and interest of the game. Experimental results showed that the game and the gaze detection system run at real-time speed on a single desktop computer, with a gaze detection accuracy of 0.88 degrees. The results also indicate that our gaze detection technology can replace the conventional mouse in the 3D first-person shooting game.
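A 2D gaze mapping with a simple user calibration, of the general kind the abstract describes, can be sketched as an affine least-squares fit from pupil-centre coordinates to screen coordinates. The calibration data below are synthetic placeholders, and this is not the authors' exact mapping.

    # Minimal sketch: affine pupil-to-screen mapping fitted from calibration targets.
    import numpy as np

    # Pupil centres recorded while the user fixates known calibration targets (pixels).
    pupil = np.array([[200, 150], [400, 152], [198, 300], [402, 298], [300, 225]], float)
    screen = np.array([[0, 0], [1024, 0], [0, 768], [1024, 768], [512, 384]], float)

    # Fit screen = [px, py, 1] @ coeff for both axes by linear least squares.
    design = np.hstack([pupil, np.ones((len(pupil), 1))])
    coeff, *_ = np.linalg.lstsq(design, screen, rcond=None)   # shape (3, 2)

    def gaze_point(px, py):
        """Map a pupil-centre measurement to estimated screen coordinates."""
        sx, sy = np.array([px, py, 1.0]) @ coeff
        return float(sx), float(sy)

    print(gaze_point(300, 225))   # roughly the screen centre for this synthetic data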