• Title/Summary/Keyword: 3D Gaze Tracking


3D Gaze-based Stereo Image Interaction Technique (3차원 시선기반 입체영상 인터랙션 기법)

  • Ki, Jeong-Seok;Jeon, Kyeong-Won;Jo, Sang-Woo;Kwon, Yong-Moo;Kim, Sung-Kyu
    • 한국HCI학회:학술대회논문집 / 2007.02a / pp.512-517 / 2007
  • There has been considerable research on 2D gaze tracking techniques for 2D screens in human-computer interaction. However, gaze-based interaction with stereo images or stereoscopic contents has not yet been reported. 3D display techniques are now emerging for realistic media services, and 3D interaction techniques are all the more necessary in 3D contents service environments. This paper addresses gaze-based 3D interaction techniques on stereo displays, such as parallax-barrier or lenticular stereo displays, and presents our research on 3D gaze estimation and gaze-based interaction with stereo displays.


3D Interaction Technique on Stereo Display System

  • Kwon, Yong-Moo;Ki, Jeong-Seok;Jeon, Kyeong-Won;Kim, Sung-Kyu
    • 한국정보디스플레이학회:학술대회논문집 / 2007.08b / pp.1235-1238 / 2007
  • There has been considerable research on 2D gaze tracking techniques for 2D screens in human-computer interaction. However, gaze-based interaction with stereo images or 3D contents has not yet been reported. This paper presents a gaze-based 3D interaction technique on an autostereoscopic display system.


3D Gaze Estimation and Interaction Technique (3차원 시선 추출 및 상호작용 기법)

  • Ki, Jeong-Seok;Jeon, Kyeong-Won;Kim, Sung-Kyu;Sohn, Kwang-Hoon;Kwon, Yong-Moo
    • Journal of Broadcast Engineering / v.11 no.4 s.33 / pp.431-440 / 2006
  • There has been considerable research on 2D gaze tracking techniques for 2D screens in human-computer interaction. However, gaze-based interaction with stereo images or stereoscopic contents has not yet been reported. 3D display techniques are now emerging for realistic media services, and 3D interaction techniques are all the more necessary in 3D contents service environments. This paper addresses gaze-based 3D interaction techniques on stereo displays, such as parallax-barrier or lenticular stereo displays, and presents our research on 3D gaze estimation and gaze-based interaction with stereo displays.

Pilot Gaze Tracking and ILS Landing Result Analysis using VR HMD based Flight Simulators (VR HMD 시뮬레이터를 활용한 조종사 시선 추적 및 착륙 절차 결과 분석)

  • Jeong, Gu Moon;Lee, Youngjae;Kwag, TaeHo;Lee, Jae-Woo
    • Journal of the Korean Society for Aviation and Aeronautics / v.30 no.1 / pp.44-49 / 2022
  • In this study, pilots holding a commercial pilot license performed precision instrument landing procedures in a VR HMD flight simulator. Assuming that the center of the pilot's gaze lies straight ahead, gaze tracking was performed using the 3-DOF head tracking data and the 2D eye tracking of the VR HMD worn by the pilots. Areas of interest (AOIs) were then defined over the cockpit instrument panel and the external field of view to analyze how the pilot's gaze was distributed before and after the decision altitude. At the same time, the landing results were analyzed using the localizer and glide slope (G/S) data as the pilot's precision instrument landing flight data. As a result, the pilots were quantitatively evaluated by relating the gaze tracking results to the resulting landing performance in the VR HMD simulator.
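The AOI analysis described in this abstract can be sketched as follows: given timestamped gaze samples and rectangular AOIs, compute the share of gaze samples falling in each AOI before and after the decision altitude. All names, AOI boundaries, and sample data below are illustrative assumptions, not values from the paper.

```python
# AOIs as (name, x_min, y_min, x_max, y_max) in normalized screen coordinates.
# The split between instrument panel and external view is a placeholder.
AOIS = [
    ("instrument_panel", 0.0, 0.0, 1.0, 0.4),
    ("external_view",    0.0, 0.4, 1.0, 1.0),
]

def aoi_of(x, y):
    """Return the name of the first AOI containing the gaze point, or None."""
    for name, x0, y0, x1, y1 in AOIS:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def dwell_shares(samples, decision_alt):
    """samples: list of (altitude_ft, gaze_x, gaze_y). Returns the share of
    gaze samples per AOI, split into the phases above and below the
    decision altitude."""
    counts = {"above": {}, "below": {}}
    for alt, x, y in samples:
        phase = "above" if alt >= decision_alt else "below"
        name = aoi_of(x, y)
        if name is not None:
            counts[phase][name] = counts[phase].get(name, 0) + 1
    shares = {}
    for phase, c in counts.items():
        total = sum(c.values())
        shares[phase] = {k: v / total for k, v in c.items()} if total else {}
    return shares

samples = [(500, 0.5, 0.2), (400, 0.5, 0.7), (150, 0.5, 0.8), (100, 0.4, 0.9)]
print(dwell_shares(samples, decision_alt=200))
```

With real data, the sample counts would be weighted by the eye tracker's sampling interval to obtain dwell times rather than sample shares.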

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information / v.21 no.10 / pp.11-19 / 2016
  • Large displays of increasingly varied form do not suit previous gaze tracking methods. Mounting the gaze tracking camera above the display solves the problem of display size and height, but it prevents the use of the infrared corneal-reflection information exploited by previous methods. This paper proposes a pupil detection method that is robust to eye occlusion, together with a simple method for computing the gaze position on the display from the inner eye corner, the pupil center, and the head pose. In the proposed method, the camera switches between wide-angle and narrow-angle modes according to the person's position: when a face is detected in the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode after computing the face position, and the narrow-angle frames contain the gaze direction information of a person at long distance. Gaze estimation consists of a head pose estimation step and a gaze calculation step. Head pose is estimated by mapping detected facial feature points to a 3D model. For gaze calculation, an ellipse is first fitted to the iris edge information; if the pupil is occluded, its position is estimated with a deformable template. The gaze position on the display is then computed from the pupil center, the inner eye corner, and the head pose. Experiments over varying distances demonstrate that the proposed algorithm removes the constraints on display form and effectively computes the gaze direction of a person at long distance with a single camera.
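As a much simpler stand-in for the pupil localization step described above (the paper fits an ellipse to iris edges and falls back to a deformable template under occlusion), the following sketch thresholds dark pixels and takes their centroid, which illustrates the basic idea of locating a pupil center in an eye image. The synthetic image is illustrative.

```python
def pupil_centroid(image, threshold=50):
    """image: 2D list of grayscale values (0 = dark). Returns the (row, col)
    centroid of pixels darker than the threshold, or None if none exist."""
    rows = cols = n = 0
    for r, line in enumerate(image):
        for c, v in enumerate(line):
            if v < threshold:
                rows += r
                cols += c
                n += 1
    return (rows / n, cols / n) if n else None

# Bright background (value 200) with a dark 2x2 "pupil" near (2, 3).
img = [[200] * 6 for _ in range(5)]
for r, c in [(2, 3), (2, 4), (3, 3), (3, 4)]:
    img[r][c] = 10
print(pupil_centroid(img))  # centroid of the dark blob: (2.5, 3.5)
```

An ellipse fit (e.g. least squares on edge points) would replace the centroid in a faithful implementation, since it remains accurate when part of the pupil boundary is hidden by the eyelid.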

Wearable Robot System Enabling Gaze Tracking and 3D Position Acquisition for Assisting a Disabled Person with Disabled Limbs (시선위치 추적기법 및 3차원 위치정보 획득이 가능한 사지장애인 보조용 웨어러블 로봇 시스템)

  • Seo, Hyoung Kyu;Kim, Jun Cheol;Jung, Jin Hyung;Kim, Dong Hwan
    • Transactions of the Korean Society of Mechanical Engineers A / v.37 no.10 / pp.1219-1227 / 2013
  • A new type of wearable robot is developed for a person with disabled limbs, that is, a person who cannot intentionally move his or her legs and arms. The robot enables the disabled person to grip an object using eye movements. A gaze tracking algorithm detects the pupil movements with which the person looks at the object to be gripped. Using this 2D gaze information, the object is identified and the distance to it is measured with a Kinect device mounted on the robot's shoulder. Through several coordinate transformations and a matching scheme, the 3D position of the object with respect to the base frame is clearly determined, and the final position data is transmitted to the DSP-based robot controller, which grips the target object successfully.
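The coordinate-transformation step described above can be sketched as chaining homogeneous transforms: a point measured in the Kinect's camera frame is mapped through the shoulder frame into the robot base frame. The matrices below are illustrative placeholders (pure translations, no rotation), not the paper's calibration values.

```python
def matmul4(a, b):
    """Multiply two 4x4 matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(t, p):
    """Apply a 4x4 homogeneous transform to a 3D point (x, y, z)."""
    x, y, z = p
    v = [x, y, z, 1.0]
    return tuple(sum(t[i][k] * v[k] for k in range(4)) for i in range(3))

# Placeholder transforms: the shoulder frame sits 0.3 m above the base, and
# the Kinect is offset 0.1 m along x from the shoulder.
T_base_shoulder = [[1, 0, 0, 0.0], [0, 1, 0, 0], [0, 0, 1, 0.3], [0, 0, 0, 1]]
T_shoulder_kinect = [[1, 0, 0, 0.1], [0, 1, 0, 0], [0, 0, 1, 0.0], [0, 0, 0, 1]]

# Compose once, then map any Kinect measurement into the base frame.
T_base_kinect = matmul4(T_base_shoulder, T_shoulder_kinect)
p_kinect = (0.2, 0.0, 0.5)             # object position seen by the Kinect
print(apply(T_base_kinect, p_kinect))  # object position in the base frame
```

In practice each transform would carry a rotation block from the robot's kinematics, but the composition and application steps are exactly the same.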

3D View Controlling by Using Eye Gaze Tracking in First Person Shooting Game (1 인칭 슈팅 게임에서 눈동자 시선 추적에 의한 3차원 화면 조정)

  • Lee, Eui-Chul;Cho, Yong-Joo;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society / v.8 no.10 / pp.1293-1305 / 2005
  • In this paper, we propose a method of controlling the gaze direction of a 3D FPS game character by detecting eye gaze in successive images captured by a USB camera attached beneath an HMD. The proposed method consists of three parts. In the first part, the user's pupil center is detected from the successive input images by a real-time image processing algorithm. In the second part, calibration, the geometric relationship is determined between positions gazed at on the monitor and the corresponding detected eye positions. In the last part, the final gaze position on the HMD monitor is tracked and the 3D view in the game is controlled by that gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands, and that it can increase interest and immersion by synchronizing the gaze direction of the game player with that of the game character.

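The calibration step described in this entry maps detected pupil positions to monitor coordinates. A minimal sketch, assuming the user gazes at two opposite monitor corners during calibration and a per-axis linear mapping suffices (a simplification of the paper's geometric model; all numbers are illustrative):

```python
def make_calibration(pupil_tl, pupil_br, screen_w, screen_h):
    """pupil_tl / pupil_br: pupil (x, y) measured while the user gazes at the
    top-left and bottom-right screen corners. Returns a pupil -> screen
    mapping function."""
    (x0, y0), (x1, y1) = pupil_tl, pupil_br
    def to_screen(px, py):
        u = (px - x0) / (x1 - x0)      # normalized horizontal position
        v = (py - y0) / (y1 - y0)      # normalized vertical position
        return (u * screen_w, v * screen_h)
    return to_screen

to_screen = make_calibration(pupil_tl=(100, 80), pupil_br=(180, 140),
                             screen_w=1280, screen_h=1024)
print(to_screen(140, 110))  # pupil midway between corners -> screen center
```

A four-corner calibration with bilinear or homographic interpolation would handle non-parallel camera and screen axes; the two-corner linear form above only shows the normalize-and-scale idea.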

Evaluation of Gaze Depth Estimation using a Wearable Binocular Eye tracker and Machine Learning (착용형 양안 시선추적기와 기계학습을 이용한 시선 초점 거리 추정방법 평가)

  • Shin, Choonsung;Lee, Gun;Kim, Youngmin;Hong, Jisoo;Hong, Sung-Hee;Kang, Hoonjong;Lee, Youngho
    • Journal of the Korea Computer Graphics Society / v.24 no.1 / pp.19-26 / 2018
  • In this paper, we propose a gaze depth estimation method based on a binocular eye tracker for virtual reality and augmented reality applications. The proposed method collects a wide range of information for each eye from the eye tracker, such as the pupil center, gaze direction, and inter-pupil distance. It then builds gaze estimation models using a multilayer perceptron that infers gaze depth from the eye tracking information. Finally, we evaluated the method with 13 participants in two ways: performance based on their individual models and performance based on a generalized model. The evaluation showed that the proposed method recognized gaze depth with 90.1% accuracy for the 13 individual participants and with 89.7% accuracy for the model including all participants.
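The paper infers gaze depth from binocular features with a learned multilayer perceptron. As a geometric intuition for why those features carry depth, the classic vergence relation below computes the depth at which the two eyes' gaze rays intersect; this is a simplification for illustration, not the paper's model, and the numbers are assumed.

```python
import math

def vergence_depth(ipd_m, left_angle_deg, right_angle_deg):
    """Depth (m) of the intersection of two converging gaze rays.
    Angles are each eye's inward rotation from straight ahead; ipd_m is
    the inter-pupil distance in meters."""
    tl = math.tan(math.radians(left_angle_deg))
    tr = math.tan(math.radians(right_angle_deg))
    # From similar triangles across the baseline: z = ipd / (tan_l + tan_r).
    return ipd_m / (tl + tr)

# Eyes 64 mm apart, each rotated ~1.83 degrees inward -> focus roughly 1 m away.
print(round(vergence_depth(0.064, 1.83, 1.83), 2))
```

A learned model such as the paper's MLP can absorb per-user deviations (kappa angle, tracker noise) that make this clean geometric relation inaccurate in practice.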

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions:PartB / v.16B no.3 / pp.203-214 / 2009
  • In this paper, we compute the user's gaze position simply from the 2D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without considering the complex 3D relations among the camera, monitor, and pupil coordinates. The objectives of this paper are therefore to detect the pupil center and the four corneal specular reflections accurately and to compensate for the error factors that affect gaze accuracy. Our method compensates for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector, using a one-time user calibration performed when the system starts. In addition, the four corneal specular reflections, which are essential for calculating the gaze position, are detected robustly with a Kalman filter irrespective of abrupt eye movements. Experimental results showed a gaze detection error of about 1.0 degrees even under abrupt eye movement.
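The Kalman filtering idea used above to keep tracking the corneal reflections (glints) can be sketched with a constant-velocity model: the filter predicts each glint coordinate, and noisy detections correct the prediction. One scalar coordinate is shown; the noise parameters and measurements are illustrative, not the paper's values.

```python
def kalman_track(measurements, q=0.01, r=1.0):
    """1D constant-velocity Kalman filter over noisy position measurements.
    Returns the filtered position estimates."""
    x, v = measurements[0], 0.0          # state: position, velocity
    p = [[1.0, 0.0], [0.0, 1.0]]         # state covariance
    out = []
    for z in measurements:
        # Predict (dt = 1): position advances by velocity; covariance is
        # propagated through the constant-velocity model plus process noise q.
        x, v = x + v, v
        p = [[p[0][0] + 2 * p[0][1] + p[1][1] + q, p[0][1] + p[1][1]],
             [p[1][0] + p[1][1], p[1][1] + q]]
        # Update with measurement z of the position (H = [1, 0]).
        s = p[0][0] + r                    # innovation covariance
        kx, kv = p[0][0] / s, p[1][0] / s  # Kalman gain
        y = z - x                          # innovation
        x, v = x + kx * y, v + kv * y
        p = [[(1 - kx) * p[0][0], (1 - kx) * p[0][1]],
             [p[1][0] - kv * p[0][0], p[1][1] - kv * p[0][1]]]
        out.append(x)
    return out

# Glint x-coordinate moving steadily, with detection noise.
print(kalman_track([10.0, 11.1, 11.9, 13.2, 14.0]))
```

In a glint tracker the prediction also serves as a search window for the next frame, which is what keeps detection stable during abrupt eye movements.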

A Study on Manipulating Method of 3D Game in HMD Environment by using Eye Tracking (HMD(Head Mounted Display)에서 시선 추적을 통한 3차원 게임 조작 방법 연구)

  • Park, Kang-Ryoung;Lee, Eui-Chul
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.2 / pp.49-64 / 2008
  • Recently, much research has been conducted in human-computer interfaces on more comfortable input devices based on gaze detection technology. However, such systems are costly due to complicated hardware, and they are difficult to use due to complicated user calibration procedures. In this paper, we propose a new gaze detection method based on 2D analysis and a simple user calibration. Our method uses a small USB (Universal Serial Bus) camera attached to an HMD (Head-Mounted Display), a hot mirror, and an IR (infrared) illuminator. Because the HMD moves together with the user's head, the performance of the gaze detection system is not affected by facial movement. In addition, we apply our gaze detection system to a 3D first-person shooting game: the gaze direction of the game character is controlled by our gaze detection method, allowing the player to target and shoot enemy characters, which can increase the immersion and interest of the game. Experimental results showed that the game and the gaze detection system run at real-time speed on a single desktop computer, with a gaze detection accuracy of 0.88 degrees. These results indicate that our gaze detection technology could replace the conventional mouse in 3D first-person shooting games.