• Title/Summary/Keyword: Gaze tracking

Search results: 167

Pilot Gaze Tracking and ILS Landing Result Analysis using VR HMD based Flight Simulators (VR HMD 시뮬레이터를 활용한 조종사 시선 추적 및 착륙 절차 결과 분석)

  • Jeong, Gu Moon;Lee, Youngjae;Kwag, TaeHo;Lee, Jae-Woo
    • Journal of the Korean Society for Aviation and Aeronautics
    • /
    • v.30 no.1
    • /
    • pp.44-49
    • /
    • 2022
  • This study had pilots holding a commercial pilot license perform precision instrument landing procedures in a VR HMD flight simulator. Assuming that the center of the pilot's gaze lies straight ahead, gaze tracking was performed using the 3-D.O.F. head-tracking data and 2-D eye-tracking data of the VR HMD worn by the pilots. AOIs (Areas of Interest) were then defined over the cockpit instrument panel and the external field of view to analyze how the pilot's gaze was distributed before and after the decision altitude. At the same time, the landing results were analyzed using Localizer and G/S data as the pilot's precision instrument landing flight data. As a result, pilots were quantitatively evaluated by relating the tracked gaze to the resulting landing outcome in the VR HMD simulator.
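The AOI analysis this abstract describes, splitting the gaze distribution at the decision altitude, can be sketched as a simple dwell-count computation. The AOI names, timestamps, and function below are illustrative assumptions, not the authors' code:

```python
def aoi_distribution(samples, decision_alt_time):
    """Split (time, aoi) gaze samples at the decision-altitude crossing
    and return, per phase, the fraction of samples in each AOI."""
    phases = {"before": {}, "after": {}}
    counts = {"before": 0, "after": 0}
    for t, aoi in samples:
        phase = "before" if t < decision_alt_time else "after"
        phases[phase][aoi] = phases[phase].get(aoi, 0) + 1
        counts[phase] += 1
    return {
        phase: {aoi: n / counts[phase] for aoi, n in dist.items()}
        for phase, dist in phases.items() if counts[phase]
    }

# Invented samples: "PFD" = instrument panel AOI, "outside" = external view.
samples = [(0.0, "PFD"), (0.5, "PFD"), (1.0, "outside"),
           (2.0, "outside"), (2.5, "outside"), (3.0, "PFD")]
dist = aoi_distribution(samples, decision_alt_time=1.5)
```

A shift of gaze from the instrument panel to the outside view after the decision altitude would show up here as a change in the per-phase fractions.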

Development and Reinforcement for Learning with Gaze-Tracking Technology (Gaze-Tracking 기술을 통한 학습 집중력 향상 및 강화 서비스)

  • Jung, Si-Yeol;Moon, Tae-Jun;Lee, Yong-Taek;Kim, Sang-Yeop;Kim, Young-Jong
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2022.05a
    • /
    • pp.587-589
    • /
    • 2022
  • This service aims to improve students' academic achievement in the remote classes brought on by COVID-19. To this end, it tracks the user's gaze during remote classes and analyzes their level of engagement. Gaze-Tracking technology is used to follow the user's gaze, and deeplabv3 is used to segment the regions of the video that are meaningful for the lesson. The Gaze-Tracking component determines which part of the screen the user is looking at from the angles of the head, eyes, and pupils, captured through a webcam or similar camera. Using these techniques, the service analyzes engagement in real time and issues notifications, provides engagement statistics once the class ends, and additionally offers mini-games that help improve concentration.

Eye Gaze Tracking System Under Natural Head Movements (머리 움직임이 자유로운 안구 응시 추정 시스템)

  • Matthew, Sked;Qiang, Ji
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.41 no.5
    • /
    • pp.57-64
    • /
    • 2004
  • We propose an eye gaze tracking system that works under natural head movements. It consists of one narrow-field-of-view CCD camera, two mirrors whose reflection angles are controlled, and active infrared illumination. The mirror angles are computed by geometric and linear-algebra calculations that keep the pupil image on the optical axis of the camera. Our system allows the subject's head to move 90 cm horizontally and 60 cm vertically, with spatial resolutions of about 6° and 7°, respectively. The frame rate for estimating gaze points is 10-15 frames/sec. As the gaze mapping function, we use a hierarchical generalized regression neural network (H-GRNN) based on a two-pass GRNN. Gaze accuracy reached 94% with H-GRNN, a 9-point improvement over the 85% of a plain GRNN, even when the head or face was slightly rotated. Our system does not have a high spatial gaze resolution, but it allows natural head movements while remaining robust and accurate, and there is no need to re-calibrate the system when the subject changes.
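The GRNN gaze mapping mentioned above is, at its core, Nadaraya-Watson kernel regression: a distance-weighted average of calibration targets. A minimal single-pass sketch follows; the hierarchical two-pass variant from the paper is not reproduced, and the calibration points are invented:

```python
import math

def grnn_predict(train_x, train_y, x, sigma=1.0):
    """Single-pass GRNN: weight each calibration target by a Gaussian
    kernel of the distance between its feature vector and the query."""
    weights = [math.exp(-sum((a - b) ** 2 for a, b in zip(xi, x))
                        / (2 * sigma ** 2)) for xi in train_x]
    total = sum(weights)
    dims = len(train_y[0])
    return tuple(sum(w * yi[d] for w, yi in zip(weights, train_y)) / total
                 for d in range(dims))

# Invented calibration: normalized pupil features -> screen gaze points.
tx = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
ty = [(0.0, 0.0), (800.0, 0.0), (0.0, 600.0), (800.0, 600.0)]
gx, gy = grnn_predict(tx, ty, (0.5, 0.5), sigma=0.5)
```

A query equidistant from all four calibration points lands at the screen centre, which is the smoothing behaviour that makes GRNN-style mappings tolerant of noisy pupil features.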

Adaptive Zoom-based Gaze Tracking for Enhanced Accuracy and Precision (정확도 및 정밀도 향상을 위한 적응형 확대 기반의 시선 추적 기법)

  • Song, Hyunjoo;Jo, Jaemin;Kim, Bohyoung;Seo, Jinwook
    • KIISE Transactions on Computing Practices
    • /
    • v.21 no.9
    • /
    • pp.610-615
    • /
    • 2015
  • The accuracy and precision of video-based remote gaze trackers are affected by numerous factors (e.g., head movement of the participant). However, it is challenging to control every influencing factor, and doing so (e.g., using a chin rest to fix the geometry) can forfeit the main benefit of gaze trackers, namely the ecological validity of their unobtrusive nature. We propose an adaptive zoom-based gaze tracking technique, ZoomTrack, which addresses this problem by improving the resolution of the gaze tracking results. Our approach magnifies a region of interest (ROI) and retrieves gaze points at a higher resolution under two different zooming modes: only when the gaze reaches the ROI (temporary) or whenever a participant stares at the stimuli (omnipresent). We compared these against a base case without magnification in a user study, and used the results to summarize the advantages and limitations of our technique.
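The coordinate bookkeeping behind such a zoom-based scheme can be illustrated as follows. The function name and zoom geometry are assumptions for illustration, not taken from the paper:

```python
def to_stimulus_coords(gaze_px, roi_origin, zoom):
    """Map a gaze point measured on a magnified ROI back into the
    coordinate system of the unmagnified stimulus. A fixed gaze error
    in screen pixels shrinks by the zoom factor in stimulus space,
    which is where the accuracy/precision gain comes from."""
    return (roi_origin[0] + gaze_px[0] / zoom,
            roi_origin[1] + gaze_px[1] / zoom)

# A 3x magnified ROI whose top-left corner sits at stimulus point (100, 50):
pt = to_stimulus_coords((300, 150), roi_origin=(100, 50), zoom=3)
```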

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.4
    • /
    • pp.834-848
    • /
    • 2013
  • To use a smartphone's many functions effectively, many kinds of human-phone interfaces are employed, such as touch, voice, and gesture. However, the dominant touch interface cannot be used by people with hand disabilities or when both hands are occupied. Although eye tracking is an excellent human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of front-facing cameras. In this paper, a new eye tracking method is proposed as a smartphone user interface. To maximize eye-image resolution, a zoom lens and three infrared LEDs are adopted. Our proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for a smartphone-based gaze tracking method. Second, facial movement is allowed as long as one eye region is included in the image. Third, the proposed method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflection rectangle and the screen. Fifth, a prototype mock-up module was designed to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of 480×800, and the average hit ratio on a 5×4 icon grid was 94.6%.
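The fourth point, computing gaze from the 2D relation between the glint rectangle and the screen, can be sketched as normalizing the pupil centre within the rectangle spanned by the two LED reflections. All coordinates and the assumed glint-to-corner correspondence below are illustrative, not from the paper:

```python
def gaze_from_glints(pupil, glint_a, glint_b, screen_w=480, screen_h=800):
    """Express the pupil centre as normalized (u, v) coordinates inside
    the axis-aligned rectangle spanned by the two corneal glints
    (assumed here to correspond to opposite screen corners), then
    scale to screen pixels."""
    u = (pupil[0] - glint_a[0]) / (glint_b[0] - glint_a[0])
    v = (pupil[1] - glint_a[1]) / (glint_b[1] - glint_a[1])
    return (u * screen_w, v * screen_h)

# Pupil halfway between the glints maps to the screen centre:
x, y = gaze_from_glints(pupil=(60, 45), glint_a=(50, 40), glint_b=(70, 50))
```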

User-Calibration Free Gaze Tracking System Model (사용자 캘리브레이션이 필요 없는 시선 추적 모델 연구)

  • Ko, Eun-Ji;Kim, Myoung-Jun
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.18 no.5
    • /
    • pp.1096-1102
    • /
    • 2014
  • In remote gaze tracking systems using infrared LEDs, calibrating the positions of the reflected glints is essential for computing the pupil position in captured images. However, there are limits to how far errors can be reduced, because the varying head location and the unknown corneal radius enter the calibration process as constants. This study proposes a gaze tracking method based on pupil-corneal reflection that does not require user calibration. Our goal is to eliminate the glint-position correction step, which requires prior calibration, so that the gaze calculation is simplified.

Compensation for Fast Head Movements in a Non-intrusive Eye Gaze Tracking System Using a Kalman Filter (Kalman 필터를 이용한 비접촉식 응시점 추정 시스템에서의 빠른 머리 이동의 보정)

  • Kim, Soo-Chan;Yoo, Jae-Ha;Nam, Ki-Chang;Kim, Deok-Won
    • Proceedings of the KIEE Conference
    • /
    • 2005.05a
    • /
    • pp.33-35
    • /
    • 2005
  • We propose an eye gaze tracking system that works under natural head movements. The system consists of one CCD camera and two front-surface mirrors. The mirrors rotate to follow head movements so as to keep the eye within the camera's view. However, the mirror controller cannot keep up with fast head movements, because the frame rate is generally 30 Hz. To overcome this problem, we apply a Kalman predictor to estimate the next eye position from the current eye image. As a result, our system allows the subject's head to move 50 cm horizontally and 40 cm vertically, at speeds of about 10 cm/sec and 6 cm/sec, respectively. The spatial gaze resolution is about 4.5 degrees in each direction, and the gaze estimation accuracy is 92% under natural head movements.
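The Kalman-predictor idea above, steering the mirrors toward where the eye will be in the next frame rather than where it was last seen, can be sketched with a constant-velocity model. This is a minimal illustration under assumed noise parameters, not the authors' implementation:

```python
import numpy as np

# State = (position, velocity per frame); only position is measured.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity motion model
H = np.array([[1.0, 0.0]])               # measurement picks out position
Q = np.eye(2) * 1e-4                     # assumed process noise
R = np.array([[1e-2]])                   # assumed measurement noise

def step(x, P, z):
    """One correct-then-predict cycle: fuse the measured eye position z,
    then project one frame ahead for the mirror controller to aim at."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return F @ x, F @ P @ F.T + Q        # predicted state for the next frame

x, P = np.zeros(2), np.eye(2)
for z in [0.0, 1.0, 2.0, 3.0]:           # eye moving one unit per frame
    x, P = step(x, P, np.array([z]))
# x[0] is now the predicted position for the upcoming frame (close to 4)
```

After a few frames of steady motion the predictor leads the measurements by one frame, which is exactly the slack a 30 Hz mirror controller needs.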


A Gaze Tracking based on the Head Pose in Computer Monitor (얼굴 방향에 기반을 둔 컴퓨터 화면 응시점 추적)

  • 오승환;이희영
    • Proceedings of the IEEK Conference
    • /
    • 2002.06c
    • /
    • pp.227-230
    • /
    • 2002
  • In this paper we concentrate on the overall gaze direction based on head pose for human-computer interaction. To determine the user's gaze direction in an image, it is important to extract facial features accurately. To do this, we binarize the input image and locate the two eyes and the mouth using the similarity of each block (aspect ratio, size, and average gray value) and the geometric structure of the face in the binarized image. To determine the head orientation, we create an imaginary plane on the lines formed by the features of the real face and the pinhole of the camera; we call it the virtual facial plane. The position of the virtual facial plane is estimated from the facial features projected onto the image plane. We then find the gaze direction using the surface normal vector of the virtual facial plane. Because this study uses a popular PC camera, it will contribute to the practical use of gaze tracking technology.
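The final step, taking the surface normal of the virtual facial plane as the gaze direction, reduces to a cross product of two in-plane vectors through the feature points. The feature coordinates below are invented for illustration:

```python
def gaze_direction(eye_l, eye_r, mouth):
    """Unit surface normal of the plane through the two eyes and the
    mouth, taken as the coarse gaze direction of the head."""
    u = tuple(b - a for a, b in zip(eye_l, eye_r))   # eye-to-eye vector
    v = tuple(b - a for a, b in zip(eye_l, mouth))   # eye-to-mouth vector
    n = (u[1] * v[2] - u[2] * v[1],                  # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    mag = sum(c * c for c in n) ** 0.5
    return tuple(c / mag for c in n)

# Frontal face lying in the z = 0 plane: the normal points along the z axis.
n = gaze_direction((-3, 0, 0), (3, 0, 0), (0, -5, 0))
```

Head rotation tilts this plane, and the rotated normal gives the overall direction of regard without needing to track the pupils at all.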


An Experimental Multimodal Command Control Interface for Car Navigation Systems

  • Kim, Kyungnam;Ko, Jong-Gook;Choi, SeungHo;Kim, Jin-Young;Kim, Ki-Jung
    • Proceedings of the IEEK Conference
    • /
    • 2000.07a
    • /
    • pp.249-252
    • /
    • 2000
  • An experimental multimodal system combining natural input modes such as speech, lip movement, and gaze is proposed in this paper. It benefits from novel human-computer interaction (HCI) modalities and from multimodal integration to tackle the HCI bottleneck problem. The system allows the user to select menu items on the screen by employing speech recognition, lip reading, and gaze tracking components in parallel. Face tracking is a supplementary component to gaze tracking and lip movement analysis. These key components are reviewed, and preliminary results are shown for multimodal integration and user testing on the prototype system. Notably, the system equipped with gaze tracking and lip reading is very effective in noisy environments, where speech recognition is inaccurate and unstable. Our long-term interest is to build a user interface embedded in a commercial car navigation system (CNS).


Designing Real-time Observation System to Evaluate Driving Pattern through Eye Tracker

  • Oberlin, Kwekam Tchomdji Luther;Jung, Euitay
    • Journal of Korea Multimedia Society
    • /
    • v.25 no.2
    • /
    • pp.421-431
    • /
    • 2022
  • The purpose of this research is to determine the driver's points of fixation during the process of driving. Based on the results, a driving instructor can judge what a trainee stares at most. Traffic accidents have become a serious concern in modern society, and accidents among unskilled and elderly drivers are a particular issue. A driver must pay attention to surrounding vehicles, traffic signs, passersby, passengers, the road situation, and the dashboard. An eye-tracking-based application was developed to analyze the driver's gaze behavior: a prototype for real-time eye tracking that monitors drivers' points of interest during driving practice. In this study, the driver's attention was measured by capturing eye movements under real road driving conditions using these tools. As a result, dwell duration, entry time, and average fixation of the eye gaze proved to be the leading parameters supporting the idea of this study.
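The metrics named in this abstract, entry time and dwell duration per area of interest, can be computed directly from fixation records. The record layout and AOI names below are assumptions for illustration:

```python
def aoi_metrics(fixations, aoi):
    """Entry time (start of the first fixation on the AOI) and total
    dwell time, from (start_time, duration, aoi_name) fixation records."""
    hits = [(t, d) for t, d, name in fixations if name == aoi]
    if not hits:
        return None, 0.0
    entry = min(t for t, _ in hits)          # first entry into the AOI
    dwell = sum(d for _, d in hits)          # accumulated gaze time
    return entry, dwell

# Invented fixation log from a short driving segment (times in seconds):
fx = [(0.0, 0.3, "mirror"), (0.5, 0.2, "road"),
      (0.9, 0.4, "road"), (1.5, 0.1, "mirror")]
entry, dwell = aoi_metrics(fx, "road")
```

Comparing these per-AOI numbers between a trainee and an experienced driver is the kind of judgement the instructor-facing tool described above would support.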