• Title/Summary/Keyword: Gaze calibration

Search Results: 25

Real Time Eye and Gaze Tracking

  • Min Jin-Kyoung;Cho Hyeon-Seob
    • Proceedings of the KAIS Fall Conference / 2004.11a / pp.234-239 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

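The GRNN mapping these papers describe is, in essence, Nadaraya-Watson kernel regression: the predicted screen coordinate is a Gaussian-weighted average of the calibration targets, so no analytical form of the mapping is ever assumed. A minimal sketch with made-up 2-D features (the papers use richer pupil parameter vectors, but the mechanics are identical):

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.2):
    """Generalized regression neural network, i.e. Nadaraya-Watson kernel
    regression: the output is a Gaussian-weighted average of the training
    targets, so the mapping need not have an analytical form."""
    d2 = np.sum((X_train - x) ** 2, axis=1)       # squared distances to x
    w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian kernel weights
    return (w[:, None] * Y_train).sum(axis=0) / w.sum()

# Toy calibration set (hypothetical): 2-D pupil features mapped to the
# four corners of an 800x600 screen.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Y = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 600.0], [800.0, 600.0]])

print(grnn_predict(X, Y, np.array([0.05, 0.0])))  # lands near the (0, 0) corner
```

With a small kernel width the nearest calibration point dominates; widening `sigma` blends neighboring targets, which is what lets the mapping interpolate smoothly between calibration points.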

Non-intrusive Calibration for User Interaction based Gaze Estimation

  • Lee, Tae-Gyun;Yoo, Jang-Hee
    • Journal of Software Assessment and Valuation / v.16 no.1 / pp.45-53 / 2020
  • In this paper, we describe a new method for acquiring calibration data from the user interactions that occur continuously during web browsing, and for performing calibration naturally while estimating the user's gaze. The proposed non-intrusive calibration tunes the pre-trained gaze estimation model with the obtained data so that it adapts to a new user. To achieve this, a generalized CNN model for gaze estimation is trained first; the non-intrusive calibration then adapts it quickly to new users through online learning. In experiments, the gaze estimation model is calibrated with combinations of various user interactions to compare performance, and improved accuracy is achieved compared to existing methods.
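As an illustration of this style of on-line adaptation (not the paper's actual CNN tuning procedure), one can fit a least-squares affine correction between a generic model's gaze predictions and the screen points the user actually clicked. All data below are synthetic:

```python
import numpy as np

def fit_affine_correction(pred, clicks):
    """Least-squares affine map that nudges a generic gaze model's
    predictions toward the screen points the user actually clicked."""
    A = np.hstack([pred, np.ones((len(pred), 1))])   # rows: [x, y, 1]
    M, *_ = np.linalg.lstsq(A, clicks, rcond=None)   # 3x2 affine parameters
    return M

def apply_correction(M, pred):
    """Apply the fitted correction to raw gaze predictions."""
    return np.hstack([pred, np.ones((len(pred), 1))]) @ M

# Synthetic interaction data: the generic model is scaled by 1.05 and
# biased by (40, -25) pixels relative to the true click positions.
rng = np.random.default_rng(1)
clicks = rng.uniform(0, 1000, size=(20, 2))          # interaction targets (px)
pred = clicks * 1.05 + np.array([40.0, -25.0])       # biased raw predictions
M = fit_affine_correction(pred, clicks)
corrected = apply_correction(M, pred)                 # now matches the clicks
```

Because the synthetic error here is exactly affine, the fit removes it completely; with real interaction data the correction would only reduce, not eliminate, the per-user error.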

Real Time Eye and Gaze Tracking

  • Park Ho Sik;Nam Kee Hwan;Cho Hyeon Seob;Ra Sang Dong;Bae Cheol Soo
    • Proceedings of the IEEK Conference / 2004.08c / pp.857-861 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.


User-Calibration Free Gaze Tracking System Model

  • Ko, Eun-Ji;Kim, Myoung-Jun
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.5 / pp.1096-1102 / 2014
  • In remote gaze tracking systems that use infrared LEDs, calibrating the position of the reflected light is essential for computing the pupil position in captured images. However, error reduction is limited because the variable head position and the unknown corneal radius enter the calibration process as constants. This study proposes a gaze tracking method based on pupil-corneal reflection that does not require user calibration. Our goal is to eliminate the glint-position correction step, which requires a prior calibration, so that the gaze calculation is simplified.
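For context, pupil-corneal reflection (PCCR) trackers use the vector from the corneal glint to the pupil center as the gaze feature, classically mapped to screen coordinates by a per-user polynomial fit on a calibration grid; it is exactly this per-user step that calibration-free methods such as the one above aim to remove. A sketch of the classical pipeline with synthetic data:

```python
import numpy as np

def pupil_glint_vector(pupil, glint):
    """PCCR feature: vector from the corneal glint (IR reflection) to the
    pupil center, largely invariant to small head translations."""
    return np.asarray(pupil, float) - np.asarray(glint, float)

def poly_features(v):
    x, y = v
    return np.array([1.0, x, y, x * y, x * x, y * y])  # 2nd-order terms

def fit_gaze_map(vectors, screen_pts):
    """Classical per-user calibration: least-squares 2nd-order polynomial
    from pupil-glint vectors to screen coordinates."""
    F = np.array([poly_features(v) for v in vectors])
    C, *_ = np.linalg.lstsq(F, screen_pts, rcond=None)
    return C

# Synthetic 9-point calibration grid; the true mapping here is linear,
# so the polynomial fit recovers it exactly.
rng = np.random.default_rng(2)
vecs = rng.uniform(-10, 10, size=(9, 2))   # pupil-glint vectors (px)
screen = vecs * 80.0 + 500.0               # corresponding screen points (px)
C = fit_gaze_map(vecs, screen)
```

The 9-point grid and the linear ground truth are illustrative assumptions; real calibrations must cope with noise, head motion, and the nonlinearity of the cornea.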

Real Time Gaze Discrimination for Human Computer Interaction

  • Park Ho sik;Bae Cheol soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.3C / pp.125-132 / 2005
  • This paper describes a computer vision system based on active IR illumination for a real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.

Real Time Gaze Discrimination for Computer Interface

  • Hwang, Suen-Ki;Kim, Moon-Hwan
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.3 no.1 / pp.38-46 / 2010
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.


A Study on Real Time Gaze Discrimination System using GRNN

  • Lee Young-Sik;Bae Cheol-Soo
    • Journal of the Korea Institute of Information and Communication Engineering / v.9 no.2 / pp.322-329 / 2005
  • This paper describes a computer vision system based on active IR illumination for a real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.

Real Time Eye and Gaze Tracking

  • Cho, Hyeon-Seob;Kim, Hee-Sook
    • Journal of the Korea Academia-Industrial cooperation Society / v.6 no.2 / pp.195-201 / 2005
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.


Real Time Eye and Gaze Tracking

  • Lee Young-Sik;Bae Cheol-Soo
    • Journal of the Korea Institute of Information and Communication Engineering / v.8 no.2 / pp.477-483 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

3-Dimensional Calibration and Performance Evaluation Method for Pupil-labs Mobile Pupil Tracking Device

  • Mun, Ji-Hun;Shin, Dong-Won;Ho, Yo-Sung
    • Smart Media Journal / v.7 no.2 / pp.15-22 / 2018
  • Pupil tracking can serve as an efficient means of providing information, offering convenience to the user when connected with a smart device. In this paper, we measure the distance to the user's gaze point using the pupil tracking device produced by Pupil-labs and analyze the accuracy and precision of the experimental results. The gaze point tracked by the device is compared with the target object in terms of error. Since the mobile pupil tracking device is essentially a camera, calibration must be performed before using the device. Both the commonly used 2-dimensional calibration and a 3-dimensional calibration method are explained. To improve on the accuracy of the 2-dimensional result, the 3-dimensional calibration sets an imaginary plane and performs calibration in various 3-dimensional spaces. Experimental results are analyzed to show the effectiveness of the 3-dimensional calibration, and various usage methods and information obtainable through the device are also introduced.
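Accuracy and precision of a gaze or pupil tracker are conventionally reported as the mean angular offset from the target and the RMS of sample-to-sample differences, respectively. A sketch with hypothetical fixation samples (the paper's exact evaluation protocol may differ):

```python
import numpy as np

def accuracy_deg(gaze, target):
    """Accuracy: mean angular offset between gaze samples and the target
    (all values in degrees of visual angle)."""
    return float(np.mean(np.linalg.norm(gaze - target, axis=1)))

def precision_rms_deg(gaze):
    """Precision: RMS of successive sample-to-sample differences,
    i.e. how stable the signal is during a steady fixation."""
    d = np.diff(gaze, axis=0)
    return float(np.sqrt(np.mean(np.sum(d * d, axis=1))))

# Hypothetical fixation on a target at (0, 0) degrees.
target = np.array([0.0, 0.0])
samples = np.array([[0.5, 0.1], [0.6, 0.0], [0.4, 0.2], [0.5, 0.1]])
acc = accuracy_deg(samples, target)      # about 0.52 degrees
prec = precision_rms_deg(samples)        # 0.2 degrees
```

Note that a tracker can be precise but inaccurate (a stable offset, as in the samples above), which is precisely the kind of error that per-user calibration is meant to remove.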