• Title/Summary/Keyword: Pupil Detection & Tracking

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information / v.21 no.10 / pp.11-19 / 2016
  • As displays become larger and more varied in form, previous gaze-tracking methods no longer apply directly; mounting the gaze-tracking camera above the display removes the constraints of display size and height, but such a setup cannot use the infrared corneal-reflection information that earlier methods rely on. This paper proposes a pupil-detection method that is robust to eye occlusion and a simple way to compute the gaze position from the pupil center, the inner eye corner, and the face pose. In the proposed method, frames are captured by switching the camera between wide-angle and narrow-angle modes according to the person's position: when a face is detected within the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode based on the computed face position, so that the narrow-angle frames carry the gaze-direction information of a person at long distance. Gaze-direction calculation consists of a face-pose estimation step and a gaze-direction computation step. The face pose is estimated by mapping feature points of the detected face onto a 3D model. For the gaze direction, an ellipse is first fitted to the pupil by splitting the iris edge information; when the pupil is occluded, its position is estimated with a deformable template. The gaze position on the display is then computed from the pupil center, the inner eye corner, and the face pose. Experiments at several distances show that the proposed algorithm relaxes the constraints imposed by display form and effectively computes the gaze direction of a person at long distance using a single camera.
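
As a rough illustration of the pupil-center and eye-corner computation described in this abstract, the sketch below fits an ellipse to the dark pupil blob with OpenCV and forms a simple gaze offset from the pupil center relative to the inner eye corner. It is a hypothetical sketch, not the authors' code: `eye_gray` (a cropped grayscale eye image), `inner_corner`, and the fixed threshold are assumptions, and a real system would use the paper's iris-edge splitting and deformable-template fallback instead of a plain threshold.

```python
# Hypothetical sketch: ellipse-based pupil center plus a pupil-to-eye-corner
# gaze offset. `eye_gray` and `inner_corner` are assumed inputs; this is not
# the paper's implementation.
import cv2
import numpy as np

def pupil_center_by_ellipse(eye_gray):
    # Dark-region threshold isolates the pupil/iris blob.
    _, mask = cv2.threshold(eye_gray, 50, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:                      # fitEllipse needs >= 5 points
        return None
    (cx, cy), _axes, _angle = cv2.fitEllipse(largest)
    return np.array([cx, cy])

def gaze_offset(pupil_center, inner_corner):
    # Simple 2D offset used as a gaze feature; the paper additionally
    # compensates with the estimated face pose.
    return pupil_center - np.asarray(inner_corner, dtype=float)
```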

Visual Modeling and Content-based Processing for Video Data Storage and Delivery

  • Hwang Jae-Jeong;Cho Sang-Gyu
    • Journal of information and communication convergence engineering / v.3 no.1 / pp.56-61 / 2005
  • In this paper, we present a video rate-control scheme for storage and delivery in which time-varying viewing interest is controlled by human gaze. To track the gaze, pupil movement is detected using a three-step process: detecting the face region, the eye region, and the pupil point. To control the bit rate, the quantization parameter (QP) is changed by considering the static parameters, the video-object priority derived from pupil tracking, the target PSNR, and the weighted distortion value of the coder. As a result, we obtain a human-interfaced visual model and a corresponding region-of-interest rate-control system.
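
The gaze-driven QP adjustment could look roughly like the following sketch. The Gaussian priority weighting and the parameter names (`base_qp`, `qp_swing`, `sigma`) are illustrative assumptions, not values or formulas from the paper.

```python
# Hypothetical sketch: lower the quantization parameter (finer quantization) for
# macroblocks near the gaze point and raise it in the periphery.
import math

def qp_for_block(block_center, gaze_point, base_qp=30, qp_swing=8, sigma=80.0):
    """Return a QP for one macroblock given the current gaze point (in pixels)."""
    dx = block_center[0] - gaze_point[0]
    dy = block_center[1] - gaze_point[1]
    # Object priority decays smoothly with distance from the gaze point.
    priority = math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma))
    qp = base_qp + qp_swing * (0.5 - priority)   # high priority -> lower QP
    return max(1, min(51, round(qp)))            # clamp to the H.264 QP range

# A block under the gaze gets a lower QP than a peripheral block:
print(qp_for_block((320, 240), (320, 240)))   # -> 26
print(qp_for_block((0, 0), (320, 240)))       # -> 34
```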

Real-Time Eye Detection and Tracking Under Various Light Conditions

  • Park Ho Sik;Nam Kee Hwan;Seol Jeung Bo;Cho Hyeon Seob;Ra Sang Dong;Bae Cheol Soo
    • Proceedings of the IEEK Conference / 2004.08c / pp.862-866 / 2004
  • Non-intrusive methods based on active remote IR illumination for eye tracking are important for many applications of vision-based man-machine interaction. One problem that has plagued these methods is their sensitivity to changes in lighting conditions, which tends to significantly limit their scope of application. In this paper, we present a new real-time eye detection and tracking methodology that works under variable and realistic lighting conditions. By combining the bright-pupil effect produced by IR light with a conventional appearance-based object recognition technique, our method can robustly track eyes even when the pupils are not very bright due to significant external illumination interference. The appearance model is incorporated in both eye detection and tracking via a support vector machine and mean-shift tracking. Additional improvement is achieved by modifying the image acquisition apparatus, including the illuminator and the camera.
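
A minimal version of the bright-pupil differencing and mean-shift tracking pipeline mentioned here could be sketched as below. It assumes synchronized bright- and dark-pupil grayscale frames (`bright`, `dark`) from on-axis and off-axis IR illumination, and it is an illustration of the general technique rather than the authors' implementation.

```python
# Hypothetical sketch: pupil candidates from bright/dark IR frame differencing,
# then mean-shift tracking of an eye window between detections.
import cv2
import numpy as np

def pupil_candidates(bright, dark, thresh=40):
    diff = cv2.subtract(bright, dark)                 # bright-pupil effect
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centers

def track_eye(prev_window, frame_gray):
    # Mean shift keeps the eye window centered on the dark eye blob when a
    # fresh detection is temporarily unavailable.
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    _, prob = cv2.threshold(frame_gray, 60, 255, cv2.THRESH_BINARY_INV)
    _, new_window = cv2.meanShift(prob, prev_window, criteria)
    return new_window   # (x, y, w, h)
```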

A Development of Video Monitoring System on Real Time (실시간 영상감시 시스템 개발)

  • Cho, Hyun-Seob
    • Journal of the Korea Academia-Industrial cooperation Society / v.8 no.2 / pp.240-244 / 2007
  • Non-intrusive methods based on active remote IR illumination for eye tracking are important for many applications of vision-based man-machine interaction. One problem that has plagued these methods is their sensitivity to changes in lighting conditions, which tends to significantly limit their scope of application. In this paper, we present a new real-time eye detection and tracking methodology that works under variable and realistic lighting conditions. By combining the bright-pupil effect produced by IR light with a conventional appearance-based object recognition technique, our method can robustly track eyes even when the pupils are not very bright due to significant external illumination interference. The appearance model is incorporated in both eye detection and tracking via a support vector machine and mean-shift tracking. Additional improvement is achieved by modifying the image acquisition apparatus, including the illuminator and the camera.
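
To complement the differencing and mean-shift sketch above, the appearance-based verification step (a support vector machine deciding whether a candidate patch is an eye) can be sketched with scikit-learn. The patch size, labels, and SVM parameters are placeholders, not details from the paper.

```python
# Hypothetical sketch: SVM verification of candidate eye patches, standing in
# for the appearance model mentioned in the abstract. Training data is assumed
# to be flattened 20x20 grayscale patches labeled eye (1) / non-eye (0).
import numpy as np
from sklearn.svm import SVC

def train_eye_verifier(patches, labels):
    X = np.asarray(patches, dtype=np.float32).reshape(len(patches), -1) / 255.0
    clf = SVC(kernel="rbf", C=10.0, gamma="scale")
    clf.fit(X, labels)
    return clf

def is_eye(clf, patch):
    # Patch is cut around a bright-pupil candidate before verification.
    x = np.asarray(patch, dtype=np.float32).reshape(1, -1) / 255.0
    return bool(clf.predict(x)[0] == 1)
```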

Webcam-Based 2D Eye Gaze Estimation System By Means of Binary Deformable Eyeball Templates

  • Kim, Jin-Woo
    • Journal of information and communication convergence engineering / v.8 no.5 / pp.575-580 / 2010
  • Eye gaze as a form of input was primarily developed for users who are unable to use usual interaction devices such as the keyboard and mouse; however, with increasing accuracy and decreasing cost of eye-gaze detection, it is likely to become a practical interaction method for able-bodied users in the near future as well. This paper explores a low-cost, robust, rotation- and illumination-independent eye-gaze system for gaze-enhanced user interfaces. We introduce two new algorithms for fast, sub-pixel-precise pupil-center detection and 2D eye-gaze estimation by means of deformable template matching. First, we propose a deformable angular integral search algorithm based on minimum intensity values to localize the eyeball (the iris outer boundary) in grayscale eye-region images; its purpose is to find the pupil center used by the second algorithm, which performs 2D eye-gaze tracking. Eye regions are detected with Intel OpenCV AdaBoost Haar cascade classifiers, and an approximate eyeball size is assigned according to the eye-region size. Next, the pupil center is detected using the DAISMI (Deformable Angular Integral Search by Minimum Intensity) algorithm. The image is then converted to binary (black and white) using the percentage of dark pixels over the eyeball circle area, for use in the next stage: the DTBGE (Deformable Template based 2D Gaze Estimation) algorithm. Finally, DTBGE is initialized with the pupil-center coordinates, refines them, and estimates the final gaze direction and eyeball size. We have performed extensive experiments, achieved very encouraging results, and discuss the effectiveness of the proposed method through several experimental results.
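
A rough approximation of the front end of this pipeline (Haar-cascade eye localization followed by a minimum-intensity search for the pupil center) is sketched below. It only illustrates the idea behind the DAISMI step and is not the published algorithm; the blur size and the darkest-point heuristic are assumptions.

```python
# Hypothetical sketch: Haar-cascade eye detection, then a coarse pupil-center
# estimate as the darkest smoothed point inside the eye box. Loosely inspired
# by the minimum-intensity idea; not DAISMI itself.
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eyes(gray):
    # Returns a list of (x, y, w, h) eye regions.
    return eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def pupil_center(gray, eye_box):
    x, y, w, h = eye_box
    roi = gray[y:y + h, x:x + w]
    blurred = cv2.GaussianBlur(roi, (9, 9), 0)
    # The pupil is typically the darkest region of the eye ROI.
    _, _, min_loc, _ = cv2.minMaxLoc(blurred)
    return (x + min_loc[0], y + min_loc[1])
```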

Development of Tracking Equipment for Real-Time Multiple Face Detection (실시간 복합 얼굴 검출을 위한 추적 장치 개발)

  • 나상동;송선희;나하선;김천석;배철수
    • Journal of the Korea Institute of Information and Communication Engineering / v.7 no.8 / pp.1823-1830 / 2003
  • This paper presents a multiple-face detector based on a robust pupil-detection technique. The pupil detector uses active illumination that exploits the retro-reflectivity property of eyes to facilitate detection. The detection range of this method is appropriate for interactive desktop and kiosk applications. Once the locations of the pupil candidates are computed, the candidates are filtered and grouped into pairs corresponding to faces using heuristic rules. To demonstrate the robustness of the face-detection technique, a dual-mode face tracker was developed, which is initialized with the most salient detected face. Recursive estimators are used to guarantee the stability of the process and to combine the measurements from the multi-face detector and a feature-correlation tracker. The estimated position of the face is used to control a pan-tilt servo mechanism in real time, moving the camera so that the tracked face always remains centered in the image.
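
The heuristic grouping of pupil candidates into eye pairs and a simple recursive smoothing of the tracked face center (standing in for the paper's recursive estimators) could look like this sketch. The separation thresholds and the exponential filter are invented for illustration.

```python
# Hypothetical sketch: pair pupil candidates into faces with simple geometric
# rules, then smooth the face center with a first-order recursive filter before
# driving a pan-tilt controller. Thresholds are illustrative only.
import numpy as np

def pair_pupils(candidates, min_sep=30, max_sep=120, max_dy=15):
    """Group 2D pupil candidates into (left, right) pairs likely to be faces."""
    pairs = []
    for i, a in enumerate(candidates):
        for b in candidates[i + 1:]:
            dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
            if min_sep <= dx <= max_sep and dy <= max_dy:
                pairs.append((a, b))
    return pairs

class FaceSmoother:
    """First-order recursive estimate of the tracked face center."""
    def __init__(self, alpha=0.4):
        self.alpha = alpha
        self.state = None

    def update(self, measurement):
        m = np.asarray(measurement, dtype=float)
        self.state = m if self.state is None else (
            self.alpha * m + (1 - self.alpha) * self.state)
        return self.state   # feed this to the pan-tilt servo loop
```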

A STUDY ON PUPIL DETECTION AND TRACKING METHODS BASED ON IMAGE DATA ANALYSIS

  • CHOI, HANA;GIM, MINJUNG;YOON, SANGWON
    • Journal of the Korean Society for Industrial and Applied Mathematics / v.25 no.4 / pp.327-336 / 2021
  • In this paper, we introduce image-processing methods for remote pupillary light reflex measurement using video taken with an ordinary smartphone camera, without special equipment such as an infrared camera. We propose an algorithm that estimates the size of the pupil as it changes with light, using image-data analysis without a learning process. In addition, we present results of visualizing the change in pupil size after removing noise from the per-frame pupil-size measurements. We expect this study to contribute to an objective indicator for remote pupillary light reflex measurement at a time when non-face-to-face communication has become common due to COVID-19 and the demand for remote diagnosis is increasing.
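
A learning-free, per-frame pupil-size estimate with simple noise removal, in the spirit of what the abstract describes, might be sketched as follows. The dark-pixel threshold and the median-filter window are assumptions, not the paper's values.

```python
# Hypothetical sketch: per-frame pupil size from dark-pixel area inside an eye
# ROI, then a moving-median filter to suppress blink and noise spikes before
# plotting the pupillary light reflex curve.
import cv2
import numpy as np

def pupil_radius(eye_gray, thresh=45):
    _, mask = cv2.threshold(eye_gray, thresh, 255, cv2.THRESH_BINARY_INV)
    area = float(cv2.countNonZero(mask))
    return np.sqrt(area / np.pi)          # radius of an equivalent circle

def smooth_series(radii, window=7):
    radii = np.asarray(radii, dtype=float)
    half = window // 2
    padded = np.pad(radii, half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(radii))])
```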

Detection of Pupil Center using Projection Function and Hough Transform (프로젝션 함수와 허프 변환을 이용한 눈동자 중심점 찾기)

  • Choi, Yeon-Seok;Mun, Won-Ho;Kim, Cheol-Ki;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2010.10a / pp.167-170 / 2010
  • In this paper, we propose a novel algorithm to detect the pupil center in a frontal-view face. The algorithm first extracts the eye region from the face image using the integral projection function and the variance projection function. Within the eye region, the pupil center is detected using a circular Hough transform with a Sobel edge mask. Experimental results show good performance in detecting the pupil center on FERET face images.
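
The projection functions and the circular Hough step can be sketched directly with NumPy and OpenCV, as below. Note that `cv2.HoughCircles` applies its own internal edge detection rather than the Sobel mask the paper mentions, and all thresholds are placeholders.

```python
# Hypothetical sketch: integral and variance projection functions to localize
# the eye band, then a circular Hough transform to find the pupil circle.
import cv2
import numpy as np

def integral_projection(gray, axis=1):
    # Mean intensity of each row (axis=1) or column (axis=0).
    return gray.mean(axis=axis)

def variance_projection(gray, axis=1):
    # Intensity variance of each row/column; eye rows show high variance.
    return gray.var(axis=axis)

def pupil_center_hough(eye_gray):
    blurred = cv2.medianBlur(eye_gray, 5)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=100, param2=15, minRadius=5, maxRadius=40)
    if circles is None:
        return None
    x, y, r = circles[0][0]
    return (float(x), float(y), float(r))
```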

Real Time Eye and Gaze Tracking (트래킹 Gaze와 실시간 Eye)

  • Min Jin-Kyoung;Cho Hyeon-Seob
    • Proceedings of the KAIS Fall Conference / 2004.11a / pp.234-239 / 2004
  • This paper describes preliminary results we have obtained in developing a computer-vision system based on active IR illumination for real-time gaze tracking for interactive graphic display. Unlike most existing gaze-tracking techniques, which often assume a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to individuals not used in training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments involving gaze-contingent interactive graphic display.
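
The GRNN mapping from pupil parameters to screen coordinates is essentially kernel (Nadaraya-Watson) regression over calibration samples; a compact sketch is given below. The contents of the feature vector and the smoothing bandwidth `sigma` are assumptions, not the paper's configuration.

```python
# Hypothetical sketch: a Generalized Regression Neural Network mapping
# pupil-parameter feature vectors to screen coordinates.
import numpy as np

class GRNN:
    def __init__(self, sigma=0.5):
        self.sigma = sigma

    def fit(self, features, screen_points):
        # Store calibration samples: pupil parameters -> known screen targets.
        self.X = np.asarray(features, dtype=float)
        self.Y = np.asarray(screen_points, dtype=float)
        return self

    def predict(self, feature):
        d2 = np.sum((self.X - np.asarray(feature, dtype=float)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))
        w /= (w.sum() + 1e-12)
        return w @ self.Y          # weighted average of calibration targets

# Example: two calibration samples, then a prediction for a new feature vector.
grnn = GRNN(sigma=0.3).fit([[0.1, 0.2], [0.8, 0.7]], [[100, 200], [900, 700]])
print(grnn.predict([0.45, 0.45]))
```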

Gaze Tracking Using a Modified Starburst Algorithm and Homography Normalization (수정 Starburst 알고리즘과 Homography Normalization을 이용한 시선추적)

  • Cho, Tai-Hoon;Kang, Hyun-Min
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.5 / pp.1162-1170 / 2014
  • In this paper, an accurate remote gaze-tracking method using two cameras is presented, based on a modified Starburst algorithm and homography normalization. The Starburst algorithm, originally developed for head-mounted systems, often fails to detect accurate pupil centers in remote tracking systems with a larger field of view because of heavy noise. A region of interest for the pupil is first found using template matching, and the Starburst algorithm is applied only within this area to yield pupil-boundary candidate points, which are then used in an improved RANSAC ellipse fitting to produce the pupil center. For gaze estimation robust to head movement, an improved homography normalization using four LEDs and a calibration based on high-order polynomials is proposed. Finally, it is shown that the accuracy and robustness of the system are improved by using two cameras rather than one.
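
The homography-normalization step (mapping the pupil center through a homography defined by the four LED corneal glints before the polynomial calibration) might be sketched as follows. The unit-square target, the glint ordering, and the function name are assumptions used only for illustration.

```python
# Hypothetical sketch: homography normalization for remote gaze tracking.
# The four LED glints in the eye image are mapped to a canonical unit square;
# the pupil center expressed in that normalized space is largely invariant to
# head movement and feeds the subsequent polynomial calibration.
import cv2
import numpy as np

UNIT_SQUARE = np.float32([[0, 0], [1, 0], [1, 1], [0, 1]])

def normalize_pupil(glints, pupil_center):
    """glints: four (x, y) image positions of the LED reflections, in a
    consistent corner order matching UNIT_SQUARE."""
    H = cv2.getPerspectiveTransform(np.float32(glints), UNIT_SQUARE)
    p = np.float32([[pupil_center]])                 # shape (1, 1, 2)
    return cv2.perspectiveTransform(p, H)[0, 0]      # normalized (u, v)

# Example with made-up glint and pupil positions:
glints = [(310, 215), (355, 214), (357, 250), (312, 252)]
print(normalize_pupil(glints, (335, 232)))
```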