• Title/Summary/Keyword: pupil center detection


A New Circle Detection Algorithm for Pupil and Iris Segmentation from the Occluded RGB images

  • Hong Kyung-Ho
    • International Journal of Contents / v.2 no.3 / pp.22-26 / 2006
  • In this paper we introduce a new circle detection algorithm for extracting occluded ("on/off") pupil and iris boundaries. The proposed algorithm uses a 7-step procedure based on the geometric properties of chords to detect the center and radius in occluded on/off eye images. It handles two types of occluded pupil/iris boundary information: "occluded on" iris images, which contain circle-shaped but incomplete objects, and "occluded off" iris images, which contain arc objects from which part of the circular information has disappeared. The method can recover the center and radius of the iris boundary from as little as one-third of the occluded on/off boundary information, even when the boundary is partially occluded or missing. Experimental results on RGB and IR images show encouraging boundary-detection performance for pupil and iris segmentation: the algorithm satisfactorily detects circles from incomplete, occluded circle information as well as the pupil/iris boundary circle in occluded on/off images.
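
The paper's 7-step chord-based procedure is not reproduced here; the following is only a minimal sketch of the general idea that a circle's center and radius can be recovered from a partial arc, using an algebraic least-squares (Kasa) circle fit. All values are synthetic.

```python
import numpy as np

def fit_circle(points):
    """Least-squares circle fit: x^2 + y^2 + a*x + b*y + c = 0."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    (a_coef, b_coef, c_coef), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -a_coef / 2.0, -b_coef / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - c_coef)
    return (cx, cy), r

# Synthetic test: one third of a circle (120 degrees) centered at (50, 40), r = 25.
theta = np.linspace(0.0, 2 * np.pi / 3, 60)
arc = np.column_stack([50 + 25 * np.cos(theta), 40 + 25 * np.sin(theta)])
center, radius = fit_circle(arc)
print(center, radius)   # approximately (50.0, 40.0), 25.0
```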

A Fast Pupil Detection Using Geometric Properties of Circular Objects (원형 객체의 기하학적 특성을 이용한 고속 동공 검출)

  • Kwak, Noyoon
    • Journal of Digital Convergence / v.11 no.2 / pp.215-220 / 2013
  • It is a well-known geometric property of a circle that the perpendicular bisector of any chord passes through the center, so the intersection of the perpendicular bisectors of two chords gives the center. This paper presents a fast pupil detection method that uses these properties to find the center and radius of the pupil at high speed when detecting the pupil region for iris segmentation. The proposed method extracts candidate circle points in human eye images using morphological operations, finds two chords from four points on the circular edge, and takes the intersection of the perpendicular bisectors of these two chords as the center. The method not only detects the center and radius of the pupil rapidly but can also find partially occluded pupils in human eye images.
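
A minimal sketch of the geometric idea stated in the abstract (not the full detection pipeline): the intersection of the perpendicular bisectors of two chords gives the circle center. The point values below are illustrative only.

```python
import numpy as np

def center_from_two_chords(p1, p2, p3, p4):
    """Circle center from chords p1-p2 and p3-p4 (all four points on the circle)."""
    p1, p2, p3, p4 = (np.asarray(p, dtype=float) for p in (p1, p2, p3, p4))
    d1, d2 = p2 - p1, p4 - p3              # chord direction vectors
    m1, m2 = (p1 + p2) / 2, (p3 + p4) / 2  # chord midpoints
    # Each perpendicular bisector satisfies d . c = d . m for the center c.
    A = np.vstack([d1, d2])
    b = np.array([d1 @ m1, d2 @ m2])
    return np.linalg.solve(A, b)

# Four points on a circle of radius 5 centered at (10, 20).
pts = [(15, 20), (10, 25), (13, 24), (6, 17)]
print(center_from_two_chords(*pts))   # -> [10. 20.]
```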

Detection of Pupil Center using Projection Function and Hough Transform (프로젝션 함수와 허프 변환을 이용한 눈동자 중심점 찾기)

  • Choi, Yeon-Seok;Mun, Won-Ho;Kim, Cheol-Ki;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2010.10a / pp.167-170 / 2010
  • In this paper, we propose a novel algorithm to detect the pupil center in frontal-view face images. The algorithm first extracts an eye region from the face image using the integral projection function and the variance projection function. Within the eye region, it then detects the pupil center using a circular Hough transform with a Sobel edge mask. Experimental results show good performance in detecting the pupil center on FERET face images.
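
A rough sketch of this kind of pipeline, not the authors' exact method: projection functions to localize a coarse eye band, then OpenCV's circular Hough transform (which uses Sobel-based gradients internally) to find the pupil circle. The file name, band size, and thresholds are assumptions.

```python
import cv2
import numpy as np

def integral_projection(gray):
    """Integral projection: mean intensity per row and per column."""
    return gray.mean(axis=1), gray.mean(axis=0)

def variance_projection(gray):
    """Variance projection: intensity variance per row and per column."""
    return gray.var(axis=1), gray.var(axis=0)

gray = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
row_var, _ = variance_projection(gray)

# Rows with high variance tend to cross the eye/eyebrow structures; take a
# crude band around the strongest row as the eye region (illustrative only).
top = max(int(np.argmax(row_var)) - 40, 0)
eye_band = gray[top:top + 80, :]

circles = cv2.HoughCircles(eye_band, cv2.HOUGH_GRADIENT, dp=1, minDist=30,
                           param1=100, param2=20, minRadius=5, maxRadius=40)
if circles is not None:
    x, y, r = circles[0, 0]
    print("pupil center:", (x, y + top), "radius:", r)
```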

Webcam-Based 2D Eye Gaze Estimation System By Means of Binary Deformable Eyeball Templates

  • Kim, Jin-Woo
    • Journal of information and communication convergence engineering / v.8 no.5 / pp.575-580 / 2010
  • Eye gaze as a form of input was primarily developed for users who are unable to use usual interaction devices such as the keyboard and mouse; however, with increasing accuracy and decreasing cost of eye gaze detection, it is likely to become a practical interaction method for able-bodied users in the near future as well. This paper explores a low-cost, robust, rotation- and illumination-independent eye gaze system for gaze-enhanced user interfaces. We introduce two new algorithms for fast, sub-pixel-precise pupil center detection and 2D eye gaze estimation based on deformable template matching. First, we detect the eye regions with the Intel OpenCV AdaBoost Haar cascade classifiers and assign an approximate eyeball size depending on the eye region size. Second, the pupil center is detected using the DAISMI (Deformable Angular Integral Search by Minimum Intensity) algorithm, which localizes the eyeball (iris outer boundary) in gray-scale eye region images by searching for minimum intensity values. The image is then binarized using the percentage of black pixels over the eyeball circle area for use in the next stage, the DTBGE (Deformable Template Based 2D Gaze Estimation) algorithm. Finally, starting from the initial pupil center coordinates, DTBGE refines the pupil center and estimates the final gaze direction and eyeball size. We have performed extensive experiments and achieved very encouraging results, and we discuss the effectiveness of the proposed method through several experimental results.
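
A loose sketch of the front end described above, not the DAISMI/DTBGE algorithms themselves: detect the eye region with an OpenCV Haar cascade, then take the centroid of the darkest pixels as a crude pupil-center guess. The cascade path, input file, and the 5% intensity threshold are assumptions.

```python
import cv2
import numpy as np

# Standard eye cascade shipped with the opencv-python package.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
gray = cv2.imread("webcam_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frame

for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
    eye = gray[y:y + h, x:x + w]
    # Threshold at the darkest 5% of the eye patch; the pupil usually lies there.
    thresh = np.percentile(eye, 5)
    ys, xs = np.nonzero(eye <= thresh)
    cx, cy = x + xs.mean(), y + ys.mean()
    print("approximate pupil center:", (cx, cy))
```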

A STUDY ON PUPIL DETECTION AND TRACKING METHODS BASED ON IMAGE DATA ANALYSIS

  • CHOI, HANA;GIM, MINJUNG;YOON, SANGWON
    • Journal of the Korean Society for Industrial and Applied Mathematics / v.25 no.4 / pp.327-336 / 2021
  • In this paper, we introduce image processing methods for remote pupillary light reflex measurement using video taken by an ordinary smartphone camera, without a special device such as an infrared camera. We propose an algorithm that estimates the size of the pupil as it changes with light using image data analysis, without a learning process. We also present results of visualizing the change in pupil size after removing noise from the per-frame pupil size measurements. We expect this study to contribute to an objective indicator for remote pupillary light reflex measurement, as non-face-to-face communication has become common due to COVID-19 and the demand for remote diagnosis is increasing.
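
A minimal sketch of the per-frame measurement plus noise removal described above. The dark-pixel area estimate and the sliding-median smoothing are assumptions standing in for the paper's actual analysis; the video file name is hypothetical.

```python
import cv2
import numpy as np

def pupil_radius(gray_eye, dark_thresh=60):
    """Estimate pupil radius from the area of dark pixels (illustrative only)."""
    area = int(np.count_nonzero(gray_eye < dark_thresh))
    return np.sqrt(area / np.pi)

cap = cv2.VideoCapture("pupil_video.mp4")        # hypothetical smartphone video
radii = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    radii.append(pupil_radius(gray))
cap.release()

# Remove frame-to-frame noise with a sliding median filter before visualizing.
radii = np.array(radii)
k = 7
smoothed = np.array([np.median(radii[max(i - k, 0):i + k + 1])
                     for i in range(len(radii))])
```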

Iris Localization using the Pupil Center Point based on Deep Learning in RGB Images (RGB 영상에서 딥러닝 기반 동공 중심점을 이용한 홍채 검출)

  • Lee, Tae-Gyun;Yoo, Jang-Hee
    • Journal of Software Assessment and Valuation / v.16 no.2 / pp.135-142 / 2020
  • In this paper, we describe an iris localization method for RGB images. Most iris localization methods are developed for infrared images, so a method for RGB images is required for various applications. The proposed method consists of four stages: i) detection of candidate irises in the input image using the circular Hough transform (CHT), ii) detection of the pupil center based on deep learning, iii) determination of the iris using the pupil center, and iv) correction of the iris region. The candidate irises are ranked by the number of intersections of center-point candidates after generating the Hough space, and the iris among the candidates is selected based on the detected pupil center. The error caused by distortion of the iris shape is then corrected by finding new boundary points based on the detected iris center. In experiments, the proposed method improves accuracy by about 27.4% compared to the CHT method.
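
A small sketch of stage iii) as described: among circular Hough transform candidates, keep the circle whose center lies closest to the pupil center returned by the deep-learning detector. The candidate list and the pupil center below are dummy values.

```python
import numpy as np

def select_iris(candidates, pupil_center):
    """candidates: array of (x, y, r); pupil_center: (x, y)."""
    candidates = np.asarray(candidates, dtype=float)
    d = np.linalg.norm(candidates[:, :2] - np.asarray(pupil_center), axis=1)
    return candidates[np.argmin(d)]

candidates = [(120, 80, 42), (60, 75, 40), (118, 83, 45)]   # hypothetical CHT output
pupil_center = (119, 81)                                    # hypothetical detector output
print(select_iris(candidates, pupil_center))                # -> [120. 80. 42.]
```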

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information / v.21 no.10 / pp.11-19 / 2016
  • Displays have become large and varied in form, which does not suit previous gaze tracking methods; mounting the gaze-tracking camera above the display can solve the problem of display size or height, but such a setup cannot use the corneal reflections of infrared illumination that previous methods rely on. In this paper, we propose a pupil detection method that is robust to eye occlusion and a method that simply calculates the gaze position from the inner eye corner point, the pupil center, and face pose information. In the proposed method, frames for gaze tracking are captured by switching the camera between wide-angle and narrow-angle modes according to the person's position: if a face is detected in the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode after calculating the face position, so that the narrow-angle frames contain the gaze direction information of a person at a long distance. Gaze calculation consists of a face pose estimation step and a gaze direction calculation step. Face pose is estimated by mapping feature points of the detected face to a 3D model. To calculate the gaze direction, an ellipse is first detected by splitting the iris edge information of the pupil; if the pupil is occluded, its position is estimated with a deformable template. Then, using the pupil center, the inner eye corner point, and the face pose information, the gaze position on the display is calculated. Experiments at several distances show that the proposed gaze tracking algorithm overcomes the constraints imposed by display form and effectively calculates the gaze direction of a person at a long distance using a single camera.
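
A minimal sketch of the pupil-fitting step described above: fit an ellipse to pupil boundary points with OpenCV and fall back to a simpler estimate when occlusion leaves too few points. The paper's deformable-template fallback is not reproduced; the mean-of-points fallback and the synthetic half-ellipse input are assumptions.

```python
import cv2
import numpy as np

def fit_pupil(edge_points):
    """edge_points: (N, 2) array of candidate pupil boundary pixels."""
    pts = np.asarray(edge_points, dtype=np.float32).reshape(-1, 1, 2)
    if len(pts) >= 5:                       # cv2.fitEllipse needs at least 5 points
        center, axes, angle = cv2.fitEllipse(pts)
        return center, axes, angle
    # Crude fallback under heavy occlusion (the paper uses a deformable template).
    c = pts.reshape(-1, 2).mean(axis=0)
    return (float(c[0]), float(c[1])), None, None

# Hypothetical boundary points: only the upper half of the pupil is visible.
theta = np.linspace(0, np.pi, 30)
pts = np.column_stack([100 + 20 * np.cos(theta), 80 + 12 * np.sin(theta)])
print(fit_pupil(pts))
```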

Design and Implementation of Eye-Gaze Estimation Algorithm based on Extraction of Eye Contour and Pupil Region (눈 윤곽선과 눈동자 영역 추출 기반 시선 추정 알고리즘의 설계 및 구현)

  • Yum, Hyosub;Hong, Min;Choi, Yoo-Joo
    • The Journal of Korean Association of Computer Education / v.17 no.2 / pp.107-113 / 2014
  • In this study, we design and implement an eye-gaze estimation system based on extraction of the eye contour and pupil region. To extract them effectively, face candidate regions are extracted first: a YCbCr value range for typical Asian face color, defined from a prior study of Asian face images, is used for face detection, and the largest skin-color region is taken as the face candidate region. The eye regions are then extracted by applying contour and color feature analysis to the upper 50% of the face candidate region. Each detected eye region is divided into three segments, the pupil pixels in each segment are counted, and the eye gaze is classified into one of three directions (left, center, right) according to those counts. In experiments using 5,616 images of 20 test subjects, the eye gaze was estimated with about 91 percent accuracy.
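
A rough sketch of this kind of pipeline; the YCbCr skin range, the dark-pixel threshold, and the input file are placeholder values rather than the ones calibrated in the paper, and the contour/color eye-region extraction step is omitted.

```python
import cv2
import numpy as np

frame = cv2.imread("frame.png")                             # hypothetical input
ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))    # assumed skin-color range

# Largest skin-colored blob as the face candidate region; eyes lie in its upper half.
contours, _ = cv2.findContours(skin, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
upper_face = frame[y:y + h // 2, x:x + w]

def gaze_from_eye(gray_eye, dark_thresh=60):
    """Classify gaze as left/center/right by counting dark (pupil) pixels
    in three equal-width segments of a detected eye region."""
    thirds = np.array_split(gray_eye < dark_thresh, 3, axis=1)
    counts = [int(np.count_nonzero(t)) for t in thirds]
    return ("left", "center", "right")[int(np.argmax(counts))]
```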

Real-time pupil center detection for gaze tracking (시선추적을 위한 실시간 동공 중심 검출)

  • Lee, Gyung-Ju;Kim, Gye-Young
    • Proceedings of the Korean Society of Computer Information Conference / 2014.01a / pp.59-61 / 2014
  • In this paper, we propose an algorithm that detects the pupil center in real time from images acquired with a single camera. The proposed method finds chords of the pupil and computes the pupil center using the fact that the perpendicular bisector of a chord of a circle passes through the circle's center. First, the pupil center is coarsely detected using the VPF (Variance Projection Function). The circle is then searched around the detected center point to find the exact pupil center. Experiments show that the proposed method is superior in terms of detection rate and processing time.
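
A compact sketch of the coarse-to-fine idea in this abstract: a variance projection function (VPF) gives a rough pupil center, then the midpoints of the horizontal and vertical chords through the dark pupil blob refine it. The binarization threshold and input file are assumptions.

```python
import cv2
import numpy as np

gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)   # hypothetical eye image

# Coarse center: row and column where intensity variance peaks.
cy0 = int(np.argmax(gray.var(axis=1)))
cx0 = int(np.argmax(gray.var(axis=0)))

# Refinement: binarize the pupil, then take the midpoints of the horizontal
# chord through row cy0 and the vertical chord through column cx0.
pupil = gray < 60
xs = np.nonzero(pupil[cy0, :])[0]
ys = np.nonzero(pupil[:, cx0])[0]
cx = (xs[0] + xs[-1]) / 2.0 if xs.size else cx0
cy = (ys[0] + ys[-1]) / 2.0 if ys.size else cy0
print("refined pupil center:", (cx, cy))
```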

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions:PartB / v.16B no.3 / pp.203-214 / 2009
  • In this paper, the user's gaze position is computed simply from the 2D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without considering the complex 3D relations among the camera, the monitor, and the pupil coordinates. The objectives are therefore to detect the pupil center and the four corneal specular reflections exactly and to compensate for the error factors that affect gaze accuracy. In our method, the kappa error between the gaze position calculated through the pupil center and the actual gaze vector is compensated by a one-time user calibration performed when the system starts. The four corneal specular reflections, which are essential for calculating the gaze position, are detected robustly with a Kalman filter regardless of abrupt eye movements. Experimental results showed a gaze detection error of about 1.0 degrees even under abrupt eye movement.
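
A geometric sketch of the 2D relation described above: the four corneal glints (reflections of the monitor-corner illuminators) define a quadrilateral in the eye image, and the pupil center's position inside it can be mapped to screen coordinates with a perspective transform. Kappa compensation and the Kalman-filtered glint tracking are not reproduced here, and all coordinates are hypothetical.

```python
import cv2
import numpy as np

# Glint positions in the eye image (TL, TR, BR, BL) and the monitor corners in pixels.
glints = np.float32([[310, 205], [352, 207], [350, 240], [308, 238]])
screen = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])

H = cv2.getPerspectiveTransform(glints, screen)
pupil = np.float32([[[330, 222]]])                 # detected pupil center (hypothetical)
gaze = cv2.perspectiveTransform(pupil, H)[0, 0]
print("estimated gaze point on screen:", gaze)
```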