• Title/Summary/Keyword: eye gaze

Search Result: 242

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information / v.21 no.10 / pp.11-19 / 2016
  • Displays are becoming larger and more varied in form, which prevents previous gaze-tracking methods from being applied directly. Mounting the gaze-tracking camera above the display solves the problem of display size and height, but it rules out the conventional use of infrared corneal-reflection information. In this paper, we propose a pupil detection method that is robust to eye occlusion and a simple method for computing the gaze position from the inner eye corner, the pupil center, and the face pose. The proposed system captures frames for gaze tracking by switching the camera between wide-angle and narrow-angle modes according to the person's position: when a face is detected in the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode using the computed face position, and the narrow-angle frame contains the gaze-direction information of a person at long distance. Gaze estimation consists of a face pose estimation step and a gaze calculation step. Face pose is estimated by mapping detected facial feature points to a 3D model. To calculate the gaze direction, an ellipse is first fitted to edge segments of the iris; if the pupil is occluded, its position is estimated with a deformable template. The gaze position on the display is then computed from the pupil center, the inner eye corner, and the face pose. Experiments at several distances demonstrate that the proposed algorithm removes the constraints imposed by display form and effectively computes the gaze direction of a person at long distance using a single camera.
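The final gaze-calculation step this abstract describes, computing a screen position from the pupil center and the inner eye corner, can be sketched as a per-axis linear calibration. This is an illustrative reading, not the authors' implementation; all function names and calibration values below are assumptions.

```python
# Illustrative sketch: map the corner-to-pupil vector to screen
# coordinates with a per-axis linear calibration. Not the paper's
# actual method; names and numbers here are assumptions.

def fit_axis(v0, v1, s0, s1):
    """Fit s = a*v + b from two calibration samples: vector component v
    observed while the user looks at known screen coordinate s."""
    a = (s1 - s0) / (v1 - v0)
    return a, s0 - a * v0

def gaze_point(pupil, corner, calib_x, calib_y):
    """Screen position from the inner-eye-corner-to-pupil-center vector."""
    vx, vy = pupil[0] - corner[0], pupil[1] - corner[1]
    ax, bx = calib_x
    ay, by = calib_y
    return ax * vx + bx, ay * vy + by
```

With two calibration glances per axis (e.g. at opposite screen edges), `fit_axis` yields the slope and offset; the face-pose compensation the abstract mentions would adjust `pupil` and `corner` before this mapping.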

Development of Instruction Consulting Strategy for Improving Science Teacher's Gaze Empathy Using Eye-tracking (과학교사의 시선 공감 향상을 위한 시선 추적 기반 수업 컨설팅 전략 개발)

  • Kwon, Seung-Hyuk;Kwon, Yong-Ju
    • Journal of Science Education / v.42 no.3 / pp.334-351 / 2018
  • A teacher's gaze empathy for students in science class is considered effective in enhancing learning. Studies on gaze empathy have accordingly been conducted, but most merely describe the characteristics of teachers' gaze; research on how to raise the level of science teachers' gaze empathy is therefore needed. The purpose of this study is to develop an eye-tracking-based instruction consulting strategy for improving science teachers' gaze empathy. We selected and analyzed the relevant literature on teacher gaze empathy, designed a consulting strategy, and revised the design through expert reviews of its validity and reliability. The developed strategy aims to improve science teachers' gaze empathy and sets quantitative goals based on eye tracking. It consists of six steps: preparation for consulting, measurement and analysis of the teacher's gaze empathy, instruction and feedback on gaze empathy, training to improve gaze empathy, evaluation of the consulting result, and completion of the consulting. The consultation is completed or repeated according to the measurement and evaluation of gaze empathy using eye tracking. The developed strategy is valuable in that it provides a quantitative alternative for diagnosing and prescribing improvements in gaze empathy, and it can contribute to teachers' professional competency through the analysis of teaching behavior.

A Simple Eye Gaze Correction Scheme Using 3D Affine Transformation and Image In-painting Technique

  • Ko, Eunsang;Ho, Yo-Sung
    • Journal of Multimedia Information System / v.5 no.2 / pp.83-86 / 2018
  • Owing to high-speed internet technologies, video conferencing systems are used at home as well as in workplaces with a laptop or a webcam. Although eye contact is significant in video conferencing, most systems do not support good eye contact because of the improper location of the camera. Several ideas have been proposed to solve the eye contact problem; however, some of them require complicated configurations and expensive customized hardware. In this paper, we propose a simple eye gaze correction method using a three-dimensional (3D) affine transformation. We also apply an image in-painting method to fill the empty holes caused by round-off errors in the coordinate transformation. In experiments, we obtained visually improved results.
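The two stages this abstract pairs, an affine warp whose round-off leaves holes and an in-painting pass that fills them, can be sketched in miniature. A 2D affine and a neighbor-average fill stand in for the paper's 3D transform and in-painting technique; everything below is illustrative, not the authors' code.

```python
# Miniature sketch of warp-then-inpaint. A 2D affine stands in for the
# paper's 3D transform; the hole fill is a simple neighbor average, not
# the in-painting method the paper uses.

def warp_affine(src, m, w, h):
    """Forward-map each source pixel with the affine (a, b, tx, c, d, ty);
    round-off leaves None 'holes' in the w-by-h destination."""
    a, b, tx, c, d, ty = m
    dst = [[None] * w for _ in range(h)]
    for y, row in enumerate(src):
        for x, px in enumerate(row):
            nx, ny = round(a * x + b * y + tx), round(c * x + d * y + ty)
            if 0 <= nx < w and 0 <= ny < h:
                dst[ny][nx] = px
    return dst

def inpaint_holes(img):
    """Fill each None pixel with the average of its filled 8-neighbors."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] is None:
                nbrs = [img[j][i]
                        for j in (y - 1, y, y + 1)
                        for i in (x - 1, x, x + 1)
                        if 0 <= j < h and 0 <= i < w and img[j][i] is not None]
                out[y][x] = sum(nbrs) / len(nbrs) if nbrs else 0
    return out
```

Scaling a 2x2 image up by 2, for example, lands source pixels on every other destination cell; the in-painting pass then fills the gaps between them.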

Gaze Detection System by Wide and Narrow View Camera (광각 및 협각 카메라를 이용한 시선 위치 추적 시스템)

  • Park, Kang-Ryoung (박강령)
    • The Journal of Korean Institute of Communications and Information Sciences / v.28 no.12C / pp.1239-1249 / 2003
  • Gaze detection locates, by computer vision, the position on a monitor screen where a user is looking. Previous gaze detection systems use a wide view camera that can capture the user's whole face, but the image resolution of such a camera is too low to detect the fine movements of the user's eyes exactly. We therefore implement a gaze detection system with both a wide view camera and a narrow view camera. In order to track the position of the user's eye as the face moves, the narrow view camera performs auto focusing and auto pan/tilt based on the detected 3D facial feature positions. In experiments, we obtain the facial and eye gaze position on a monitor with an RMS error between the computed and real positions of about 3.1 cm when facial movements are permitted, and 3.57 cm when both facial and eye movements are permitted. The processing time is short enough for a real-time system (below 30 msec on a Pentium IV 1.8 GHz).
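The RMS accuracy figures quoted in this abstract can be reproduced from raw gaze estimates with a few lines. The sketch below is ours, not the paper's evaluation code; it simply shows the metric being reported.

```python
import math

def rms_error(estimated, actual):
    """Root mean square of Euclidean distances between estimated and true
    gaze points; the result is in the same units as the inputs (e.g. cm)."""
    sq = [(ex - ax) ** 2 + (ey - ay) ** 2
          for (ex, ey), (ax, ay) in zip(estimated, actual)]
    return math.sqrt(sum(sq) / len(sq))
```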

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS) / v.7 no.4 / pp.834-848 / 2013
  • To use a smartphone's many functions effectively, several kinds of human-phone interface are employed, such as touch, voice, and gesture. However, the dominant touch interface cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric position between the user's face and the phone screen, and the low resolution of the front camera. In this paper, a new eye tracking method is proposed as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for a smartphone-based gaze tracking method. Second, facial movement is allowed as long as one eye region remains in the image. Third, the method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflection rectangle and the screen. Fifth, a prototype mock-up module is built to confirm the feasibility of applying the method to an actual smartphone. Experimental results show that the gaze estimation error is about 31 pixels at a screen resolution of 480×800 and that the average hit ratio on a 5×4 icon grid is 94.6%.
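The fourth novelty, recovering the gaze point from two LED glints, can be sketched as normalizing the pupil center inside the rectangle the two glints span and scaling to screen pixels. The axis-aligned reading and all names below are our assumptions, not the paper's exact geometry.

```python
def gaze_from_glints(pupil, glint_a, glint_b, screen_w, screen_h):
    """Map the pupil center into the rectangle spanned by two corneal
    glints (taken as opposite corners), then scale the normalized
    position to screen pixels. Illustrative sketch only."""
    x0, y0 = glint_a
    x1, y1 = glint_b
    u = (pupil[0] - x0) / (x1 - x0)  # normalized horizontal position
    v = (pupil[1] - y0) / (y1 - y0)  # normalized vertical position
    return u * screen_w, v * screen_h
```

A pupil midway between the two glints maps to the screen center; the paper's per-user calibration and screen-mode handling would refine this raw mapping.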

Basic Study on Selective Visual Search by Eyetracking - Image around the Department Store Space - (시선추적을 이용한 선택적 시각탐색에 대한 기초적 연구 - 백화점매장 공간 이미지를 중심으로 -)

  • Park, Sun-Myung;Kim, Jong-Ha
    • Korean Institute of Interior Design Journal / v.24 no.2 / pp.125-133 / 2015
  • Gaze induction characteristics in a space vary depending on its spatial components and display. This study analyzed the dominant eye-fixation characteristics of three zones of a department store space. The eye-fixation characteristics depending on spatial components and positional relationships can be summarized as follows. First, the photographed images were initially stored as compressed [**.jpg] files for pixel-level analysis, and the compression stored images with clear boundaries in neutral colors. To remove this problem, the images used in the analysis were re-processed in black and white and stored in the larger [**.bmp] format, which corrected the errors that unnecessary colors caused during processing. Second, the ratio of gaze to spatial area can be expressed as the gaze strength of each zone: the left store was a zone of "slightly strong" gaze strength at 102.8, the middle space a zone of "extremely weak" gaze strength at 89.6, and the right store a zone of "extremely strong" gaze strength at 117.2. Third, section IV showed strong gaze strength on the middle space and the right store, while section V showed markedly strong gaze strength on the left and right stores. This tendency was the same in section VI, which had the strongest gaze strength overall, with the right store drawing slightly stronger gaze than the left store.
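The gaze strength figures above (102.8, 89.6, 117.2) read as a zone's share of gaze normalized by its share of area, with 100 meaning gaze proportional to area. A hedged sketch of that reading, not the study's published formula:

```python
def gaze_strength(fix_in_zone, total_fix, zone_area, total_area):
    """Zone's share of fixations divided by its share of area, scaled so
    100 means 'gaze proportional to area'. This is our interpretation of
    the abstract's gaze-to-area ratio, not the study's exact formula."""
    return 100 * (fix_in_zone / total_fix) / (zone_area / total_area)
```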

Designing Real-time Observation System to Evaluate Driving Pattern through Eye Tracker

  • Oberlin, Kwekam Tchomdji Luther.;Jung, Euitay
    • Journal of Korea Multimedia Society / v.25 no.2 / pp.421-431 / 2022
  • The purpose of this research is to determine the driver's points of fixation during driving. Based on its results, a driving instructor can judge what a trainee stares at most. Traffic accidents have become a serious concern in modern society, and accidents among unskilled and elderly drivers are a particular issue. A driver must pay attention to the surrounding vehicles, traffic signs, passersby, passengers, the road situation, and the dashboard. An eye-tracking-based application was developed to analyze the driver's gaze behavior: a prototype for real-time eye tracking that monitors drivers' points of interest during driving practice. In this study, the driver's attention was measured by capturing eye movements in real road driving conditions using these tools. As a result, dwell time, entry time, and average fixation proved to be the leading parameters supporting the idea of this study.
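Dwell time and entry time for an area of interest (AOI) can be derived from timestamped gaze samples as below. The sample format, the rectangular AOI, and the uniform-sampling assumption are ours, not the study's.

```python
def aoi_metrics(samples, aoi):
    """Entry time (first timestamp inside the AOI) and dwell time (total
    sampled time inside it) from (t, x, y) gaze samples. Assumes a
    uniform sampling interval; returns (None, 0.0) if never entered."""
    x0, y0, x1, y1 = aoi
    inside = [t for t, x, y in samples if x0 <= x <= x1 and y0 <= y <= y1]
    if not inside:
        return None, 0.0
    dt = samples[1][0] - samples[0][0]  # uniform sampling interval
    return inside[0], len(inside) * dt
```

In a driving study, one AOI per region of concern (mirror, dashboard, road ahead) would yield per-region dwell and entry figures like those the abstract names.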

A Human-Robot Interface Using Eye-Gaze Tracking System for People with Motor Disabilities

  • Kim, Do-Hyoung;Kim, Jae-Hean;Yoo, Dong-Hyun;Lee, Young-Jin;Chung, Myung-Jin
    • Transactions on Control, Automation and Systems Engineering / v.3 no.4 / pp.229-235 / 2001
  • Recently, the service sector has been an emerging field of robotic applications. Even though assistant robots play an important role for the disabled and the elderly, users still have difficulty operating them with conventional interface devices such as joysticks or keyboards. In this paper we propose an efficient computer interface using a real-time eye-gaze tracking system. The inputs to the proposed system are images taken by a camera and data from a magnetic sensor. The measured data are sufficient to describe the eye and head movement because the camera and the receiver of the magnetic sensor are stationary with respect to the head. The proposed system can therefore obtain the eye-gaze direction despite head movement, as long as the distance between the system and the transmitter of the magnetic position sensor is within 2 m. Experimental results show the validity of the proposed system in practical terms and verify its feasibility as a new computer interface for the disabled.

Real Time Eye and Gaze Tracking

  • Park Ho Sik;Nam Kee Hwan;Cho Hyeon Seob;Ra Sang Dong;Bae Cheol Soo
    • Proceedings of the IEEK Conference / 2004.08c / pp.857-861 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.
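The GRNN mapping described here is, at its core, a Gaussian-kernel weighted average of training targets (Nadaraya-Watson regression). A minimal sketch under that reading, with illustrative names and no claim to match the paper's network or features:

```python
import math

def grnn_predict(x, train_x, train_y, sigma=1.0):
    """GRNN prediction: a Gaussian-kernel weighted average of training
    targets. Here x would be a pupil-parameter vector and train_y screen
    coordinates; the toy data in any usage is ours, not the paper's."""
    weights = []
    for xi in train_x:
        d2 = sum((a - b) ** 2 for a, b in zip(x, xi))
        weights.append(math.exp(-d2 / (2 * sigma ** 2)))
    total = sum(weights)
    dims = len(train_y[0])
    return tuple(sum(w * y[k] for w, y in zip(weights, train_y)) / total
                 for k in range(dims))
```

Because the prediction is a smooth interpolation over all calibration samples, no analytical form of the pupil-to-screen mapping is needed, which is the property the abstract highlights.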