• Title/Abstract/Keyword: Eye-gaze

244 results found (processing time: 0.023 seconds)

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju; Kim, Jin-Suh; Kim, Gye-Young
    • 한국컴퓨터정보학회논문지 / Vol. 21, No. 10 / pp.11-19 / 2016
  • As displays grow larger and more varied in form, previous gaze-tracking methods no longer apply. Mounting the gaze-tracking camera above the display solves the problem of display size and height, but in this configuration the corneal reflection of infrared illumination used by previous methods is unavailable. This paper proposes a pupil-detection method that is robust to eye occlusion, together with a simple method for computing the gaze position from the inner eye corner, the pupil center, and face-pose information. In the proposed method, the camera captures frames in a wide-angle or narrow-angle mode according to the person's position: if a face is detected in the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode based on the computed face position. The frame captured in narrow-angle mode contains the gaze-direction information of a person at long distance. Gaze estimation consists of a face-pose estimation step and a gaze-direction calculation step. Face pose is estimated by mapping feature points of the detected face onto a 3D model. To calculate the gaze direction, an ellipse is first fitted to edge segments of the iris boundary; if the pupil is occluded, its position is estimated with a deformable template. The gaze position on the display is then computed from the pupil center, the inner eye corner, and the face-pose information. Experiments at various distances demonstrate that the proposed algorithm overcomes constraints on display form and effectively computes the gaze direction of a person at long distance using a single camera.
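The paper's final step, computing a gaze point from the pupil center and the inner eye corner, can be sketched with a toy linear model (the coordinates, gain, and screen offset below are illustrative assumptions, not the paper's calibration, which also folds in face pose):

```python
import numpy as np

def gaze_from_corner_vector(pupil_center, eye_corner, gain, screen_origin):
    """Map the eye-corner-to-pupil-center vector to a screen point
    with a simple gain-plus-offset model."""
    v = np.asarray(pupil_center, float) - np.asarray(eye_corner, float)
    return screen_origin + gain * v

# Hypothetical numbers: pupil 6 px right of and 2 px above the inner corner,
# a gain of 40 screen px per image px, gaze origin at screen center (960, 540).
point = gaze_from_corner_vector((108, 60), (102, 62), 40.0, np.array([960.0, 540.0]))
print(point)  # -> [1200.  460.]
```

In the actual system the gain and origin would come from calibration and be corrected by the estimated face pose rather than fixed constants.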

과학교사의 시선 공감 향상을 위한 시선 추적 기반 수업 컨설팅 전략 개발 (Development of Instruction Consulting Strategy for Improving Science Teacher's Gaze Empathy Using Eye-tracking)

  • 권승혁; 권용주
    • 과학교육연구지 / Vol. 42, No. 3 / pp.334-351 / 2018
  • Teachers' gaze empathy toward students in science classes has been regarded as effective in improving learning. Studies on gaze empathy have accordingly been conducted, but most stop at characterizing gaze behavior, so research aimed at raising science teachers' level of gaze empathy is needed. This study therefore set out to develop an eye-tracking-based instruction consulting strategy for improving science teachers' gaze empathy. To this end, literature related to teachers' gaze empathy was selected and analyzed to design a consulting strategy, which was then refined through expert validity and reliability review. The developed strategy sets quantitative, eye-tracking-based goals for improving teachers' gaze empathy during science classes. The consulting proceeds through the stages of consulting preparation; measurement and analysis of the teacher's gaze empathy; gaze-empathy education and feedback; gaze-empathy improvement training; evaluation of the consulting results; and conclusion of the consulting. Based on eye-tracking measurement and evaluation of gaze empathy, the consulting either concludes or is repeated. The strategy is valuable in that it provides a quantitative diagnosis of teaching behavior and prescription-oriented alternatives for improving gaze empathy, and it can contribute to improving teaching expertise through the analysis of teachers' classroom behavior.

A Simple Eye Gaze Correction Scheme Using 3D Affine Transformation and Image In-painting Technique

  • Ko, Eunsang; Ho, Yo-Sung
    • Journal of Multimedia Information System / Vol. 5, No. 2 / pp.83-86 / 2018
  • Owing to high-speed internet technologies, video conferencing systems are used at home as well as in workplaces, typically with a laptop or a webcam. Although eye contact is significant in video conferencing, most systems do not support good eye contact because of the improper location of the camera. Several ideas have been proposed to solve the eye contact problem; however, some of them require complicated hardware configurations and expensive customized hardware. In this paper, we propose a simple eye gaze correction method using a three-dimensional (3D) affine transformation. We also apply an image in-painting method to fill the empty holes caused by round-off errors in the coordinate transformation. Experiments show visually improved results.
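The hole problem that the in-painting step addresses can be illustrated with a minimal forward warp in NumPy: rounding transformed coordinates to the pixel grid leaves some destination pixels with no source, and those holes are then filled from neighbors (the 2D transform, tiny image, and one-pass neighbor averaging below are simplified stand-ins for the paper's 3D affine transformation and in-painting method):

```python
import numpy as np

def warp_affine_forward(img, A, t):
    """Forward-map each pixel through x' = A x + t, rounding to the nearest
    grid point; destination pixels that receive no source stay holes (-1)."""
    h, w = img.shape
    out = np.full((h, w), -1.0)
    for y in range(h):
        for x in range(w):
            xp, yp = np.rint(A @ np.array([x, y]) + t).astype(int)
            if 0 <= xp < w and 0 <= yp < h:
                out[yp, xp] = img[y, x]
    return out

def fill_holes(out):
    """Naive in-painting: replace each hole with the mean of its valid
    4-neighbors (one pass; real in-painting methods iterate/propagate)."""
    h, w = out.shape
    filled = out.copy()
    for y in range(h):
        for x in range(w):
            if out[y, x] < 0:
                nbrs = [out[j, i] for j, i in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                        if 0 <= j < h and 0 <= i < w and out[j, i] >= 0]
                if nbrs:
                    filled[y, x] = sum(nbrs) / len(nbrs)
    return filled

img = np.arange(36, dtype=float).reshape(6, 6)
A = np.array([[1.3, 0.0], [0.0, 1.0]])   # slight horizontal stretch
warped = warp_affine_forward(img, A, np.zeros(2))   # column 2 ends up a hole
result = fill_holes(warped)
```

With this stretch, no source column rounds to destination column 2, so it is entirely holes before filling; after the fill pass every pixel has a value again.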

광각 및 협각 카메라를 이용한 시선 위치 추적 시스템 (Gaze Detection System by Wide and Narrow View Camera)

  • 박강령
    • 한국통신학회논문지 / Vol. 28, No. 12C / pp.1239-1249 / 2003
  • Gaze detection is the task of determining, by computer vision, the position a user is currently looking at. In general, a user moves both the face and the eyes to look at a position on a monitor. Previous gaze detection systems used a single wide-view camera that captures the user's whole face to track facial and eye movement. In such cases, however, the resolution of the eye region in the wide-view image is too low to track eye movement accurately. This paper therefore implements a gaze detection system with two cameras: a wide-view camera that captures the face image and a narrow-view camera that captures a magnified eye region. In addition, to accurately track the eye, whose overall position shifts with facial movement, the narrow-view camera includes auto-focusing and auto pan/tilt functions based on the positions of facial feature points extracted from the wide-view camera. Experimental results show a least-squares error of the gaze position on the monitor of about 3.1 cm when the subject looks at a position by moving only the face with the eyes fixed, and about 3.57 cm when moving the face and eyes together. Processing time was within about 30 ms on a Pentium-IV 1.8 GHz.

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul; Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 7, No. 4 / pp.834-848 / 2013
  • To use a smartphone's many functions effectively, several kinds of human-phone interfaces are used, such as touch, voice, and gesture. However, the most important of these, the touch interface, cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of front cameras. In this paper, a new eye tracking method is proposed as a smartphone user interface. To maximize eye-image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for smartphone-based gaze tracking. Second, facial movement is allowed as long as one eye region is included in the image. Third, the method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflection rectangle and the screen. Fifth, a prototype mock-up module was made to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed a gaze estimation error of about 31 pixels at a screen resolution of 480×800 and an average hit ratio of 94.6% on a 5×4 icon grid.
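The fourth point, computing the gaze position from two glint (LED reflection) positions, can be sketched as a simple normalization of the pupil position within the glint geometry (a hypothetical simplification: the glint spacing is taken as the unit for both axes, whereas the actual method uses the full 2D relation between the reflection rectangle and the screen):

```python
def gaze_from_glints(pupil, glint_left, glint_right, screen_w=480, screen_h=800):
    """Normalize the pupil position against the two corneal glints,
    then scale to screen coordinates with a simple linear model."""
    gx0, gy0 = glint_left
    gx1, _ = glint_right
    spacing = gx1 - gx0          # glint spacing as the unit length
    u = (pupil[0] - gx0) / spacing
    v = (pupil[1] - gy0) / spacing
    return u * screen_w, v * screen_h

# Illustrative values: glints 40 px apart, pupil midway and slightly below.
sx, sy = gaze_from_glints((120, 60), (100, 50), (140, 50))
print(sx, sy)  # -> 240.0 200.0
```

A real implementation would also compensate for head roll and the changing phone-to-face geometry mentioned in the abstract.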

시선추적을 이용한 선택적 시각탐색에 대한 기초적 연구 - 백화점매장 공간 이미지를 중심으로 - (Basic Study on Selective Visual Search by Eye-tracking - Image around the Department Store Space -)

  • 박선명; 김종하
    • 한국실내디자인학회논문집 / Vol. 24, No. 2 / pp.125-133 / 2015
  • Gaze-induction characteristics in a space vary depending on the characteristics of the spatial components and the display. This study analyzed the dominant eye-fixation characteristics of three zones of a department store space. The eye-fixation characteristics depending on spatial components and positional relationship can be summarized as follows. First, during pixel-level processing of the captured image data, the image was initially stored with the [**.jpg] extension; because of its compressed storage, an image produced with a clear boundary was stored in neutral colors. To remove this problem, the image used in the analysis was re-processed in black and white and stored in the larger [**.bmp] format, which corrected the distortion that unnecessary colors caused during processing. Second, the ratio of gaze to spatial area can be expressed as a gaze strength for each zone: the left store was a zone of "slightly strong" gaze strength at 102.8, the middle space a zone of "extremely weak" gaze strength at 89.6, and the right store a zone of "extremely strong" gaze strength at 117.2. Third, section IV showed a strong gaze strength on the middle space and the right store, and section V showed a markedly strong gaze strength on the left and right stores. The same tendency held in section VI, which had the strongest gaze strength overall, with the right store drawing a slightly stronger gaze strength than the left store.

Designing Real-time Observation System to Evaluate Driving Pattern through Eye Tracker

  • Oberlin, Kwekam Tchomdji Luther; Jung, Euitay
    • 한국멀티미디어학회논문지 / Vol. 25, No. 2 / pp.421-431 / 2022
  • The purpose of this research is to determine the driver's points of fixation during driving. Based on the results, a driving instructor can judge what a trainee stares at the most. Traffic accidents have become a serious concern in modern society, particularly among unskilled and elderly drivers. A driver must pay attention to surrounding vehicles, traffic signs, passersby, passengers, the road situation, and the dashboard. An eye-tracking-based application was developed to analyze the driver's gaze behavior: a prototype for real-time eye tracking that monitors drivers' points of interest during driving practice. In this study, the driver's attention was measured by capturing eye movement under real road-driving conditions using these tools. As a result, dwell duration, entry time, and the average fixation of the eye gaze were found to be the leading parameters supporting the idea of this study.
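The leading parameters named above, entry time and dwell duration for an area of interest (AOI) such as the dashboard or a mirror, can be computed from a stream of gaze samples roughly as follows (the AOI bounds, sampling interval, and sample values are illustrative, not taken from the paper):

```python
def aoi_metrics(samples, aoi, dt=1 / 60):
    """Entry time (time of the first sample inside the AOI) and dwell time
    (total time spent inside) from (x, y) gaze samples at fixed dt spacing.
    AOI is an axis-aligned box (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = aoi
    inside = [x0 <= x <= x1 and y0 <= y <= y1 for x, y in samples]
    entry = next((i * dt for i, hit in enumerate(inside) if hit), None)
    dwell = sum(inside) * dt
    return entry, dwell

# Toy trace with a 1-second sampling interval for readable numbers:
entry, dwell = aoi_metrics(
    [(20, 20), (5, 5), (6, 6), (30, 30), (2, 2)], (0, 0, 10, 10), dt=1.0)
print(entry, dwell)  # -> 1.0 3.0
```

Average fixation duration, the third parameter, would additionally require grouping consecutive inside-samples into fixation events before averaging their lengths.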

A Human-Robot Interface Using Eye-Gaze Tracking System for People with Motor Disabilities

  • Kim, Do-Hyoung; Kim, Jae-Hean; Yoo, Dong-Hyun; Lee, Young-Jin; Chung, Myung-Jin
    • Transactions on Control, Automation and Systems Engineering / Vol. 3, No. 4 / pp.229-235 / 2001
  • Recently, the service sector has been an emerging field of robotic applications. Although assistant robots play an important role for the disabled and the elderly, these users still have difficulty operating the robots with conventional interface devices such as joysticks or keyboards. In this paper we propose an efficient computer interface using a real-time eye-gaze tracking system. The inputs to the proposed system are images taken by a camera and data from a magnetic sensor. Because the camera and the receiver of the magnetic sensor are stationary with respect to the head, the measured data are sufficient to describe both eye and head movement. The proposed system can therefore obtain the eye-gaze direction despite head movement, as long as the distance between the system and the transmitter of the magnetic position sensor is within 2 m. Experimental results show the validity of the proposed system in practical aspects and also verify its feasibility as a new computer interface for the disabled.


Real Time Eye and Gaze Tracking

  • Park Ho Sik; Nam Kee Hwan; Cho Hyeon Seob; Ra Sang Dong; Bae Cheol Soo
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 2004년도 학술대회지 / pp.857-861 / 2004
  • This paper describes preliminary results we have obtained in developing a computer-vision system based on active IR illumination for real-time gaze tracking for interactive graphic display. Unlike most existing gaze-tracking techniques, which often assume a static head and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without per-user calibration and under rather significant head movement. This is made possible by a gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to individuals not used in training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments involving gaze-contingent interactive graphic display.
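The GRNN mapping described above is essentially a Gaussian-kernel-weighted average of training outputs (a Nadaraya-Watson estimator); a minimal sketch follows, where the toy calibration pairs (2D "pupil parameters" mapped to screen corners) and the bandwidth are assumptions for illustration:

```python
import numpy as np

def grnn_predict(x, train_X, train_Y, sigma=1.0):
    """GRNN prediction: Gaussian-kernel-weighted average of the training
    outputs, with kernel width sigma (the smoothing parameter)."""
    d2 = np.sum((train_X - x) ** 2, axis=1)        # squared distances
    w = np.exp(-d2 / (2 * sigma ** 2))             # Gaussian weights
    return w @ train_Y / w.sum()                   # weighted mean output

# Toy calibration set: pupil parameters -> screen coordinates (corners).
train_X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
train_Y = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 600.0], [800.0, 600.0]])
pred = grnn_predict(np.array([0.5, 0.5]), train_X, train_Y, sigma=0.5)
print(pred)  # equidistant from all four corners -> screen center
```

Because the prediction is a smooth interpolation rather than an analytical formula, extra inputs such as head-position parameters can be appended to the feature vector without changing the estimator, which is the property the abstract highlights.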
