• Title/Summary/Keyword: Gaze Position Tracking (시선 위치 추적)

Search Results: 63

A Gaze Detection Technique Using a Monocular Camera System (단안 카메라 환경에서의 시선 위치 추적)

  • 박강령;김재희
    • The Journal of Korean Institute of Communications and Information Sciences / v.26 no.10B / pp.1390-1398 / 2001
  • Gaze detection is the technology of identifying the position on a monitor screen that a user is looking at. To determine the gaze position, this paper extracts the facial region and facial feature points from 2D camera images. When the user initially looks at three points on the monitor, the facial feature points exhibit movement, and from this the 3D positions of the feature points are estimated using camera calibration and parameter estimation. Afterwards, when the user looks at another point on the monitor, the changed 3D positions of the facial feature points are obtained using a 3D motion estimation method and an affine transform. From these, the changed facial feature points and the facial plane composed of them are obtained, and the gaze position on the monitor can be computed from the normal of this plane. Experimental results with a 19-inch monitor and a user-to-monitor distance of 50~70 cm showed a gaze position error of about 2.08 inches. This is comparable to the gaze detection performance reported in Rikert's paper (5.08 cm error). However, Rikert's method has the drawback that the distance between the monitor and the user's face must always be fixed, and its gaze detection error increases when natural facial movements (rotation and translation) occur. In addition, their method imposes the constraint that no complex objects appear in the background behind the user's face, and its processing time is quite long. In contrast, the gaze detection method proposed in this paper can be used even in office environments with complex backgrounds, and requires a processing time of about 3 seconds or less (on a 200 MHz Pentium PC).

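The geometric core of the method above, taking the gaze point as the intersection of the facial-plane normal with the monitor plane, can be sketched as follows. The coordinate frame, units, and function names are illustrative assumptions, not the paper's:

```python
import numpy as np

def gaze_point_on_monitor(features_3d, monitor_normal, monitor_point):
    """Intersect the facial-plane normal ray with the monitor plane.
    All coordinates are hypothetical, in a camera frame measured in cm."""
    p0, p1, p2 = features_3d[0], features_3d[1], features_3d[2]
    n = np.cross(p1 - p0, p2 - p0)        # normal of the facial plane
    n = n / np.linalg.norm(n)
    origin = features_3d.mean(axis=0)     # cast the ray from the face centroid
    denom = float(n @ monitor_normal)
    if abs(denom) < 1e-9:
        raise ValueError("gaze ray is parallel to the monitor plane")
    # Solve for t so that origin + t*n lies on the monitor plane.
    t = float((monitor_point - origin) @ monitor_normal) / denom
    return origin + t * n
```

In practice the three (or more) feature points would come from the paper's camera-calibration and 3D motion-estimation stages; here they are taken as given.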

Gaze Detection System by IR-LED based Camera (적외선 조명 카메라를 이용한 시선 위치 추적 시스템)

  • 박강령
    • The Journal of Korean Institute of Communications and Information Sciences / v.29 no.4C / pp.494-504 / 2004
  • Research on gaze detection has advanced considerably and has many applications. Most previous studies rely only on image processing algorithms, so they require long processing times and impose many constraints. In our work, we implement a computer vision system using a single IR-LED-based camera. To detect the gaze position, we locate facial features, which is performed effectively with the IR-LED-based camera and an SVM (Support Vector Machine). When a user gazes at a position on the monitor, we compute the 3D positions of those features based on 3D rotation and translation estimation and an affine transform. The gaze position determined by facial movement is then computed from the normal vector of the plane defined by the computed 3D feature positions. In addition, we use a trained neural network to detect the gaze position from eye movement. Experimental results show that we can obtain the facial and eye gaze position on a monitor, with an accuracy of about 4.2 cm RMS error between the computed and actual positions.

Gaze Detection System using Real-time Active Vision Camera (실시간 능동 비전 카메라를 이용한 시선 위치 추적 시스템)

  • 박강령
    • Journal of KIISE:Software and Applications / v.30 no.12 / pp.1228-1238 / 2003
  • This paper presents a new and practical computer vision method for detecting the monitor position where the user is looking. In general, the user moves both the face and the eyes in order to gaze at a certain monitor position. Previous studies use only one wide-view camera that captures the user's whole face. In that case, the image resolution is too low, and the fine movements of the user's eye cannot be accurately detected. Therefore, we implement the gaze detection system with a dual-camera setup (a wide-view and a narrow-view camera). In order to locate the user's eye position accurately, the narrow-view camera has auto-focusing and auto panning/tilting functionality based on the 3D facial feature positions detected by the wide-view camera. In addition, we use dual IR-LED illuminators to detect facial features and especially eye features. Experimental results show that the system runs in real time, with a gaze position accuracy of about 3.44 cm RMS error between the computed and actual positions.
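The auto panning/tilting step can be illustrated with a simple aiming calculation: given a 3D feature position from the wide-view camera, compute the angles that point the narrow-view camera's optical axis at it. The axis conventions and function name below are assumptions, not taken from the paper:

```python
import math

def pan_tilt_to_target(x, y, z):
    """Pan/tilt angles (degrees) that aim a camera's optical axis (+z)
    at a 3D point (x, y, z) in the camera frame. Hypothetical geometry:
    pan rotates about the y-axis, tilt about the x-axis."""
    pan = math.degrees(math.atan2(x, z))                   # left/right
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))   # up/down
    return pan, tilt
```

A real system would also account for the offset between the two cameras and servo limits; those details are omitted here.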

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • Cho, Hyun-Seob;Ryu, In-Ho;Kim, Hee-Sook
    • Proceedings of the KIEE Conference / 2005.07d / pp.2839-2842 / 2005
  • This paper proposes a new real-time gaze tracking method. Existing gaze tracking methods could produce incorrect results when the user moved the head even slightly, and a calibration procedure had to be performed for each user. The proposed method uses infrared illumination and Generalized Regression Neural Networks (GRNN) to achieve robust and accurate gaze tracking without a calibration procedure, even under large head movements. Using a GRNN made the mapping function smooth, and head movements were appropriately reflected in the gaze mapping function, so tracking remained possible even when the face moved; by generalizing the mapping function, the per-user calibration procedure could be omitted, enabling gaze tracking even for users who did not participate in training. Experimental results showed an average gaze tracking accuracy of 90% when facial movement was present, and an average of 85% for untrained users.

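The GRNN mapping described above amounts to a kernel-weighted average of training outputs (the Nadaraya-Watson form). A minimal sketch, with hypothetical eye/head feature vectors and screen coordinates:

```python
import math

def grnn_predict(train_x, train_y, query, sigma=0.5):
    """Generalized Regression Neural Network: predict by averaging training
    outputs, weighted by a Gaussian kernel on feature distance. train_x could
    hold normalized eye/head features and train_y the corresponding screen
    coordinates (all values hypothetical)."""
    weights = []
    for x in train_x:
        d2 = sum((a - b) ** 2 for a, b in zip(x, query))
        weights.append(math.exp(-d2 / (2 * sigma ** 2)))
    total = sum(weights)
    # Each output dimension is the weight-normalized sum of training outputs.
    return [sum(w * y[k] for w, y in zip(weights, train_y)) / total
            for k in range(len(train_y[0]))]
```

Because the prediction is a smooth function of the features, one trained mapping can generalize across users, which is what lets the paper drop per-user calibration.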

Gaze Detection Based on Facial Features and Linear Interpolation on Mobile Devices (모바일 기기에서의 얼굴 특징점 및 선형 보간법 기반 시선 추적)

  • Ko, You-Jin;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society / v.12 no.8 / pp.1089-1098 / 2009
  • Recently, much research in human-computer interfaces has been devoted to more convenient input devices based on gaze detection technology. Previous research was conducted in desktop environments with large monitors. With the recent increase in mobile device use, the need for gaze-based interfaces in mobile environments has also grown. In this paper, we study a gaze detection method using a UMPC (Ultra-Mobile PC) and its embedded camera, based on face and facial feature detection by an AAM (Active Appearance Model). This paper has the following three novel aspects. First, unlike previous research, we propose a method for tracking the user's gaze position on a mobile device with a small screen. Second, we use an AAM to detect facial feature points. Third, gaze detection accuracy is not degraded by changes in Z distance, owing to normalization of the input features using those obtained in an initial user calibration stage. Experimental results showed a gaze detection error of 1.77 degrees, which was further reduced by mouse dragging based on additional facial movement.

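The linear-interpolation mapping from calibrated feature positions to screen coordinates can be sketched as below. The calibration layout (feature coordinates recorded while the user looked at the screen edges) and all names are illustrative assumptions:

```python
def interp_gaze(fx, fy, calib, screen_w, screen_h):
    """Map a normalized facial-feature position (fx, fy) to screen pixels by
    linear interpolation between calibration extremes. `calib` holds the
    feature coordinates measured while the user looked at the left, right,
    top, and bottom screen edges (hypothetical calibration data)."""
    left, right, top, bottom = calib
    sx = (fx - left) / (right - left) * screen_w
    sy = (fy - top) / (bottom - top) * screen_h
    return sx, sy
```

Normalizing the input features by the calibration-stage values is also what makes the mapping insensitive to Z distance, as the abstract notes.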

Gaze Detection by Computing Facial and Eye Movement (얼굴 및 눈동자 움직임에 의한 시선 위치 추적)

  • 박강령
    • Journal of the Institute of Electronics Engineers of Korea SP / v.41 no.2 / pp.79-88 / 2004
  • Gaze detection is locating, by computer vision, the position on a monitor screen where a user is looking. Gaze detection systems have numerous fields of application: they are applicable to man-machine interfaces that help the handicapped use computers, and to view control in three-dimensional simulation programs. In our work, we implement a computer vision system using a single IR-LED-based camera. To detect the gaze position, we locate facial features, which is performed effectively with the IR-LED-based camera and an SVM (Support Vector Machine). When a user gazes at a position on the monitor, we compute the 3D positions of those features based on 3D rotation and translation estimation and an affine transform. The gaze position determined by facial movement is then computed from the normal vector of the plane defined by the computed 3D feature positions. In addition, we use a trained neural network to detect the gaze position from eye movement. Experimental results show that we can obtain the facial and eye gaze position on a monitor, with an accuracy of about 4.8 cm RMS error between the computed and actual positions.

Gaze Detection System by Wide and Narrow View Camera (광각 및 협각 카메라를 이용한 시선 위치 추적 시스템)

  • 박강령
    • The Journal of Korean Institute of Communications and Information Sciences / v.28 no.12C / pp.1239-1249 / 2003
  • Gaze detection is locating, by computer vision, the position on a monitor screen where a user is looking. Previous gaze detection systems use a wide-view camera that can capture the user's whole face. However, the image resolution of such a camera is too low, and the fine movements of the user's eye cannot be accurately detected. Therefore, we implement the gaze detection system with a wide-view camera and a narrow-view camera. In order to detect the position of the user's eye as it changes with facial movement, the narrow-view camera has auto-focusing and auto pan/tilt functionality based on the detected 3D facial feature positions. Experimental results show that we can obtain the facial and eye gaze position on a monitor, with an accuracy of about 3.1 cm RMS error when facial movement is permitted and 3.57 cm when both facial and eye movement are permitted. The processing time is short enough for a real-time system (below 30 ms on a Pentium-IV 1.8 GHz).

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions:PartB / v.16B no.3 / pp.203-214 / 2009
  • In this paper, we compute the user's gaze position simply from the 2D relations between the pupil center and the four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without considering the complex 3D relations among the camera, monitor, and pupil coordinates. The objectives of this paper are therefore to detect the pupil center and the four corneal specular reflections exactly, and to compensate for the error factors that affect gaze accuracy. In our method, we compensate for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector, performing a one-time user calibration when the system starts. Also, using a Kalman filter, we robustly detect the four corneal specular reflections, which are essential for calculating the gaze position, irrespective of abrupt eye movements. Experimental results showed a gaze detection error of about 1.0 degree even under abrupt eye movements.
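The 2D relation between the pupil center and the four glints can be realized as a homography mapping the glint quadrilateral to the monitor corners; the pupil center, pushed through that homography, gives the gaze point. The sketch below is a standard DLT fit, not necessarily the paper's exact formulation, and all coordinates are hypothetical:

```python
import numpy as np

def homography_from_points(src, dst):
    """Direct linear transform: homography H with dst ~ H @ src, from four
    point correspondences (e.g. the four corneal glints -> monitor corners)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def gaze_from_pupil(pupil, glints, screen_corners):
    """Map the pupil center through the glint-to-screen homography."""
    H = homography_from_points(glints, screen_corners)
    p = H @ np.array([pupil[0], pupil[1], 1.0])
    return p[0] / p[2], p[1] / p[2]
```

The remaining error factors the paper handles (kappa angle, glint dropouts smoothed by the Kalman filter) sit on top of this geometric core.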

Glint Reconstruction Algorithm Using Homography in Gaze Tracking System (시선 추적 시스템에서의 호모그래피를 이용한 글린트 복원 알고리즘)

  • Ko, Eun-Ji;Kim, Myoung-Jun
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.10 / pp.2417-2426 / 2014
  • A remote gaze tracking system calculates gaze from captured images in which infrared LEDs are reflected on the cornea. A glint is the corneal reflection point of an infrared LED. Recently, remote gaze tracking systems have used a number of IR-LEDs to make the system less sensitive to head movement and to eliminate the calibration procedure. In some cases, however, some glints cannot be detected, and then the gaze cannot be calculated. This study examines the patterns of glints that are difficult to detect in a remote gaze tracking system. We then propose an algorithm that reconstructs the positions of missing glints from the other detected glints. Based on this algorithm, we increased the number of valid image frames in gaze tracking experiments, and reduced gaze tracking errors by correcting glint distortion in the reconstruction phase.
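One simplified way to reconstruct a missing glint from the detected ones is to fit a transform from the known LED layout to the three detected glints and map the missing LED's position into the image. Three correspondences determine an affine map, used here as a stand-in for the paper's homography; the layout and names are illustrative assumptions:

```python
import numpy as np

def reconstruct_missing_glint(led_corners, glints, missing_idx):
    """Predict an undetected glint from the three detected ones. The four
    IR-LEDs sit at known monitor corners; an affine transform fitted from
    the LED layout to the detected glints maps the missing LED's corner
    into the image (simplified stand-in for a full homography)."""
    src = [c for i, c in enumerate(led_corners) if i != missing_idx]
    dst = [g for i, g in enumerate(glints) if i != missing_idx]
    A = np.array([[x, y, 1.0] for x, y in src])
    # Solve for the affine coefficients of each image coordinate separately.
    ax = np.linalg.solve(A, np.array([u for u, v in dst], dtype=float))
    ay = np.linalg.solve(A, np.array([v for u, v in dst], dtype=float))
    mx, my = led_corners[missing_idx]
    q = np.array([mx, my, 1.0])
    return float(ax @ q), float(ay @ q)
```

With the glint quadrilateral completed this way, the usual glint-based gaze computation can proceed on frames that would otherwise be discarded.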

Development of Eye Tracker System for Early Childhood (유아용 시선 추적 장치의 개발 연구)

  • Lee, Byungho
    • The Journal of the Korea Contents Association / v.19 no.7 / pp.91-98 / 2019
  • The purpose of this study was to develop and test an eye tracker for early-childhood participants, based on the characteristics of early-childhood eye tracking studies. Eye tracking collects eye movement data from the subject, which provides scientific evidence of human cognition and thinking. The researcher built a do-it-yourself (DIY) eye tracker camera module from general electronic components, and used the ViewPoint analysis software from Arrington Research. The researcher then compared eye tracking data between a DIY eye tracker group and a Tobii Pro eye tracker group, the latter using a professional eye tracking system. Eye tracking data were collected from 52 five-year-old children. The average proportion of valid trials in the two groups was compared with a t-test, and no significant difference was found. This result indicates that the DIY eye tracker can be used to collect valid eye tracking data from young children under certain research environments.
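The group comparison mentioned above is an independent-samples t-test; Welch's version (which does not assume equal variances) can be computed as follows. The sample values in the test are hypothetical, not the study's data:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and degrees of freedom, as might be
    used to compare the proportion of valid trials between two tracker
    groups (input samples are hypothetical)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df
```

The t statistic would then be compared against the t distribution with `df` degrees of freedom to decide significance.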