• Title/Summary/Keyword: Gaze point

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information / v.21 no.10 / pp.11-19 / 2016
  • Displays are becoming larger and more varied in form, so previous gaze-tracking methods do not apply directly; mounting the gaze-tracking camera above the display can resolve the problems caused by the display's size or height. However, this setup cannot use the corneal reflection of infrared illumination that previous methods rely on. In this paper, we propose a pupil detection method that is robust to eye occlusion and a simple method for computing the gaze position from the pupil center, the inner eye corner, and the face pose. In the proposed method, frames for gaze tracking are captured by switching the camera between wide-angle and narrow-angle modes according to the person's position: if a face is detected within the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode after computing the face position, so that the narrow-angle frame contains the gaze-direction information of a person at long distance. Gaze-direction calculation consists of a face pose estimation step and a gaze computation step. The face pose is estimated by mapping feature points of the detected face to a 3D model. To compute the gaze direction, an ellipse is first fitted to the iris edge information of the pupil, and if the pupil is occluded, its position is estimated with a deformable template. The gaze position on the display is then computed from the pupil center, the inner eye corner, and the face pose information. Experiments at varying distances show that the proposed algorithm removes the constraints imposed by the display's form and effectively computes the gaze direction of a person at long distance using a single camera.
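
One step described above, fitting an ellipse to the pupil/iris edge, could look roughly like the following minimal Python/OpenCV sketch. This is not the authors' implementation; the threshold value and the assumption that the pupil is the darkest blob in an eye crop are illustrative choices.

```python
import cv2
import numpy as np

def fit_pupil_ellipse(eye_gray):
    """Return ((cx, cy), (major, minor), angle) for the pupil, or None if not found."""
    # Assumed heuristic: the pupil is the largest dark blob in the eye crop.
    _, mask = cv2.threshold(eye_gray, 50, 255, cv2.THRESH_BINARY_INV)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)   # largest dark contour
    if len(pupil) < 5:                           # fitEllipse needs at least 5 points
        return None
    return cv2.fitEllipse(pupil)

# Usage: eye = cv2.imread("eye_crop.png", cv2.IMREAD_GRAYSCALE); print(fit_pupil_ellipse(eye))
```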

Gaze Detection by Computing Facial Rotation and Translation (얼굴의 회전 및 이동 분석에 의한 응시 위치 파악)

  • Lee, Jeong-Jun;Park, Kang-Ryoung;Kim, Jai-Hie
    • Journal of the Institute of Electronics Engineers of Korea SP / v.39 no.5 / pp.535-543 / 2002
  • In this paper, we propose a new gaze detection method using 2-D facial images captured by a camera on top of the monitor. We consider only the facial rotation and translation and not the eyes' movements. The proposed method computes the gaze point caused by the facial rotation and the amount of the facial translation respectively, and by combining these two the final gaze point on the monitor screen is obtained. The gaze point caused by the facial rotation is detected by a neural network (a multi-layered perceptron) whose inputs are the 2-D geometric changes of the facial feature points, and the amount of the facial translation is estimated by image-processing algorithms in real time. Experimental results show that the RMS error between the computed gaze positions and the real ones is about 2.11 inches when the distance between the user and a 19-inch monitor is about 50~70 cm. The processing time is about 0.7 second on a Pentium PC (233 MHz) with 320×240-pixel images.
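
A regression of this kind, mapping 2-D geometric changes of facial feature points to a screen position with a multi-layer perceptron, might be sketched as below. The feature layout and training data are hypothetical placeholders, not the paper's network or dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Hypothetical training data: displacements of facial feature points relative to a
# frontal reference pose, paired with normalized (x, y) screen positions.
X_train = rng.normal(size=(500, 8))            # e.g. 4 feature points x (dx, dy)
y_train = rng.uniform(0.0, 1.0, size=(500, 2)) # normalized gaze position on screen

mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)

print(mlp.predict(X_train[:1]))                # predicted normalized screen position
```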

Estimation of a Gaze Point in 3D Coordinates using Human Head Pose (휴먼 헤드포즈 정보를 이용한 3차원 공간 내 응시점 추정)

  • Shin, Chae-Rim;Yun, Sang-Seok
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2021.10a / pp.177-179 / 2021
  • This paper proposes a method of estimating the location of the target point at which an interactive robot should gaze in an indoor space. RGB images are extracted from low-cost webcams, the user's head pose is obtained from a face detection (OpenFace) module, and geometric configurations are applied to estimate the user's gaze direction in 3D space. The coordinates of the target point at which the user stares are finally obtained from the geometric relation between the estimated gaze direction and the table plane.
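
The final step, intersecting the estimated gaze direction with the table plane, is plain ray-plane geometry. A minimal NumPy sketch under assumed camera-centered coordinates (not the authors' code) follows.

```python
import numpy as np

def gaze_point_on_plane(eye_pos, gaze_dir, plane_point, plane_normal):
    """Return the 3-D intersection of the gaze ray with the plane, or None if parallel."""
    eye = np.asarray(eye_pos, dtype=float)
    d = np.asarray(gaze_dir, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = np.dot(n, d)
    if abs(denom) < 1e-9:            # gaze ray parallel to the plane
        return None
    t = np.dot(n, np.asarray(plane_point, dtype=float) - eye) / denom
    if t < 0:                        # plane lies behind the user
        return None
    return eye + t * d

# Example: coordinates in meters, table plane at z = 0 with upward normal.
print(gaze_point_on_plane(eye_pos=[0.0, 0.3, 0.6],
                          gaze_dir=[0.1, -0.4, -0.9],
                          plane_point=[0.0, 0.0, 0.0],
                          plane_normal=[0.0, 0.0, 1.0]))
```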


Reliability Measurement Technique of The Eye Tracking System Using Gaze Point Information (사용자 응시지점 정보기반 시선 추적 시스템 신뢰도 측정 기법)

  • Kim, Byoung-jin;Kang, Suk-ju
    • Journal of Digital Contents Society / v.17 no.5 / pp.367-373 / 2016
  • In this paper, we propose a novel method to improve the accuracy of eye trackers and a way to analyze it. The proposed method builds user profile information from gaze coordinates and color information extracted on the basis of exact pupil information, and uses it to maintain high accuracy on the display. While extracting the user profile information, the change of accuracy with gaze duration is also estimated and the optimal parameter value is obtained. In the experiments on gaze-detection accuracy, the accuracy was low when a user looked at a specific point only briefly; when the gaze was held for more than two seconds, the measured accuracy exceeded 80%.
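
The finding that gaze points become reliable only after roughly two seconds of fixation suggests a simple dwell-time filter. The sketch below is one assumed interpretation, not the paper's method: a gaze point is accepted only after it has stayed within a small radius for a minimum duration.

```python
from dataclasses import dataclass

@dataclass
class DwellFilter:
    radius_px: float = 40.0     # assumed spatial tolerance
    min_dwell_s: float = 2.0    # dwell time after which the point is trusted
    _anchor: tuple = None
    _start_t: float = 0.0

    def update(self, x, y, t):
        """Feed one gaze sample (pixels, seconds); return the anchor once dwell is long enough."""
        moved = (self._anchor is None or
                 ((x - self._anchor[0]) ** 2 + (y - self._anchor[1]) ** 2) ** 0.5 > self.radius_px)
        if moved:
            self._anchor, self._start_t = (x, y), t   # gaze moved: restart dwell timing
            return None
        return self._anchor if (t - self._start_t) >= self.min_dwell_s else None

# Usage: feed samples at 10 Hz; the point is reported once it has been held for 2 s.
f = DwellFilter()
print([f.update(512, 384, i * 0.1) for i in range(25)][-1])
```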

Object detection within the region of interest based on gaze estimation (응시점 추정 기반 관심 영역 내 객체 탐지)

  • Seok-Ho Han;Hoon-Seok Jang
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.16 no.3 / pp.117-122 / 2023
  • Gaze estimation, which automatically recognizes where a user is currently staring, combined with object detection based on the estimated gaze point, can be a more accurate and efficient way to understand human visual behavior. In this paper, we propose a method to detect the objects within a region of interest around the gaze point. Specifically, after estimating the 3D gaze point, a region of interest centered on the estimated gaze point is created so that object detection occurs only within that region. In our experiments, we compared the performance of general object detection and the proposed region-of-interest-based object detection, and found that the processing time per frame was 1.4 ms and 1.1 ms, respectively, indicating that the proposed method is faster in terms of processing speed.
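
The region-of-interest idea above can be illustrated by cropping around the projected gaze point and running any detector on the crop only, then shifting boxes back to full-frame coordinates. This is an illustrative sketch with a dummy detector, not the authors' pipeline.

```python
import numpy as np

def crop_roi(frame, gaze_xy, roi_size=256):
    """Return the ROI around gaze_xy and its top-left offset in the frame."""
    h, w = frame.shape[:2]
    half = roi_size // 2
    x0 = int(np.clip(gaze_xy[0] - half, 0, max(w - roi_size, 0)))
    y0 = int(np.clip(gaze_xy[1] - half, 0, max(h - roi_size, 0)))
    return frame[y0:y0 + roi_size, x0:x0 + roi_size], (x0, y0)

def detect_in_roi(frame, gaze_xy, detector):
    """detector: any callable image -> [(x, y, w, h), ...]; boxes are returned in full-frame coordinates."""
    roi, (x0, y0) = crop_roi(frame, gaze_xy)
    return [(x + x0, y + y0, w, h) for (x, y, w, h) in detector(roi)]

# Usage with a dummy detector in place of a real model:
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
print(detect_in_roi(frame, gaze_xy=(640, 360), detector=lambda img: [(10, 10, 50, 50)]))
```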

Extraction of gaze point on display based on EOG for general paralysis patient (전신마비 환자를 위한 EOG 기반 디스플레이 상의 응시 좌표 산출)

  • Lee, D.H.;Yu, J.H.;Kim, D.H.
    • Journal of rehabilitation welfare engineering & assistive technology / v.5 no.1 / pp.87-93 / 2011
  • This paper proposes a method for extracting the gaze point on a display using the EOG (electrooculography) signal. Based on the linear property of the EOG signal, the proposed method corrects the scaling, rotation, and origin differences between the EOG-derived coordinate system and the display coordinate system, without adjustment for head movement. The performance of the proposed method was evaluated by measuring the difference between the extracted gaze point and a displayed circular target on a monitor with 1680×1050 resolution. Experimental results show average distance errors at the gaze points of 3% (56 pixels) on the x-axis and 4% (47 pixels) on the y-axis. This method can be used as a pointing-device interface for patients with general paralysis or as an HCI for VR game applications.
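
Correcting scaling, rotation, and origin differences between two 2-D coordinate systems amounts to fitting an affine map. A minimal least-squares sketch of that calibration idea (with hypothetical calibration targets, not the paper's data) follows.

```python
import numpy as np

def fit_affine(eog_xy, screen_xy):
    """Fit a 2x3 matrix A such that screen ~= A @ [x_eog, y_eog, 1]."""
    eog = np.hstack([np.asarray(eog_xy, float), np.ones((len(eog_xy), 1))])  # (N, 3)
    A, *_ = np.linalg.lstsq(eog, np.asarray(screen_xy, float), rcond=None)   # (3, 2)
    return A.T                                                                # (2, 3)

def apply_affine(A, eog_point):
    return A @ np.append(np.asarray(eog_point, float), 1.0)

# Hypothetical calibration: the user fixates known on-screen targets while EOG coordinates are recorded.
eog_calib = [[0.1, 0.2], [0.8, 0.2], [0.8, 0.9], [0.1, 0.9]]
screen_calib = [[100, 100], [1580, 120], [1560, 950], [120, 930]]
A = fit_affine(eog_calib, screen_calib)
print(apply_affine(A, [0.45, 0.55]))   # estimated gaze point in display pixels
```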

A Study on the Aesthetic Identity of Modern Eroticism Fashion from the Perspective of Jacques Lacan's Unconscious Theory -Focusing on Jouissance & Gaze Theory- (자크 라캉 무의식이론의 관점에서 본 현대 에로티시즘 패션의 미적 정체성 연구 -주이상스와 응시론을 중심으로-)

  • Jungwon Yang;Misuk Lee
    • Journal of Fashion Business / v.27 no.2 / pp.124-139 / 2023
  • The purpose of this study is to determine the aesthetic identity of modern eroticism fashion, in which the energy of desire is maximized, through the 'jouissance' and 'gaze' of Jacques Lacan's theory of the unconscious. As the research method, the aesthetic identity was derived after examining jouissance and gaze, which are deeply related to eroticism in Lacan's theory of the unconscious, by analyzing domestic and foreign monographs and prior research. The case analysis was limited to the 2000 S/S to 2022 F/W seasons and, based on prior research, focused on clothing with the eroticism characteristics of exposure, close contact, see-through, and the conversion of underclothes into outer garments. The results of the study are as follows. First, eroticism, which can be linked to Lacan's types of 'jouissance' with their multiple meanings as the generating point of eroticism, manifested itself as voyeuristic, fatale, masochistic, surplus, and sacred eroticism. Second, as the eyes of unconscious desire, the visual expression characteristics of the 'gaze' appeared as anamorphosis, trompe l'oeil, and dépaysement. The identity of eroticism derived from Lacan's jouissance and the perspective of the desiring gaze was divided into the voyeuristic, fatal, masochistic, surplus, and sacred desires to gaze. The results of this study are expected to expand the theoretical horizons of eroticism fashion with a new interpretation that combines Lacan's notion of desire, as a repressed and alienated subject within the human unconscious, with the art that expands it.

Designing Real-time Observation System to Evaluate Driving Pattern through Eye Tracker

  • Oberlin, Kwekam Tchomdji Luther.;Jung, Euitay
    • Journal of Korea Multimedia Society / v.25 no.2 / pp.421-431 / 2022
  • The purpose of this research is to determine the driver's point of fixation during driving. Based on the results of this research, a driving instructor can judge what a trainee stares at the most. Traffic accidents have become a serious concern in modern society, and accidents among unskilled and elderly drivers are a particular issue. A driver must pay attention to the surrounding vehicles, traffic signs, passersby, passengers, the road situation, and the dashboard. An eye-tracking-based application was developed to analyze the driver's gaze behavior: a prototype for real-time eye tracking that monitors drivers' points of interest during driving practice. In this study, the driver's attention was measured by capturing eye movements in real road-driving conditions using these tools. As a result, dwell duration, entry time, and the average fixation of the eye gaze were found to be the leading parameters supporting the idea of this study.
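
Metrics such as entry time and dwell duration for an area of interest (AOI) can be computed from timestamped gaze samples as in the sketch below. This is illustrative only and not the study's software.

```python
def aoi_metrics(samples, aoi):
    """samples: iterable of (t_seconds, x, y); aoi: (x0, y0, x1, y1).
    Returns (entry_time, total_dwell_seconds)."""
    x0, y0, x1, y1 = aoi
    entry_time, dwell, prev_t, prev_inside = None, 0.0, None, False
    for t, x, y in samples:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if inside and entry_time is None:
            entry_time = t              # first moment the gaze enters the AOI
        if prev_t is not None and inside and prev_inside:
            dwell += t - prev_t         # accumulate time spent inside the AOI
        prev_t, prev_inside = t, inside
    return entry_time, dwell

# Usage with hypothetical samples (seconds, pixels):
gaze = [(0.0, 50, 50), (0.1, 300, 240), (0.2, 310, 250), (0.3, 320, 255), (0.4, 700, 90)]
print(aoi_metrics(gaze, aoi=(280, 220, 360, 280)))   # approximately (0.1, 0.2)
```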

Glint Reconstruction Algorithm Using Homography in Gaze Tracking System (시선 추적 시스템에서의 호모그래피를 이용한 글린트 복원 알고리즘)

  • Ko, Eun-Ji;Kim, Myoung-Jun
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.10 / pp.2417-2426 / 2014
  • A remote gaze tracking system calculates the gaze from captured images in which infrared LEDs are reflected on the cornea; a glint is the point at which an infrared LED is reflected on the cornea. Recent remote gaze tracking systems use a number of IR LEDs to make the system less sensitive to head movement and to eliminate the calibration procedure. In some cases, however, some of the glints cannot be spotted, and the gaze then cannot be calculated. This study examines the patterns of glints that are difficult to detect in a remote gaze tracking system and proposes an algorithm that reconstructs the positions of the missing glints from the other detected glints. With this algorithm, the number of valid image frames in gaze tracking experiments increased, and errors in the gaze tracking results were reduced by correcting glint distortion in the reconstruction phase.
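
One assumed way to realize homography-based glint reconstruction is to estimate the mapping from the known IR-LED layout to the detected glints and then project the layout position of the undetected LED through it; with fewer than four detected glints, an affine estimate can serve as a fallback. The sketch below is illustrative, not the paper's algorithm.

```python
import cv2
import numpy as np

# Hypothetical planar LED layout (e.g. around the monitor frame, arbitrary units).
led_layout = np.float32([[0, 0], [1, 0], [1, 1], [0, 1]])

# Glints detected in the eye image; suppose LED index 2 was not detected.
detected = {0: (410.0, 302.0), 1: (438.0, 300.0), 3: (412.0, 325.0)}
missing_idx = 2

src = np.float32([led_layout[i] for i in detected]).reshape(-1, 1, 2)
dst = np.float32([detected[i] for i in detected]).reshape(-1, 1, 2)

if len(detected) >= 4:
    H, _ = cv2.findHomography(src, dst)                 # full homography
else:
    affine = cv2.getAffineTransform(src[:3].reshape(3, 2), dst[:3].reshape(3, 2))
    H = np.vstack([affine, [0.0, 0.0, 1.0]])            # lift 2x3 affine to 3x3

pt = led_layout[missing_idx].reshape(1, 1, 2).astype(np.float64)
print(cv2.perspectiveTransform(pt, H).ravel())          # predicted position of the missing glint
```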

Gaze Direction Estimation Method Using Support Vector Machines (SVMs) (Support Vector Machines을 이용한 시선 방향 추정방법)

  • Liu, Jing;Woo, Kyung-Haeng;Choi, Won-Ho
    • Journal of Institute of Control, Robotics and Systems / v.15 no.4 / pp.379-384 / 2009
  • A method for detecting and tracking human gaze is strongly required for HMI (human-machine interface), for example in a human-serving robot. This paper proposes a novel three-dimensional (3D) human gaze estimation method based on face recognition, orientation estimation, and SVMs (Support Vector Machines). 2,400 images covering a pan range of -90° to 90° and a tilt range of -40° to 70° at 10° intervals were used. A stereo camera was used to obtain the global coordinate of the center point between the eyes, and Gabor filter banks of horizontal and vertical orientation at four scales were used to extract the facial features. The experimental results show that the error rate of the proposed method is much lower than that of Liddell's method.
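
A pipeline in this spirit, Gabor filter-bank features fed to an SVM classifier over coarse orientation bins, might be sketched as follows with scikit-image and scikit-learn. The random stand-in data and the binning are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC

def gabor_features(face_gray, frequencies=(0.1, 0.2, 0.3, 0.4)):
    """Mean/variance of Gabor responses at horizontal and vertical orientations, four scales."""
    feats = []
    for f in frequencies:
        for theta in (0.0, np.pi / 2):            # horizontal and vertical orientations
            real, _ = gabor(face_gray, frequency=f, theta=theta)
            feats += [real.mean(), real.var()]
    return np.array(feats)

# Hypothetical training set: random stand-ins for face crops labeled with coarse pan-angle bins.
rng = np.random.default_rng(0)
faces = rng.random((40, 32, 32))
labels = rng.integers(0, 4, size=40)              # e.g. 4 pan-orientation bins

X = np.stack([gabor_features(face) for face in faces])
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, labels)
print(clf.predict(X[:1]))                         # predicted orientation bin
```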