• Title/Summary/Keyword: Gaze-Tracking

Visual Modeling and Content-based Processing for Video Data Storage and Delivery

  • Hwang Jae-Jeong;Cho Sang-Gyu
    • Journal of information and communication convergence engineering / v.3 no.1 / pp.56-61 / 2005
  • In this paper, we present a video rate control scheme for storage and delivery in which time-varying viewing interest is controlled by human gaze. To track the gaze, the pupil's movement is detected in a three-step process: detecting the face region, the eye region, and the pupil point. To control bit rates, the quantization parameter (QP) is adjusted by considering static parameters, the video-object priority derived from pupil tracking, the target PSNR, and the coder's weighted distortion value. As a result, we achieved a human-interfaced visual model and a corresponding region-of-interest rate control system.
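
The gaze-driven region-of-interest rate control described above can be sketched in a few lines. The 0.5 priority weighting and the H.263-style QP range below are illustrative assumptions, not the paper's actual parameters:

```python
def region_qp(base_qp, priority, min_qp=1, max_qp=31):
    """Pick a quantization parameter for a region: a high gaze-derived
    priority (1.0 = gazed region-of-interest) lowers the QP, spending
    more bits there; background (priority 0.0) keeps the base QP.
    The 0.5 weighting is a hypothetical choice, not taken from the paper."""
    qp = round(base_qp * (1.0 - 0.5 * priority))
    return max(min_qp, min(max_qp, qp))
```

A gazed region at `base_qp=20` would then be coded at QP 10, while the background stays at 20.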

Computer Interface Using Head-Gaze Tracking (응시 위치 추적 기술을 이용한 인터페이스 시스템 개발)

  • 이정준;박강령;김재희
    • Proceedings of the IEEK Conference / 1999.06a / pp.516-519 / 1999
  • Gaze detection finds the position on a monitor screen where a user is looking, using image processing and computer vision technology. We developed a computer interface system using this gaze detection technology. The system enables a user to control the computer without using their hands, so it can help the handicapped use a computer and is also useful for people whose hands are occupied with another task, especially in factory work. For practical use, a command signal like mouse clicking is necessary, and we used eye winking to give this command signal to the system.

Analysis of User's Eye Gaze Distribution while Interacting with a Robotic Character (로봇 캐릭터와의 상호작용에서 사용자의 시선 배분 분석)

  • Jang, Seyun;Cho, Hye-Kyung
    • The Journal of Korea Robotics Society / v.14 no.1 / pp.74-79 / 2019
  • In this paper, we develop a virtual experimental environment to investigate users' eye gaze in human-robot social interaction and verify its potential for further studies. The system consists of a 3D robot character capable of hosting simple interactions with a user, and a gaze processing module that records which body part of the robot character (such as the eyes, mouth, or arms) the user is looking at, regardless of whether the robot is stationary or moving. To verify that results acquired in this virtual environment align with those of physically existing robots, we performed robot-guided quiz sessions with 120 participants and compared the participants' gaze patterns with those in previous works. The findings were as follows. First, when interacting with the robot character, the user's gaze pattern showed statistics similar to those of conversations between humans. Second, an animated mouth on the robot character received longer attention than a stationary one. Third, nonverbal interactions such as leakage cues were also effective in interaction with the robot character, and the correct-answer ratios of the cued groups were higher. Finally, gender differences in the users' gaze were observed, especially in the frequency of mutual gaze.
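
The kind of statistic such a gaze processing module yields can be sketched as the share of gaze samples falling on each body part; the function and part names below are placeholders, not the study's analysis code:

```python
from collections import Counter

def gaze_share(samples):
    """Fraction of gaze samples landing on each area of interest,
    e.g. body parts of the robot character."""
    counts = Counter(samples)
    total = len(samples)
    return {part: count / total for part, count in counts.items()}
```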

Real-Time Eye Gaze Tracking Using POSIT (POSIT 정보를 이용한 실시간 눈동자 시선 추적)

  • Kim, Mi-Kyung;Choi, Yeon-Seok;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2012.05a / pp.750-753 / 2012
  • This paper proposes a method for detecting the position of the eyes and tracking the gaze point in real time using POSIT. The algorithm finds a candidate eye area using topological characteristics of the eyes and then decides the eye centers using their physical characteristics. The detected eyes, nose, and mouth are used as input to POSIT. Experimental results show that the proposed method effectively detects eyes in facial images from the FERET database and gives high performance when used for tracking the gaze point.
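
For reference, the classic POSIT iteration (DeMenthon and Davis) that the paper relies on can be sketched as follows. This is a generic implementation of the published algorithm, not the authors' code, and it assumes at least four non-coplanar model points (e.g. eye corners, nose tip, mouth) with image coordinates given relative to the principal point:

```python
import numpy as np

def posit(model_pts, image_pts, focal, iters=20):
    """POSIT: estimate rotation R and translation T of an object from
    >= 4 non-coplanar model points and their perspective projections.
    model_pts[0] is the reference point (taken as the model origin)."""
    M = np.asarray(model_pts, float)          # (N, 3) model points
    p = np.asarray(image_pts, float)          # (N, 2) image points
    A = M[1:] - M[0]                          # model vectors from reference
    B = np.linalg.pinv(A)                     # (3, N-1) pseudo-inverse
    eps = np.zeros(len(M) - 1)                # perspective corrections
    for _ in range(iters):
        xp = p[1:, 0] * (1 + eps) - p[0, 0]   # corrected image vectors
        yp = p[1:, 1] * (1 + eps) - p[0, 1]
        I, J = B @ xp, B @ yp
        s = np.sqrt(np.linalg.norm(I) * np.linalg.norm(J))   # scale f/Z0
        i, j = I / np.linalg.norm(I), J / np.linalg.norm(J)
        k = np.cross(i, j)
        Z0 = focal / s                        # depth of reference point
        eps = A @ k / Z0                      # refine corrections
    R = np.vstack([i, j, k])
    T = np.array([p[0, 0], p[0, 1], focal]) * Z0 / focal
    return R, T
```

The loop alternates between a scaled-orthographic pose estimate and a perspective correction of the image points, and converges quickly when the object's depth extent is small relative to its distance.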

3-Dimensional Calibration and Performance Evaluation Method for Pupil-labs Mobile Pupil Tracking Device (퓨필랩 모바일 동공 추적 장치를 위한 3차원 캘리브레이션 및 성능 평가 방법)

  • Mun, Ji-Hun;Shin, Dong-Won;Ho, Yo-Sung
    • Smart Media Journal / v.7 no.2 / pp.15-22 / 2018
  • Pupil tracking technology can serve as an efficient information provider that offers convenience to the user by connecting with a smart device. In this paper, we measure the distance to the user's gaze point using the pupil tracking device produced by Pupil-labs and present experimental results analyzing its accuracy and precision. The gaze point location tracked by the device is compared with the object target in terms of error. Since the mobile pupil tracking device is also a kind of camera, calibration must be performed before using it. We explain not only the commonly used 2-dimensional calibration but also a 3-dimensional calibration method: to improve on the accuracy of the 2-dimensional result, the 3-dimensional calibration sets an imaginary plane and executes the calibration in various 3-dimensional spaces. We analyze the experimental results to show the efficiency of the 3-dimensional calibration, and also introduce various usage methods and the information that can be obtained through the device.
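
Accuracy and precision in the eye-tracker-evaluation sense can be computed as below. This is a sketch, with gaze samples and the target given as visual angles in degrees, not the paper's actual evaluation code:

```python
import numpy as np

def accuracy_precision(gaze_deg, target_deg):
    """Accuracy: mean angular offset of the gaze samples from the target.
    Precision: RMS deviation of the samples from their own mean, i.e.
    how scattered the estimates are regardless of the target."""
    g = np.asarray(gaze_deg, float)       # (N, 2) horizontal/vertical angles
    t = np.asarray(target_deg, float)     # (2,) target angles
    accuracy = np.linalg.norm(g - t, axis=1).mean()
    precision = np.sqrt(((g - g.mean(axis=0)) ** 2).sum(axis=1).mean())
    return accuracy, precision
```

The distinction matters for calibration work: a biased but stable tracker has poor accuracy yet good precision, and calibration mainly removes the bias.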

Resolution Estimation Technique in Gaze Tracking System for HCI (HCI를 위한 시선추적 시스템에서 분해능의 추정기법)

  • Kim, Ki-Bong;Choi, Hyun-Ho
    • Journal of Convergence for Information Technology / v.11 no.1 / pp.20-27 / 2021
  • Eye tracking is one of the NUI technologies; it finds out where the user is gazing. This technology allows users to input text or control a GUI, and further analysis of the user's gaze can be applied to commercial advertisements. In an eye tracking system, the allowable range varies depending on the quality of the image and the user's freedom of movement, so a method is needed to estimate the accuracy of eye tracking in advance. That accuracy is greatly affected not only by hardware variables but also by how the eye tracking algorithm is implemented. Accordingly, in this paper, we propose a method to estimate how many degrees the gaze changes when the pupil center moves by one pixel, by estimating the maximum possible movement distance of the pupil center in the image.
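
The proposed estimate reduces to simple arithmetic: if the pupil centre can travel at most D pixels across a full gaze range of G degrees, the achievable resolution is G/D degrees per pixel. A sketch with hypothetical numbers:

```python
def gaze_resolution(gaze_range_deg, max_pupil_travel_px):
    """Degrees of gaze change per one-pixel movement of the pupil
    centre, given the maximum pupil-centre travel in the image."""
    return gaze_range_deg / max_pupil_travel_px
```

For example, a 40-degree horizontal gaze range over 80 pixels of pupil travel gives a resolution of 0.5 degrees per pixel.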

Use of gaze entropy to evaluate situation awareness in emergency accident situations of nuclear power plant

  • Lee, Yejin;Jung, Kwang-Tae;Lee, Hyun-Chul
    • Nuclear Engineering and Technology / v.54 no.4 / pp.1261-1270 / 2022
  • This study was conducted to investigate the possibility of using gaze entropy to evaluate an operator's situation awareness in an emergency accident situation at a nuclear power plant. Gaze entropy can be an effective measure for this purpose because it expresses gaze movement as a single comprehensive number. To determine the relationship between situation awareness and gaze entropy, an experiment measured both using simulators created for the emergency accident situations LOCA, SGTR, SLB, and LOV. The task was to judge the nuclear power plant accident situation presented in the simulator. The results showed that situation awareness had a significant negative correlation with Shannon, dwell-time, and Markov entropy, while visual attention entropy (VAE) showed no significant correlation with situation awareness. The results indicate that Shannon entropy, dwell-time entropy, and Markov entropy can be used as measures to evaluate situation awareness.
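
The stationary and transition measures named above can be sketched from a fixation sequence over areas of interest (AOIs). This is a generic implementation of the entropy definitions, not the study's analysis code:

```python
import math
from collections import Counter

def shannon_entropy(fixations):
    """Stationary (Shannon) gaze entropy of the AOI visit distribution."""
    n = len(fixations)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(fixations).values())

def markov_entropy(fixations):
    """Transition (Markov) entropy: expected uncertainty of the next
    AOI given the current one."""
    pairs = list(zip(fixations, fixations[1:]))
    from_counts = Counter(a for a, _ in pairs)
    h = 0.0
    for (a, b), c in Counter(pairs).items():
        h -= (c / len(pairs)) * math.log2(c / from_counts[a])
    return h
```

A strictly alternating scan such as A, B, A, B has maximal Shannon entropy (1 bit) but zero Markov entropy, since the next AOI is fully predictable; this is why the two measures can relate differently to situation awareness.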

Real Time Gaze Discrimination for Human Computer Interaction (휴먼 컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Park Ho sik;Bae Cheol soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.3C / pp.125-132 / 2005
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination. Unlike most existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments involving a gaze-contingent interactive graphic display.
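
At its core a GRNN is a Gaussian-kernel-weighted average of the training targets (the Nadaraya-Watson estimator). A minimal sketch of the pupil-parameter-to-screen mapping follows; the bandwidth and feature layout are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.3):
    """GRNN regression: weight every training target by a Gaussian
    kernel on the distance between its input and the query x.
    X_train: (N, d) pupil-parameter vectors; Y_train: (N, 2) screen
    coordinates; x: (d,) query vector."""
    d2 = ((X_train - x) ** 2).sum(axis=1)        # squared distances
    w = np.exp(-d2 / (2 * sigma ** 2))           # kernel weights
    return (w[:, None] * Y_train).sum(axis=0) / w.sum()
```

Because the estimate is a weighted average of stored examples, no analytical form of the mapping is assumed, and adding head-pose features to `X_train` folds head movement into the same mapping, which is the property the abstract relies on.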

Real Time Gaze Discrimination for Computer Interface (컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Hwang, Suen-Ki;Kim, Moon-Hwan
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.3 no.1 / pp.38-46 / 2010
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.

A Study on Real Time Gaze Discrimination System using GRNN (GRNN을 이용한 실시간 시선 식별 시스템에 관한 연구)

  • Lee Young-Sik;Bae Cheol-Soo
    • Journal of the Korea Institute of Information and Communication Engineering / v.9 no.2 / pp.322-329 / 2005
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.