• Title/Summary/Keyword: eye gaze tracking


Resolution Estimation Technique in Gaze Tracking System for HCI (HCI를 위한 시선추적 시스템에서 분해능의 추정기법)

  • Kim, Ki-Bong;Choi, Hyun-Ho
    • Journal of Convergence for Information Technology / v.11 no.1 / pp.20-27 / 2021
  • Eye tracking is a natural user interface (NUI) technology that determines where the user is gazing. It allows users to input text or control a GUI, and gaze analysis can further be applied to commercial advertising. In an eye tracking system, the allowable range varies depending on image quality and the user's freedom of movement, so a method is needed to estimate the accuracy of eye tracking in advance. That accuracy is strongly affected not only by hardware variables but also by how the eye tracking algorithm is implemented. Accordingly, in this paper, we propose a method to estimate how many degrees the gaze changes when the pupil center moves by one pixel, by estimating the maximum possible movement distance of the pupil center in the image.
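  • Sketch: the degrees-per-pixel idea reduces to dividing the eye's angular range by the pupil center's maximum travel in the image. A minimal Python sketch with hypothetical numbers (the paper's actual model and values are not reproduced here):

        def gaze_resolution_deg_per_pixel(gaze_range_deg, max_pupil_travel_px):
            # Degrees of gaze change per one-pixel shift of the pupil center,
            # assuming an approximately linear pupil-to-angle relationship.
            return gaze_range_deg / max_pupil_travel_px

        # Hypothetical values: the eye sweeps ~40 degrees horizontally while
        # the pupil center travels ~50 pixels in the captured eye image.
        print(gaze_resolution_deg_per_pixel(40.0, 50.0))  # -> 0.8 deg per pixel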

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • 이영식;배철수
    • Journal of the Korea Institute of Information and Communication Engineering / v.8 no.2 / pp.477-483 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real-time gaze tracking for interactive graphic display. Unlike most existing gaze tracking techniques, which often assume a static head and require a cumbersome per-person calibration process, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments involving gaze-contingent interactive graphic display.
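  • Sketch: a GRNN is essentially Nadaraya-Watson kernel regression, so the pupil-to-screen mapping can be written in a few lines. A minimal Python sketch with hypothetical features and training data (the paper's feature vector and smoothing parameter are not reproduced):

        import numpy as np

        def grnn_predict(X_train, Y_train, x, sigma=0.5):
            # Nadaraya-Watson form of a GRNN: the prediction is a Gaussian-
            # kernel weighted average of training targets, so no analytical
            # mapping function has to be assumed.
            d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distances to x
            w = np.exp(-d2 / (2.0 * sigma ** 2))      # pattern-layer activations
            return (w @ Y_train) / np.sum(w)          # weighted mean of targets

        # Hypothetical data: 6 pupil/head features -> 2D screen coordinates.
        rng = np.random.default_rng(0)
        X = rng.random((200, 6))
        Y = rng.random((200, 2)) * [1920, 1080]
        print(grnn_predict(X, Y, rng.random(6)))      # predicted (x, y) on screen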

A Study on Fashion Design Cognition Using Eye Tracking (시선 추적을 활용한 패션 디자인 인지에 관한 연구)

  • Lee, Shin-Young
    • Fashion & Textile Research Journal / v.23 no.3 / pp.323-336 / 2021
  • This study investigated the cognitive process of fashion design images through eye-activity tracking, confirming differences in the cognitive process and gaze activity according to image elements. The results are as follows. First, a difference was found between groups in the gaze time for each section according to the model and design. Although model diversity is an important factor in attracting observers' interest, the simplicity of the model was deemed more effective for observing the design. Second, examining the differences by segment in the gaze weight of each image area showed differences for each group. When a similar type of model is repeated, the proportion of face recognition decreases and the proportion of design-recognition time increases. Conversely, when model diversity is high, the same amount of time is devoted to recognizing the model's face throughout. Additionally, there was a difference in gaze activity when recognizing the same design according to the type of model. These results confirm the importance of the model as an image-recognition factor in fashion design. In the fashion industry, it is important to find cognitive factors that attract and retain consumers' attention. If the design-recognition effect is further maximized by identifying serviceable points of application, the brand's sustainability can be enhanced even in the rapidly changing fashion industry.
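  • Sketch: the gaze-weight comparison in such studies boils down to summing fixation durations per area of interest (AOI) and normalizing. A minimal Python sketch with a hypothetical fixation log (the AOI names stand in for the study's image areas):

        from collections import defaultdict

        # Hypothetical fixation log: (area_of_interest, duration_ms) pairs.
        fixations = [("face", 320), ("design", 540), ("face", 210),
                     ("design", 780), ("background", 150), ("design", 400)]

        totals = defaultdict(int)
        for aoi, dur in fixations:
            totals[aoi] += dur            # accumulate dwell time per AOI

        grand_total = sum(totals.values())
        for aoi, dur in sorted(totals.items(), key=lambda kv: -kv[1]):
            print(f"{aoi:<10} {dur:5d} ms  ({100 * dur / grand_total:.1f} % of gaze time)")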

Analysis of User's Eye Gaze Distribution while Interacting with a Robotic Character (로봇 캐릭터와의 상호작용에서 사용자의 시선 배분 분석)

  • Jang, Seyun;Cho, Hye-Kyung
    • The Journal of Korea Robotics Society / v.14 no.1 / pp.74-79 / 2019
  • In this paper, we develop a virtual experimental environment to investigate users' eye gaze in human-robot social interaction, and verify its potential for further studies. The system consists of a 3D robot character capable of hosting simple interactions with a user, and a gaze processing module that records which body part of the robot character, such as the eyes, mouth, or arms, the user is looking at, regardless of whether the robot is stationary or moving. To verify that the results acquired in this virtual environment align with those of physically existing robots, we performed robot-guided quiz sessions with 120 participants and compared the participants' gaze patterns with those in previous works. The results were as follows. First, when interacting with the robot character, the user's gaze pattern showed statistics similar to those of conversations between humans. Second, an animated mouth on the robot character received longer attention than a stationary one. Third, nonverbal interactions such as leakage cues were also effective in the interaction with the robot character, and the correct-answer ratios of the cued groups were higher. Finally, gender differences in the users' gaze were observed, especially in the frequency of mutual gaze.
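  • Sketch: recording which body part is looked at reduces to hit-testing the gaze point against per-frame screen regions of the character's parts. A minimal Python sketch assuming axis-aligned rectangles updated every frame (the paper's actual region representation is not specified here):

        from typing import Dict, Optional, Tuple

        Rect = Tuple[int, int, int, int]  # (left, top, right, bottom) in pixels

        def gazed_body_part(gaze: Tuple[int, int],
                            parts: Dict[str, Rect]) -> Optional[str]:
            # Return the body part whose rectangle contains the gaze point.
            x, y = gaze
            for name, (l, t, r, b) in parts.items():
                if l <= x <= r and t <= y <= b:
                    return name
            return None

        # Hypothetical per-frame layout of the character's body parts.
        frame_parts = {"eyes": (420, 180, 520, 220), "mouth": (440, 240, 500, 270),
                       "arms": (300, 320, 640, 480)}
        print(gazed_body_part((455, 250), frame_parts))  # -> "mouth"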

Reliability Measurement Technique of The Eye Tracking System Using Gaze Point Information (사용자 응시지점 정보기반 시선 추적 시스템 신뢰도 측정 기법)

  • Kim, Byoung-jin;Kang, Suk-ju
    • Journal of Digital Contents Society / v.17 no.5 / pp.367-373 / 2016
  • In this paper, we propose a novel method to measure and improve the accuracy of eye trackers. The proposed method builds a user profile by extracting gaze coordinates and color information based on precise pupil information, and then maintains high accuracy on the display. While extracting the user profile, the change in accuracy over gaze time is also estimated and the optimal parameter value is derived. In the experiments on gaze-detection accuracy, accuracy was low when a user gazed at a specific point only briefly; when the gaze lasted more than two seconds, the measured accuracy exceeded 80%.
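  • Sketch: the reported dwell-time effect can be quantified by marking a sample accurate when the estimated gaze point falls within a pixel tolerance of the true target, then splitting samples at two seconds. A minimal Python sketch with hypothetical samples and tolerance:

        import math

        samples = [  # (dwell_seconds, estimated_xy, true_xy), all hypothetical
            (0.5, (410, 300), (480, 330)),
            (0.8, (200, 150), (250, 180)),
            (2.4, (612, 410), (600, 400)),
            (3.1, (105, 505), (100, 500)),
        ]

        def accuracy(rows, tol_px=30):
            # Fraction of samples whose gaze estimate lands within tol_px.
            hits = sum(math.dist(est, true) <= tol_px for _, est, true in rows)
            return 100 * hits / len(rows) if rows else float("nan")

        short = [r for r in samples if r[0] < 2.0]
        long_ = [r for r in samples if r[0] >= 2.0]
        print(f"dwell < 2 s : {accuracy(short):.0f} % accurate")
        print(f"dwell >= 2 s: {accuracy(long_):.0f} % accurate")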

Gaze Detection by Wearable Eye-Tracking and NIR LED-Based Head-Tracking Device Based on SVR

  • Cho, Chul Woo;Lee, Ji Woo;Shin, Kwang Yong;Lee, Eui Chul;Park, Kang Ryoung;Lee, Heekyung;Cha, Jihun
    • ETRI Journal / v.34 no.4 / pp.542-552 / 2012
  • In this paper, a gaze estimation method is proposed for use with a large-sized display at a distance. Our research has the following four novelties: this is the first study on gaze tracking for large-sized displays and large Z (viewing) distances; our gaze-tracking accuracy is not affected by head movements, since the proposed method tracks the head by using a near-infrared camera and an infrared light-emitting diode; the threshold for local binarization of the pupil area is adaptively determined by using a p-tile method based on circular edge detection, irrespective of eyelid or eyelash shadows; and an accurate gaze position is calculated by using two support vector regressions without complicated calibrations for the camera, display, and user's eyes, in which the gaze positions and head movements are used as feature values. The root mean square error of gaze detection is calculated as 0.79° for a 30-inch screen.
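  • Sketch: the two-SVR idea maps a feature vector of pupil and head-movement values to screen coordinates with one regressor per axis. A minimal scikit-learn sketch with synthetic data (the features, kernel, and hyperparameters are assumptions, not the paper's settings):

        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        X = rng.random((300, 8))        # pupil center + NIR-LED head features
        gaze_x = 1000 * X[:, 0] + 50 * rng.standard_normal(300)
        gaze_y = 600 * X[:, 1] + 30 * rng.standard_normal(300)

        svr_x = SVR(kernel="rbf", C=100.0).fit(X, gaze_x)  # horizontal axis
        svr_y = SVR(kernel="rbf", C=100.0).fit(X, gaze_y)  # vertical axis

        x_new = rng.random((1, 8))
        print(svr_x.predict(x_new), svr_y.predict(x_new))  # estimated screen point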

Improving Eye-gaze Mouse System Using Mouth Open Detection and Pop Up Menu (입 벌림 인식과 팝업 메뉴를 이용한 시선추적 마우스 시스템 성능 개선)

  • Byeon, Ju Yeong;Jung, Keechul
    • Journal of Korea Multimedia Society / v.23 no.12 / pp.1454-1463 / 2020
  • An important factor in an eye-tracking PC interface for paralyzed patients is the implementation of a mouse interface for manipulating the GUI. With a successfully implemented mouse interface, users can generate mouse events exactly at the point of their choosing. However, it is difficult to define this interaction in an eye-tracking interface. This problem is known as the Midas touch problem and has been a major focus of eye-tracking research. There have been many attempts to solve it using blinks, voice input, and similar signals, but these are not suitable for all paralyzed patients, because some of them cannot wink or speak. In this paper, we propose a mouth-pop-up, eye-tracking mouse interface that solves the Midas touch problem and suits such patients, using a common RGB camera. The interface detects the opening and closing of the mouth to activate a pop-up menu from which the user selects a mouse event. After implementation, a performance experiment was conducted; the number of malfunctions and the time to perform tasks were reduced compared to the existing method.
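  • Sketch: with a plain RGB camera, mouth opening is commonly detected via a mouth aspect ratio (MAR) computed from facial landmarks; crossing a threshold triggers the pop-up menu. A minimal Python sketch with hypothetical landmarks and threshold (the paper's detector is not specified):

        import math

        def mouth_aspect_ratio(landmarks):
            # landmarks: dict of 2D points for mouth top, bottom, left, right.
            vertical = math.dist(landmarks["top"], landmarks["bottom"])
            horizontal = math.dist(landmarks["left"], landmarks["right"])
            return vertical / horizontal

        MAR_OPEN_THRESHOLD = 0.6  # assumed: above this, treat mouth as open

        def should_show_popup(landmarks) -> bool:
            return mouth_aspect_ratio(landmarks) > MAR_OPEN_THRESHOLD

        # Hypothetical landmark positions for an open mouth.
        lm = {"top": (320, 400), "bottom": (320, 452),
              "left": (290, 426), "right": (350, 426)}
        print(should_show_popup(lm))  # True -> show the mouse-event pop-up menu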

Correcting the gaze depth by using DNN (DNN을 이용한 응시 깊이 보정)

  • Seok-Ho Han;Hoon-Seok Jang
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.16 no.3 / pp.123-129 / 2023
  • If we know what a person is looking at, we can obtain a lot of information. With the development of eye tracking, information on the gaze point can be obtained through software provided by various eye-tracking devices. However, it is difficult to estimate accurate information such as the actual gaze depth. If the eye tracker can be calibrated to the actual gaze depth, realistic and accurate results with reliable validity can be derived in fields such as simulation, digital twins, and VR. Therefore, in this paper, we experiment with acquiring and calibrating raw gaze depth using an eye tracker and its software. The experiment involves designing a deep neural network (DNN) model and acquiring the gaze depth values provided by the software for specified distances from 300 mm to 10,000 mm. The acquired data are used to train the DNN model, calibrating the raw values to the actual gaze depth. With the calibrated model, we obtained gaze depth estimates of 297 mm, 904 mm, 1,485 mm, 2,005 mm, 3,011 mm, 4,021 mm, 4,972 mm, 6,027 mm, 7,026 mm, 8,043 mm, 9,021 mm, and 10,076 mm for the specified distances from 300 mm to 10,000 mm.
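  • Sketch: the calibration step amounts to regressing true depth from the tracker's raw depth with a small fully connected network. A minimal PyTorch sketch with hypothetical training pairs in metres (the paper's architecture and training setup are not reproduced):

        import torch
        import torch.nn as nn

        # Small MLP: raw gaze depth (1 value) -> calibrated depth (1 value).
        model = nn.Sequential(nn.Linear(1, 32), nn.ReLU(),
                              nn.Linear(32, 32), nn.ReLU(),
                              nn.Linear(32, 1))

        # Hypothetical (raw, true) depth pairs in metres.
        raw = torch.tensor([[0.35], [0.95], [1.60], [2.30], [3.40], [5.10], [7.90], [10.9]])
        true = torch.tensor([[0.30], [0.90], [1.50], [2.00], [3.00], [5.00], [8.00], [10.0]])

        opt = torch.optim.Adam(model.parameters(), lr=1e-2)
        loss_fn = nn.MSELoss()
        for _ in range(500):                 # fit the calibration mapping
            opt.zero_grad()
            loss = loss_fn(model(raw), true)
            loss.backward()
            opt.step()

        print(model(torch.tensor([[2.30]])).item())  # expected near 2.0 m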

Gaze Tracking with Low-cost EOG Measuring Device (저가형 EOG 계측장치를 이용한 시선추적)

  • Jang, Seung-Tae;Lee, Jung-Hwan;Jang, Jae-Young;Chang, Won-Du
    • Journal of the Korea Convergence Society / v.9 no.11 / pp.53-60 / 2018
  • This paper describes experiments on gaze tracking utilizing a low-cost electrooculogram (EOG) measuring device. The goal of the experiments is to verify whether the low-cost device can be used for a complicated human-computer interaction tool such as eye-writing. Two experiments are conducted for this goal: simple gaze tracking of four directional eye movements, and eye-writing, which is drawing letters or shapes in a virtual space. Eye-written alphabets were obtained with two PSL-iEOGs and an Arduino Uno; they were classified by dynamic positional warping after being preprocessed by a wavelet function. The results show that the expected recognition accuracy of the four-directional recognition is close to 90% when noise is controlled, and a similar median accuracy (90.00%) was achieved for eye-writing when the number of writing patterns is limited to five. In future work, additional algorithms for stabilizing the signal need to be developed.
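  • Sketch: dynamic positional warping is a variant of dynamic time warping (DTW), so a plain-DTW template matcher conveys the classification idea. A minimal Python sketch with toy 1D sequences standing in for real EOG signals (DTW is named here as a stand-in, not the paper's exact algorithm):

        def dtw_distance(a, b):
            # Classic dynamic-programming DTW over two 1D sequences.
            n, m = len(a), len(b)
            INF = float("inf")
            D = [[INF] * (m + 1) for _ in range(n + 1)]
            D[0][0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
            return D[n][m]

        templates = {  # hypothetical per-letter templates (horizontal channel)
            "L": [0, 0, 1, 2, 3, 3, 3],
            "V": [0, 2, 4, 2, 0, 0, 0],
        }
        query = [0, 1, 2, 3, 3, 3, 3]
        # Classify by nearest template under DTW distance.
        print(min(templates, key=lambda k: dtw_distance(query, templates[k])))  # "L"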

3D View Controlling by Using Eye Gaze Tracking in First Person Shooting Game (1 인칭 슈팅 게임에서 눈동자 시선 추적에 의한 3차원 화면 조정)

  • Lee, Eui-Chul;Cho, Yong-Joo;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society / v.8 no.10 / pp.1293-1305 / 2005
  • In this paper, we propose a method of manipulating the gaze direction of a 3D FPS game character by using eye gaze detection from successive images captured by a USB camera attached beneath an HMD. The proposed method is composed of three parts. In the first part, we detect the user's pupil center with a real-time image processing algorithm from the successive input images. In the second part, calibration, the geometric relationship is determined between positions gazed at on the monitor and the corresponding detected eye positions. In the last part, the final gaze position on the HMD monitor is tracked, and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player and that of the game character.
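  • Sketch: the calibration part can be realized as a least-squares fit of a second-order polynomial mapping from pupil-center coordinates to monitor coordinates, a common choice assumed here rather than taken from the paper. A minimal NumPy sketch with hypothetical calibration data:

        import numpy as np

        def poly_features(p):
            # Second-order polynomial terms of a pupil-center point (x, y).
            x, y = p
            return [1, x, y, x * y, x * x, y * y]

        # Hypothetical calibration data: pupil centers recorded while the
        # user gazes at known screen positions (a 3x3 target grid).
        pupil = [(10, 8), (40, 9), (70, 10), (11, 30), (41, 31), (71, 32),
                 (12, 52), (42, 53), (72, 54)]
        screen = [(0, 0), (512, 0), (1024, 0), (0, 384), (512, 384), (1024, 384),
                  (0, 768), (512, 768), (1024, 768)]

        A = np.array([poly_features(p) for p in pupil], dtype=float)
        B = np.array(screen, dtype=float)
        coeff, *_ = np.linalg.lstsq(A, B, rcond=None)     # 6x2 coefficients

        gaze = np.array(poly_features((41, 31))) @ coeff  # map a new pupil center
        print(gaze)                                       # approximately (512, 384)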
