• Title/Summary/Keyword: eye gaze

Search Results: 244

The Effect of the Indication of Lengths and Angles on Classifying Triangles: Centering on Correct Answer Rate and Eye Movements (분류하기에서 길이와 직각 표기의 효과: 정답률과 안구운동 분석을 중심으로)

  • Yun, Ju Mi;Lee, Kwang-ho;Lee, Jae-Hak
    • Education of Primary School Mathematics
    • /
    • v.20 no.2
    • /
    • pp.163-175
    • /
    • 2017
  • The purpose of this study is to identify the effect of length and right-angle indications on understanding of figure concepts when presenting plane-figure classification tasks. We recorded thirty-three 4th-grade students' performance with eye-tracking technology and analyzed the correct answer rate and gaze duration. The findings were as follows. First, the correct answer rate increased and gaze duration decreased when lengths were marked on isosceles and equilateral triangles. Second, the correct answer rate increased and gaze duration decreased when right angles were marked on acute and obtuse triangles. Based on these results, when presenting plane-figure classification tasks, lengths and angles should be indicated so that the tasks measure students' understanding of figure concepts rather than their measuring ability.

3D View Controlling by Using Eye Gaze Tracking in First Person Shooting Game (1 인칭 슈팅 게임에서 눈동자 시선 추적에 의한 3차원 화면 조정)

  • Lee, Eui-Chul;Cho, Yong-Joo;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society
    • /
    • v.8 no.10
    • /
    • pp.1293-1305
    • /
    • 2005
  • In this paper, we propose a method of manipulating the gaze direction of a 3D FPS game character by detecting eye gaze from successive images captured by a USB camera attached beneath an HMD. The proposed method is composed of three parts. In the first part, we detect the user's pupil center from the successive input images with a real-time image-processing algorithm. In the second part, calibration, the geometric relationship is determined between positions on the monitor and the detected eye positions gazing at those positions. In the last part, the final gaze position on the HMD monitor is tracked and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with that of the game character.
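The calibration step described in this abstract — relating detected pupil positions to monitor coordinates — is commonly implemented as a low-order polynomial regression fitted from a few calibration points. A minimal sketch of that idea (hypothetical, not the authors' implementation; function names are illustrative):

```python
import numpy as np

def _design(pupil_xy):
    """Second-order polynomial features of pupil coordinates."""
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(px), px, py, px * py, px ** 2, py ** 2])

def fit_gaze_mapping(pupil_xy, screen_xy):
    """Least-squares fit of a pupil-to-screen mapping from calibration pairs."""
    coeffs, *_ = np.linalg.lstsq(_design(pupil_xy), screen_xy, rcond=None)
    return coeffs

def map_gaze(coeffs, pupil_xy):
    """Apply the fitted mapping to new pupil-center measurements."""
    return _design(pupil_xy) @ coeffs
```

In practice the user fixates a small grid of on-screen targets once, the pairs are fitted, and every subsequent pupil measurement is mapped through the model.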


Correcting the gaze depth by using DNN (DNN을 이용한 응시 깊이 보정)

  • Seok-Ho Han;Hoon-Seok Jang
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.16 no.3
    • /
    • pp.123-129
    • /
    • 2023
  • If we know what we are looking at, we can obtain a great deal of information. Owing to advances in eye tracking, information on the gaze point can be obtained through the software provided with various eye-tracking devices. However, it is difficult to estimate accurate information such as the actual gaze depth. If the eye tracker can be calibrated to the actual gaze depth, realistic and accurate results with reliable validity can be derived in fields such as simulation, digital twins, and VR. Therefore, in this paper, we experiment with acquiring and calibrating raw gaze depth using an eye tracker and its software. The experiment involves designing a deep neural network (DNN) model and then acquiring the gaze depth values provided by the software for specified distances from 300 mm to 10,000 mm. The acquired data are trained with the designed DNN model and calibrated to correspond to the actual gaze depth. With the calibrated model, we obtained gaze depth estimates of 297 mm, 904 mm, 1,485 mm, 2,005 mm, 3,011 mm, 4,021 mm, 4,972 mm, 6,027 mm, 7,026 mm, 8,043 mm, 9,021 mm, and 10,076 mm for the specified distances from 300 mm to 10,000 mm.
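The calibration idea here — learning a correction from the tracker's raw depth to true depth — can be sketched with a tiny fully connected network in plain NumPy. This is a hypothetical stand-in for the paper's DNN; the architecture, input scaling, and training settings are all illustrative:

```python
import numpy as np

def train_depth_calibrator(raw_mm, true_mm, hidden=16, epochs=4000, lr=0.1):
    """Train a one-hidden-layer tanh network mapping raw eye-tracker depth (mm)
    to calibrated depth (mm). Depths are scaled by 1/10000 for stable training."""
    X = np.asarray(raw_mm, float).reshape(-1, 1) / 10000.0
    Y = np.asarray(true_mm, float).reshape(-1, 1) / 10000.0
    rng = np.random.default_rng(0)
    W1 = rng.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)           # forward pass
        P = H @ W2 + b2
        G = 2.0 * (P - Y) / len(X)         # dMSE/dP
        gW2 = H.T @ G; gb2 = G.sum(0)
        GH = (G @ W2.T) * (1.0 - H ** 2)   # back-prop through tanh
        gW1 = X.T @ GH; gb1 = GH.sum(0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    def predict(raw):
        H = np.tanh(np.asarray(raw, float).reshape(-1, 1) / 10000.0 @ W1 + b1)
        return ((H @ W2 + b2) * 10000.0).ravel()
    return predict
```

Raw depths at known target distances play the role of training data; once trained, the network corrects the tracker's systematic depth bias.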

Reliability Measurement Technique of The Eye Tracking System Using Gaze Point Information (사용자 응시지점 정보기반 시선 추적 시스템 신뢰도 측정 기법)

  • Kim, Byoung-jin;Kang, Suk-ju
    • Journal of Digital Contents Society
    • /
    • v.17 no.5
    • /
    • pp.367-373
    • /
    • 2016
  • In this paper, we propose a novel method to improve the accuracy of eye trackers and a way to analyze it. The proposed method builds user profile information by extracting gaze coordinates and color information based on exact pupil information, and then maintains high accuracy on the display. While extracting the user profile information, the change in accuracy over gaze time is also estimated and the optimal parameter value is extracted. In the experimental results on gaze-detection accuracy, accuracy was low when the user dwelt on a specific point only briefly; when the dwell time exceeded two seconds, the measured accuracy was above 80%.

Eye Gaze Tracking System Under Natural Head Movements (머리 움직임이 자유로운 안구 응시 추정 시스템)

  • ;Matthew, Sked;Qiang, Ji
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.41 no.5
    • /
    • pp.57-64
    • /
    • 2004
  • We propose an eye gaze tracking system that works under natural head movements, consisting of one narrow-view-field CCD camera, two mirrors whose reflection angles are controlled, and active infrared illumination. The mirror angles were computed by geometric and linear-algebra calculations to keep the pupil images on the optical axis of the camera. Our system allowed the subject's head to move 90 cm horizontally and 60 cm vertically, and the spatial resolutions were about 6° and 7°, respectively. The frame rate for estimating gaze points was 10~15 frames/sec. As the gaze mapping function, we used hierarchical generalized regression neural networks (H-GRNN) based on a two-pass GRNN. Gaze accuracy with H-GRNN was 94%, an improvement of 9 percentage points over the 85% of GRNN, even when the head or face was slightly rotated. Our system does not have high spatial gaze resolution, but it allows natural head movements with robust and accurate gaze tracking. In addition, there is no need to re-calibrate the system when subjects change.

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions:PartB
    • /
    • v.16B no.3
    • /
    • pp.203-214
    • /
    • 2009
  • In this paper, we compute the user's gaze position simply from the 2D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without considering the complex 3D relations among the camera, monitor, and pupil coordinates. The objectives of this paper are therefore to detect the pupil center and the four corneal specular reflections exactly and to compensate for the error factors that affect gaze accuracy. In our method, we compensated for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector, performing a one-time user calibration at system startup. We also robustly detected the four corneal specular reflections, which are essential for calculating the gaze position, using a Kalman filter regardless of abrupt eye movements. Experimental results showed a gaze detection error of about 1.0 degree even under abrupt eye movements.
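The 2D relation this abstract describes — locating the pupil center relative to the quadrilateral formed by the four glints — can be approximated with a homography that sends the four glint positions to the four monitor corners. A simplified sketch (it omits the kappa compensation and Kalman tracking from the paper; names are illustrative):

```python
import numpy as np

def homography(src, dst):
    """Direct linear transform: homography mapping 4 src points to 4 dst points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 3)   # null-space vector = homography entries

def gaze_on_screen(glints, pupil, screen_w, screen_h):
    """Map the pupil center through the homography that sends the four
    corneal glints to the four monitor corners (clockwise from top-left)."""
    corners = [(0.0, 0.0), (screen_w, 0.0), (screen_w, screen_h), (0.0, screen_h)]
    H = homography(glints, corners)
    p = H @ np.array([pupil[0], pupil[1], 1.0])
    return p[0] / p[2], p[1] / p[2]
```

Because the glints move with the cornea, this mapping stays valid under small head movements without modeling the 3D camera-monitor geometry, which is the appeal of the 2D approach.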

Method for Automatic Switching Screen of OST-HMD using Gaze Depth Estimation (시선 깊이 추정 기법을 이용한 OST-HMD 자동 스위칭 방법)

  • Lee, Youngho;Shin, Choonsung
    • Smart Media Journal
    • /
    • v.7 no.1
    • /
    • pp.31-36
    • /
    • 2018
  • In this paper, we propose a method for automatically switching an OST-HMD screen on and off using gaze depth estimation. The proposed method uses a multi-layer perceptron (MLP) to learn the user's gaze information and the corresponding object distance, then estimates distance from the gaze information. In the learning phase, eye-related features are obtained using a wearable eye tracker; these features are then fed into the MLP for training and model generation. In the inference step, eye-related features obtained from the eye tracker in real time are input to the MLP to obtain the estimated depth value. Finally, we use this result to decide whether to turn the HMD display on or off. A prototype was implemented and experiments were conducted to evaluate the feasibility of the proposed method.
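The final on/off decision in a pipeline like this is typically kept stable against noisy depth estimates with a hysteresis band rather than a single threshold. A minimal sketch (thresholds and class name are illustrative, not from the paper):

```python
class ScreenSwitcher:
    """Hysteresis switch: turn the OST-HMD display on for near gaze and off
    for far gaze. Two thresholds prevent flicker from depth-estimate noise."""

    def __init__(self, on_below_mm=800.0, off_above_mm=1200.0):
        self.on_below = on_below_mm
        self.off_above = off_above_mm
        self.display_on = False

    def update(self, estimated_depth_mm):
        """Feed one depth estimate; return the resulting display state."""
        if estimated_depth_mm < self.on_below:
            self.display_on = True
        elif estimated_depth_mm > self.off_above:
            self.display_on = False
        # Estimates inside the band keep the previous state.
        return self.display_on
```

Each MLP depth estimate is fed to `update`; the gap between the two thresholds absorbs estimation jitter so the screen does not toggle rapidly near a single cutoff.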

A Proposal of Eye-Voice Method based on the Comparative Analysis of Malfunctions on Pointer Click in Gaze Interface for the Upper Limb Disabled (상지장애인을 위한 시선 인터페이스에서 포인터 실행 방법의 오작동 비교 분석을 통한 Eye-Voice 방식의 제안)

  • Park, Joo Hyun;Park, Mi Hyun;Lim, Soon-Bum
    • Journal of Korea Multimedia Society
    • /
    • v.23 no.4
    • /
    • pp.566-573
    • /
    • 2020
  • The computer is the most common tool for using the Internet, with a mouse used to select and execute objects. Eye-tracking technology is welcomed as an alternative that helps users who cannot use their hands due to disability control computers. However, the pointer-execution methods of existing eye-tracking techniques cause many malfunctions. Therefore, in this paper, we developed a gaze-tracking interface that combines voice commands to solve the malfunction problem that arises when people with upper-limb disabilities use existing gaze tracking to execute computer menus and objects. Usability was verified through comparative experiments on the reduction of malfunctions. Users with upper-limb (hand) impairments move the pointer with eye tracking and issue voice commands, such as "okay," while browsing the computer screen for instant clicks. In comparative experiments on pointer-execution malfunctions against existing gaze interfaces, we verified that our system, Eye-Voice, reduces the malfunction rate of pointer execution and is effective for users with upper-limb disabilities.

EOG-based User-independent Gaze Recognition using Wavelet Coefficients and Dynamic Positional Warping (웨이블릿 계수와 Dynamic Positional Warping을 통한 EOG기반의 사용자 독립적 시선인식)

  • Chang, Won-Du;Im, Chang-Hwan
    • Journal of Korea Multimedia Society
    • /
    • v.21 no.9
    • /
    • pp.1119-1130
    • /
    • 2018
  • Writing letters or patterns in a virtual space by moving one's gaze is called "eye writing," a promising tool for various human-computer interface applications. This paper investigates conventional eye-writing recognition algorithms for user-independent recognition of eye-written characters. Two algorithms are presented to build the user-independent system: eye-written region extraction using wavelet coefficients, and template generation. Experimental results demonstrated that with dynamic positional warping, an F1 score of 79.61% was achieved for 12 eye-written patterns, indicating the feasibility of user-independent eye writing.
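Dynamic positional warping builds on classic dynamic time warping (DTW) over 2-D gaze traces; the underlying alignment recurrence can be sketched as follows (standard DTW shown, not the authors' positional variant):

```python
import math

def dtw_distance(trace_a, trace_b):
    """Dynamic time warping distance between two 2-D point sequences.
    D[i][j] holds the minimal accumulated Euclidean cost of aligning the
    first i points of trace_a with the first j points of trace_b."""
    n, m = len(trace_a), len(trace_b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(trace_a[i - 1], trace_b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # skip a point in trace_a
                                 D[i][j - 1],      # skip a point in trace_b
                                 D[i - 1][j - 1])  # match the two points
    return D[n][m]
```

Recognition then amounts to comparing an eye-written trace against per-character templates and picking the template with the smallest warped distance, which is why alignment quality drives the F1 score.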

The Effect of Gaze Fixation Induction Method on Visual Field Testing (시선 고정 유도방법이 시야 검사에 미치는 영향)

  • Lee, Jihyung;Choi, Younggeun;Yang, Xiaopeng;Lee, Nahyun;Oh, Gunhee;Kim, Young Gyun;Kang, Jaheon;You, Heecheon
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.42 no.6
    • /
    • pp.412-420
    • /
    • 2016
  • A visual field tester that uses a fixation target lacking distinctiveness decreases accuracy and usability in visual field testing. The present study develops induction methods of gaze fixation for effective visual field testing. Four new gaze-fixation induction methods were proposed (color-changing dot; alphanumeric characters; flashing black dot; and bull's eye and cross hair, BECH) by considering visual-attention factors such as color, meaning, flashing, and shape, and the proposed methods were compared with the existing black dot (BD) method in terms of gaze-fixation performance and subjective satisfaction by 32 participants in their 20s and 30s. BECH was the most preferred, increasing gaze-fixation performance by 4.8% and subjective satisfaction by 0.4 to 2.0 on a 7-point scale compared with BD. BECH can be applied to tests such as visual field testing and macular pigment optical density testing, in which gaze fixation is crucial for accuracy and usability.