• Title/Summary/Keyword: Eye gaze information

A Simple Eye Gaze Correction Scheme Using 3D Affine Transformation and Image In-painting Technique

  • Ko, Eunsang;Ho, Yo-Sung
    • Journal of Multimedia Information System
    • /
    • v.5 no.2
    • /
    • pp.83-86
    • /
    • 2018
  • Owing to high-speed internet technologies, video conferencing systems are used at home as well as in workplaces, typically with a laptop or a webcam. Although eye contact is significant in video conferencing, most systems do not support good eye contact because of the improper location of the camera. Several ideas have been proposed to solve the eye contact problem; however, some of them require complicated configurations and expensive customized hardware. In this paper, we propose a simple eye gaze correction method using a three-dimensional (3D) affine transformation. We also apply an image in-painting method to fill the empty holes caused by round-off errors in the coordinate transformation. Experiments show visually improved results.
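The hole-filling step is the easy part to misread, so here is a minimal Python sketch of the pipeline shape: forward-warp each pixel through a 3D affine transform using per-pixel depth, then inpaint the round-off holes. The intrinsics `K`, the `affine_3x4` matrix, and the choice of OpenCV's Telea inpainting are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch only: warp with a 3D affine transform, then inpaint round-off holes.
import numpy as np
import cv2

def warp_and_inpaint(image, depth, affine_3x4, K):
    """Forward-warp `image` using per-pixel depth and a 3x4 affine transform,
    then inpaint the holes left by rounding in the target grid."""
    h, w = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]

    # Back-project every pixel to a 3D point using its depth value.
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    zs = depth.ravel()
    xs = (us.ravel() - cx) * zs / fx
    ys = (vs.ravel() - cy) * zs / fy
    pts = np.stack([xs, ys, zs, np.ones_like(zs)], axis=0)   # 4 x N

    # Apply the 3D affine transform and project back to the image plane.
    moved = affine_3x4 @ pts                                  # 3 x N
    u2 = np.round(moved[0] / moved[2] * fx + cx).astype(int)
    v2 = np.round(moved[1] / moved[2] * fy + cy).astype(int)

    out = np.zeros_like(image)
    hole = np.full((h, w), 255, dtype=np.uint8)   # 255 marks unfilled pixels
    ok = (u2 >= 0) & (u2 < w) & (v2 >= 0) & (v2 < h) & (zs > 0)
    out[v2[ok], u2[ok]] = image[vs.ravel()[ok], us.ravel()[ok]]
    hole[v2[ok], u2[ok]] = 0

    # Fill the remaining round-off gaps with Telea inpainting.
    return cv2.inpaint(out, hole, inpaintRadius=3, flags=cv2.INPAINT_TELEA)
```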

A Study on the Characteristics of Consumer Visual-Perceptional Information Acquisition in Commercial Facilities in Regard to its Construction of Space from Real-Time Eye Gaze Tracking (상업시설 공간구성의 실시간 시선추적에 나타난 소비자 시지각 정보획득 특성 연구)

  • Park, Sunmyung
    • Science of Emotion and Sensibility
    • /
    • v.21 no.2
    • /
    • pp.3-14
    • /
    • 2018
  • To satisfy consumer needs, commercial facilities require a variety of sales-related space expressions and eye-catching product arrangements; space composition can itself be a direct marketing strategy. The human eye is the sensory organ that acquires the largest amount of information, and an analysis of visual information helps in understanding the visual relations between consumers and commercial space. However, existing studies mostly analyze still frames from experimental images, and there is a lack of studies analyzing gaze information from moving images of commercial spaces. Therefore, this study analyzed the emotional responses of space users through their gaze information, using a video of a movement route through a commercial facility. The analysis targeted straight sections of the route; based on the acquired data, sectional characteristics of five gaze-intensity ranges were examined. Section A, the starting point of the route, had a low gaze intensity, while section B had the highest; this indicates that the subjects needed time to adapt to the experimental video and explored the space in a stable way only from section B onward. Regarding the spatial characteristics of the gaze-concentrated areas, the display formats of the stores on the right received greater attention in 4 of 6 sections. Consumers' gaze was mostly focused on props, and the largest amount of gaze information was concentrated on the stores' showcase display formats. In conclusion, this analysis method can provide highly useful design data about merchandise display and arrangement based on consumers' visual preferences.
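The sectional analysis can be pictured with a small, hypothetical aggregation: bin fixation durations by route section, normalize, and split the scores into five intensity ranges. The section labels, durations, and intensity definition below are illustrative assumptions, not the study's data.

```python
# Hypothetical sketch: per-section gaze intensity in five ranges.
import numpy as np

def gaze_intensity_by_section(fixations, sections):
    """fixations: (section_label, duration_ms) pairs; sections: ordered labels."""
    totals = {s: 0.0 for s in sections}
    for label, dur in fixations:
        totals[label] += dur
    scores = np.array([totals[s] for s in sections])
    scores = scores / scores.sum()                  # normalized gaze intensity
    # Five equal-width intensity ranges (1 = lowest, 5 = highest).
    edges = np.linspace(scores.min(), scores.max(), 6)
    ranges = np.clip(np.digitize(scores, edges, right=True), 1, 5)
    return dict(zip(sections, zip(scores, ranges)))

print(gaze_intensity_by_section(
    [("A", 120), ("B", 900), ("B", 300), ("C", 450)], ["A", "B", "C"]))
```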

Real Time Eye and Gaze Tracking

  • Park, Ho-Sik;Nam, Kee-Hwan;Cho, Hyeon-Seob;Ra, Sang-Dong;Bae, Cheol-Soo
    • Proceedings of the IEEK Conference
    • /
    • 2004.08c
    • /
    • pp.857-861
    • /
    • 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.
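A GRNN is Nadaraya-Watson kernel regression over stored calibration samples, which is why the mapping need not be an analytical function. A minimal sketch, with an assumed feature layout (pupil parameters plus a head-distance cue) and an arbitrary kernel width:

```python
# Minimal GRNN sketch: pupil parameters in, screen coordinates out.
import numpy as np

class GRNN:
    def __init__(self, sigma=0.1):
        self.sigma = sigma

    def fit(self, X, Y):
        # GRNN "training" just memorizes the calibration samples.
        self.X, self.Y = np.asarray(X, float), np.asarray(Y, float)
        return self

    def predict(self, x):
        # Nadaraya-Watson estimate: kernel-weighted average of stored targets.
        d2 = np.sum((self.X - np.asarray(x, float)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))
        return (w @ self.Y) / w.sum()

# Each sample: assumed pupil parameters (e.g. normalized pupil-glint vector,
# head-distance cue) -> screen (x, y) in pixels.
X = [[0.10, 0.20, 0.9], [0.40, 0.25, 0.9], [0.15, 0.60, 1.1]]
Y = [[160, 120], [960, 150], [200, 700]]
gaze = GRNN(sigma=0.2).fit(X, Y)
print(gaze.predict([0.12, 0.22, 0.9]))   # a point near (160, 120)
```

Because the samples, not the weights, carry the model, new calibration data can be appended without retraining, and head pose enters simply as extra input dimensions.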

Method for Automatic Switching Screen of OST-HMD using Gaze Depth Estimation (시선 깊이 추정 기법을 이용한 OST-HMD 자동 스위칭 방법)

  • Lee, Youngho;Shin, Choonsung
    • Smart Media Journal
    • /
    • v.7 no.1
    • /
    • pp.31-36
    • /
    • 2018
  • In this paper, we propose a method for automatically switching the OST-HMD screen on and off using a gaze depth estimation technique. The proposed method uses a multi-layer perceptron (MLP) to learn the relationship between the user's gaze information and the distance of the corresponding object, and then estimates the distance from new gaze information. In the learning phase, eye-related features are obtained using a wearable eye tracker; these features are fed into the MLP for training and model generation. In the inference step, eye-related features obtained from the eye tracker in real time are input to the MLP to obtain the estimated depth value. Finally, this estimate is used to determine whether to turn the HMD display on or off. A prototype was implemented and experiments were conducted to evaluate the feasibility of the proposed method.
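A hedged sketch of the two phases using scikit-learn: train an MLP on (eye features, object distance) pairs, then threshold the estimated depth to switch the display. The feature choice, the 1 m threshold, and `set_display()` are illustrative stand-ins, not the paper's configuration.

```python
# Sketch: MLP gaze-depth regression driving an on/off display switch.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Assumed training data: eye features (e.g. vergence angle, pupil diameter)
# paired with the true distance of the fixated object in metres.
X_train = np.array([[4.1, 3.2], [2.0, 3.5], [0.8, 3.9], [3.5, 3.3]])
y_train = np.array([0.4, 1.2, 3.0, 0.6])

mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
mlp.fit(X_train, y_train)

def set_display(on):                      # placeholder for the HMD driver call
    print("display", "ON" if on else "OFF")

def on_gaze_sample(features, threshold_m=1.0):
    depth = mlp.predict([features])[0]    # inference step from the abstract
    set_display(depth < threshold_m)      # near gaze -> show the OST-HMD screen

on_gaze_sample([3.8, 3.3])   # near fixation, expected: display ON
on_gaze_sample([0.9, 3.8])   # far fixation, expected: display OFF
```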

3D View Controlling by Using Eye Gaze Tracking in First Person Shooting Game (1 인칭 슈팅 게임에서 눈동자 시선 추적에 의한 3차원 화면 조정)

  • Lee, Eui-Chul;Cho, Yong-Joo;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society
    • /
    • v.8 no.10
    • /
    • pp.1293-1305
    • /
    • 2005
  • In this paper, we propose a method of controlling the gaze direction of a 3D FPS game character using eye gaze detection from successive images captured by a USB camera attached beneath an HMD. The proposed method consists of three parts. In the first part, we detect the user's pupil center with a real-time image processing algorithm applied to the successive input images. In the second part, calibration, the geometric relationship is determined between positions gazed at on the monitor and the corresponding detected eye positions. In the last part, the final gaze position on the HMD monitor is tracked and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by game players with disabilities who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with that of the game character.
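The calibration part can be sketched as fitting a low-order polynomial from detected pupil centers to monitor coordinates, then converting the gazed point into camera yaw/pitch. The bilinear model, screen size, and field of view below are assumptions rather than the paper's exact procedure.

```python
# Sketch: pupil-to-screen calibration, then screen point -> FPS view angles.
import numpy as np

def poly_features(p):
    x, y = p
    return [1.0, x, y, x * y]            # bilinear model, 4 unknowns per axis

def fit_calibration(pupil_pts, screen_pts):
    A = np.array([poly_features(p) for p in pupil_pts])
    return np.linalg.lstsq(A, np.array(screen_pts, float), rcond=None)[0]

def gaze_to_view(pupil, coef, w=1024, h=768, fov_deg=90):
    sx, sy = np.array(poly_features(pupil)) @ coef
    # Map the gazed screen point to a yaw/pitch offset for the game camera.
    yaw = (sx / w - 0.5) * fov_deg
    pitch = (0.5 - sy / h) * fov_deg * h / w
    return yaw, pitch

# 4-point calibration: pupil centers observed while gazing at screen corners.
pupil = [(210, 180), (420, 185), (205, 330), (415, 340)]
screen = [(0, 0), (1024, 0), (0, 768), (1024, 768)]
coef = fit_calibration(pupil, screen)
print(gaze_to_view((315, 260), coef))    # near the screen centre -> small angles
```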

Human-Computer Interface using the Eye-Gaze Direction (눈의 응시 방향을 이용한 인간-컴퓨터간의 인터페이스에 관한 연구)

  • Kim, Do-Hyoung;Kim, Jea-Hean;Chung, Myung-Jin
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.38 no.6
    • /
    • pp.46-56
    • /
    • 2001
  • In this paper, we propose an efficient approach to real-time eye-gaze tracking from an image sequence and magnetic sensory information. The inputs to the eye-gaze tracking system are images taken by a camera and data from a magnetic sensor. These measurements are sufficient to describe both eye and head movement, because the camera and the receiver of the magnetic sensor are stationary with respect to the head. Experimental results show the validity of the proposed system for real-time applications and its feasibility as a new computer interface replacing the mouse.
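The fusion described reduces to composing two transforms: the magnetic sensor supplies the head pose in world coordinates, and the head-fixed camera supplies the eye-gaze direction in head coordinates. A minimal sketch under assumed frame conventions:

```python
# Sketch: compose head pose (magnetic sensor) with eye direction (camera).
import numpy as np

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix from yaw/pitch/roll in radians (world <- head)."""
    cy, sy, cp, sp, cr, sr = (np.cos(yaw), np.sin(yaw), np.cos(pitch),
                              np.sin(pitch), np.cos(roll), np.sin(roll))
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def world_gaze(head_pos, head_ypr, eye_dir_head):
    """Return the gaze ray (origin, unit direction) in world coordinates."""
    d = rot_zyx(*head_ypr) @ np.asarray(eye_dir_head, float)
    return np.asarray(head_pos, float), d / np.linalg.norm(d)

origin, direction = world_gaze(
    head_pos=[0.0, 0.0, 1.6],           # metres, from the magnetic receiver
    head_ypr=np.radians([10, -5, 0]),   # yaw/pitch/roll of the head
    eye_dir_head=[0.05, 0.02, 1.0])     # gaze direction from the eye images
print(origin, direction)
```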

  • PDF

Eye Gaze Tracking System Under Natural Head Movements (머리 움직임이 자유로운 안구 응시 추정 시스템)

  • Sked, Matthew;Ji, Qiang
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.41 no.5
    • /
    • pp.57-64
    • /
    • 2004
  • We propose an eye gaze tracking system that works under natural head movements. It consists of one narrow field-of-view CCD camera, two mirrors whose reflection angles are controlled, and active infrared illumination. The mirror angles are computed with geometric and linear-algebraic calculations that keep the pupil image on the optical axis of the camera. Our system allows the subject's head to move 90 cm horizontally and 60 cm vertically, and the spatial resolutions are about 6° and 7°, respectively. The frame rate for estimating gaze points is 10-15 frames/sec. As the gaze mapping function, we use hierarchical generalized regression neural networks (H-GRNN) based on a two-pass GRNN. Gaze accuracy was 94% with H-GRNN, a 9% improvement over the 85% of plain GRNN, even when the head or face was slightly rotated. Our system does not have a high spatial gaze resolution, but it allows natural head movements while keeping gaze tracking robust and accurate. In addition, there is no need to re-calibrate the system when the subject changes.
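The two-pass idea behind H-GRNN can be sketched compactly: a first GRNN gives a coarse gaze estimate, and a second GRNN trained on the first stage's residuals refines it. The kernel widths and features below are illustrative assumptions (see the GRNN sketch earlier in this list).

```python
# Sketch of a two-pass (hierarchical) GRNN: coarse estimate + residual refinement.
import numpy as np

def grnn_predict(X, Y, x, sigma):
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * sigma ** 2))
    return (w @ Y) / w.sum()

def hgrnn_fit_predict(X, Y, x, s1=0.3, s2=0.1):
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    # Pass 1: coarse mapping, evaluated on every training sample.
    stage1 = np.array([grnn_predict(X, Y, xi, s1) for xi in X])
    resid = Y - stage1                    # what the first pass got wrong
    # Pass 2: a residual GRNN sharpens the coarse estimate at query time.
    return grnn_predict(X, Y, x, s1) + grnn_predict(X, resid, x, s2)

X = np.array([[0.1, 0.2], [0.5, 0.2], [0.1, 0.7], [0.5, 0.7]])
Y = np.array([[100, 100], [900, 100], [100, 700], [900, 700]])
print(hgrnn_fit_predict(X, Y, np.array([0.3, 0.45])))
```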

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions: Part B
    • /
    • v.16B no.3
    • /
    • pp.203-214
    • /
    • 2009
  • In this paper, we compute the user's gaze position simply from the 2D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without considering the complex 3D relations among the camera, monitor, and pupil coordinates. The objectives of this paper are therefore to detect the pupil center and the four corneal specular reflections exactly, and to compensate for the error factors that affect gaze accuracy. In our method, we compensate for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector by performing a one-time user calibration when the system starts. We also use a Kalman filter to robustly detect the four corneal specular reflections that are essential for calculating the gaze position, irrespective of abrupt changes in eye movement. Experimental results showed a gaze detection error of about 1.0 degree even under abrupt eye movements.
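Because the four glints are reflections of LEDs at the monitor corners, the 2D relation amounts to a projective map from the glint quadrilateral to the screen. A sketch, with the Kalman-filtered glint coordinates assumed as input and a fixed offset standing in for the kappa calibration; all coordinates are illustrative.

```python
# Sketch: pupil centre -> screen point via the glint-to-screen homography.
import numpy as np
import cv2

glints = np.float32([[302, 215], [341, 213], [303, 242], [343, 241]])  # TL,TR,BL,BR
screen = np.float32([[0, 0], [1680, 0], [0, 1050], [1680, 1050]])
H = cv2.getPerspectiveTransform(glints, screen)

def gaze_point(pupil_xy, kappa_offset=(12.0, -8.0)):
    """Map the detected pupil centre to screen coordinates via the glint
    homography, then apply the (assumed) one-time kappa calibration offset."""
    p = cv2.perspectiveTransform(np.float32([[pupil_xy]]), H)[0, 0]
    return p[0] + kappa_offset[0], p[1] + kappa_offset[1]

print(gaze_point((322.0, 228.0)))   # pupil mid-quad -> near the screen centre
```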

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • Lee, Young-Sik;Bae, Cheol-Soo
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.8 no.2
    • /
    • pp.477-483
    • /
    • 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real-time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

Development of a Non-contact Input System Based on User's Gaze-Tracking and Analysis of Input Factors

  • Lim, Jiyoung;Lee, Seonjae;Kim, Junbeom;Kim, Yunseo;Jeong, Hae-Duck Joshua
    • Korean Journal of Artificial Intelligence
    • /
    • v.11 no.1
    • /
    • pp.9-15
    • /
    • 2023
  • As mobile devices such as smartphones, tablets, and kiosks become increasingly prevalent, there is growing interest in developing alternative input systems to complement traditional tools such as keyboards and mice. Many people use their own bodies as pointers to enter simple information on such devices. However, contact-based methods have limitations: psychological factors make physical contact unappealing, especially during a pandemic, and the input is exposed to shoulder-surfing attacks. To overcome these limitations, we propose a simple information input system that uses gaze-tracking technology to input passwords and control web surfing with non-contact gaze alone. The proposed system recognizes an input when the user stares at a specific location on the screen in real time, using intelligent gaze-tracking technology. We analyze the relationship between the gaze input box, gaze time, and average input time, and report experimental results on how varying the size of the gaze input box and the required gaze time affects reaching 100% input accuracy. This paper demonstrates the effectiveness of our system in mitigating the challenges of contact-based input methods and in providing a non-contact alternative that is both secure and convenient.
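A dwell-time selector is the natural reading of this input scheme: an input box fires when gaze stays inside it for the configured gaze time. The box size and dwell threshold, the two factors the paper analyzes, are the sketch's parameters; the values below are illustrative assumptions.

```python
# Sketch: dwell-based gaze input. A box "clicks" after sustained gaze.
from dataclasses import dataclass

@dataclass
class GazeButton:
    x: float; y: float; size: float      # square input box, pixels
    dwell_ms: float = 800.0              # required gaze time
    _held_ms: float = 0.0

    def update(self, gx, gy, dt_ms):
        """Feed one gaze sample; return True when the dwell completes."""
        inside = (self.x <= gx <= self.x + self.size and
                  self.y <= gy <= self.y + self.size)
        self._held_ms = self._held_ms + dt_ms if inside else 0.0
        if self._held_ms >= self.dwell_ms:
            self._held_ms = 0.0          # reset so the button fires once
            return True
        return False

key = GazeButton(x=100, y=100, size=120)   # larger boxes -> fewer missed inputs
for t in range(30):                        # 30 samples at ~33 ms (~1 s of gaze)
    if key.update(gx=150, gy=160, dt_ms=33.3):
        print("input registered at sample", t)
```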