• Title/Summary/Keyword: Gaze

Deep Learning-based Gaze Direction Vector Estimation Network Integrated with Eye Landmark Localization (딥 러닝 기반의 눈 랜드마크 위치 검출이 통합된 시선 방향 벡터 추정 네트워크)

  • Joo, Heeyoung;Ko, Min-Soo;Song, Hyok
    • Journal of Broadcast Engineering, v.26 no.6, pp.748-757, 2021
  • In this paper, we propose a gaze estimation network in which eye landmark detection and gaze direction vector estimation are integrated into a single deep learning network. The proposed network uses the Stacked Hourglass Network as its backbone and consists of three parts: a landmark detector, a feature map extractor, and a gaze direction estimator. The landmark detector estimates the coordinates of 50 eye landmarks, the feature map extractor generates a feature map of the eye image for estimating gaze direction, and the gaze direction estimator combines the two outputs to estimate the final gaze direction vector. The network was trained on virtual synthetic eye images and landmark coordinates generated with the UnityEyes dataset, and the MPIIGaze dataset of real human eye images was used for performance evaluation. In experiments, the network achieved a gaze estimation error of 3.9 and an estimation speed of 42 FPS (frames per second).
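The three-part decomposition described above can be sketched as a simple data-flow composition. This is only a structural sketch; the module names and signatures are hypothetical, not the paper's implementation:

```python
def gaze_network(eye_image, landmark_detector, feature_extractor, gaze_estimator):
    """Structural sketch of the integrated network: landmark coordinates and
    an appearance feature map are produced from the same input image and
    combined by the gaze estimator into a single gaze direction vector."""
    landmarks = landmark_detector(eye_image)       # e.g. 50 (x, y) eye landmarks
    feature_map = feature_extractor(eye_image)     # appearance features
    return gaze_estimator(landmarks, feature_map)  # final gaze direction vector
```

In the actual network all three parts share one backbone and are trained jointly; the sketch only shows how their outputs combine.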

Use of gaze entropy to evaluate situation awareness in emergency accident situations of nuclear power plant

  • Lee, Yejin;Jung, Kwang-Tae;Lee, Hyun-Chul
    • Nuclear Engineering and Technology, v.54 no.4, pp.1261-1270, 2022
  • This study investigated the possibility of using gaze entropy to evaluate an operator's situation awareness in an emergency accident situation of a nuclear power plant. Gaze entropy can be an effective measure of an operator's situation awareness because it expresses gaze movement as a single comprehensive number. To determine the relationship between situation awareness and gaze entropy, an experiment measured both using simulators created for the emergency accident situations LOCA, SGTR, SLB, and LOV; participants had to judge which accident situation the simulator presented. The results showed that Shannon entropy, dwell-time entropy, and Markov entropy each had a significant negative correlation with situation awareness, while visual attention entropy (VAE) showed no significant correlation. This suggests that Shannon, dwell-time, and Markov entropy can be used as measures of situation awareness.
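The abstract does not give the exact formulas, but stationary (Shannon) gaze entropy and transition (Markov) gaze entropy over areas of interest (AOIs) are standard quantities. A minimal sketch, assuming fixations have already been assigned to AOI labels:

```python
import math
from collections import Counter

def shannon_gaze_entropy(fixation_aois):
    """Stationary gaze entropy: -sum(p_i * log2(p_i)) over AOI visit proportions."""
    counts = Counter(fixation_aois)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def markov_gaze_entropy(fixation_aois):
    """Transition (Markov) gaze entropy: conditional entropy of the next AOI
    given the current one, weighted by how often each source AOI occurs."""
    transitions = Counter(zip(fixation_aois, fixation_aois[1:]))
    total = sum(transitions.values())
    by_src = {}
    for (src, dst), c in transitions.items():
        by_src.setdefault(src, []).append(c)
    h = 0.0
    for src, cs in by_src.items():
        n = sum(cs)
        h += (n / total) * -sum((c / n) * math.log2(c / n) for c in cs)
    return h

# Evenly spread gaze gives high entropy; gaze locked on one AOI gives zero.
scattered = ["A", "B", "C", "D"] * 10
focused = ["A"] * 40
```

Higher entropy means more dispersed, less systematic scanning, which is why these measures correlated negatively with situation awareness in the study.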

Non-intrusive Calibration for User Interaction based Gaze Estimation (사용자 상호작용 기반의 시선 검출을 위한 비강압식 캘리브레이션)

  • Lee, Tae-Gyun;Yoo, Jang-Hee
    • Journal of Software Assessment and Valuation, v.16 no.1, pp.45-53, 2020
  • In this paper, we describe a new method for acquiring calibration data from the user interactions that occur continuously during web browsing, so that calibration is performed naturally while the user's gaze is being estimated. The proposed non-intrusive calibration is a tuning process that adapts a pre-trained gaze estimation model to a new user using the data obtained this way. To achieve this, a generalized CNN model for gaze estimation is trained first; non-intrusive calibration then adapts it quickly to new users through online learning. In experiments, the gaze estimation model was calibrated with various combinations of user interactions to compare performance, and accuracy improved over existing methods.
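The abstract does not specify the adaptation rule. As a minimal sketch of the idea, click positions gathered during normal browsing can serve as weak calibration labels for fitting a per-user linear correction on top of a generic estimator (all names and the per-axis linear model are assumptions, not the paper's method):

```python
def fit_axis_correction(predicted, actual):
    """Least-squares fit of actual ~ a * predicted + b for one screen axis."""
    n = len(predicted)
    mx = sum(predicted) / n
    my = sum(actual) / n
    sxx = sum((x - mx) ** 2 for x in predicted)
    sxy = sum((x - mx) * (y - my) for x, y in zip(predicted, actual))
    a = sxy / sxx
    b = my - a * mx
    return a, b

class NonIntrusiveCalibrator:
    """Collects (generic-model gaze, click position) pairs during normal
    interaction and fits a per-user linear correction per screen axis."""
    def __init__(self):
        self.samples = []  # [(gaze_x, gaze_y, click_x, click_y), ...]
        self.cal = None    # ((ax, bx), (ay, by)) once fitted

    def observe_click(self, gaze_xy, click_xy):
        self.samples.append((*gaze_xy, *click_xy))
        if len(self.samples) >= 5:  # refit once enough weak labels accumulate
            gx, gy, cx, cy = zip(*self.samples)
            self.cal = (fit_axis_correction(gx, cx), fit_axis_correction(gy, cy))

    def correct(self, gaze_xy):
        if self.cal is None:
            return gaze_xy
        (ax, bx), (ay, by) = self.cal
        return (ax * gaze_xy[0] + bx, ay * gaze_xy[1] + by)
```

The paper instead fine-tunes the CNN itself through online learning; the sketch only illustrates the non-intrusive data collection loop.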

3D Interaction Technique on Stereo Display System

  • Kwon, Yong-Moo;Ki, Jeong-Seok;Jeon, Kyeong-Won;Kim, Sung-Kyu
    • Proceedings of the Korean Information Display Society Conference, 2007.08b, pp.1235-1238, 2007
  • There has been considerable research on 2D gaze tracking techniques for 2D screens in human-computer interaction. However, gaze-based interaction with stereo images or 3D content has scarcely been reported. This paper presents a gaze-based 3D interaction technique on an autostereoscopic display system.

Smartphone Addiction Detection Based Emotion Detection Result Using Random Forest (랜덤 포레스트를 이용한 감정인식 결과를 바탕으로 스마트폰 중독군 검출)

  • Lee, Jin-Kyu;Kang, Hyeon-Woo;Kang, Hang-Bong
    • Journal of IKEEE, v.19 no.2, pp.237-243, 2015
  • Recently, eight out of ten people in Korea own a smartphone, and the number of smartphone applications has grown rapidly, so smartphone addiction has become a social issue. Many people with smartphone addiction cannot control themselves, and some do not even realize they are addicted. Many studies, mostly surveys such as the S-measure, have been conducted to diagnose smartphone addiction. In this paper, we suggest how to detect smartphone addiction based on ECG and eye gaze. We measure ECG signals with the Shimmer sensor and eye gaze signals with the Smart Eye tracker while subjects watch emotional videos. We extract features from the S-transform of the ECG, and 12 features from the eye gaze signals (pupil diameter, gaze distance, eye blinking). A classifier trained with Random Forest then detects smartphone addiction from the ECG and eye gaze signals. Compared with S-measure survey results collected before the test, detection accuracy was 87.89% for ECG and 60.25% for eye gaze.
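The abstract names three eye-gaze signal families (pupil diameter, gaze distance, eye blinking) but not the 12 concrete features. A toy sketch of one plausible feature per family, purely illustrative:

```python
import math

def eye_gaze_features(pupil_diam, gaze_pts, blink_flags):
    """Toy features from the three signal families the paper names; the
    actual 12 features are not specified in the abstract."""
    n = len(pupil_diam)
    mean_d = sum(pupil_diam) / n
    std_d = math.sqrt(sum((d - mean_d) ** 2 for d in pupil_diam) / n)
    # total gaze path length between successive gaze points
    path = sum(math.dist(p, q) for p, q in zip(gaze_pts, gaze_pts[1:]))
    # count blink onsets (False -> True transitions)
    blinks = sum(1 for a, b in zip(blink_flags, blink_flags[1:]) if b and not a)
    return {"pupil_mean": mean_d, "pupil_std": std_d,
            "gaze_path": path, "blink_count": blinks}
```

Feature vectors of this kind would then be fed to the Random Forest classifier alongside the ECG S-transform features.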

Eye Gaze Tracking System Under Natural Head Movements (머리 움직임이 자유로운 안구 응시 추정 시스템)

  • ;Matthew, Sked;Qiang, Ji
    • Journal of the Institute of Electronics Engineers of Korea SC, v.41 no.5, pp.57-64, 2004
  • We propose an eye gaze tracking system that works under natural head movements, consisting of one narrow-field-of-view CCD camera, two mirrors whose reflection angles are controlled, and active infrared illumination. The mirror angles are computed by geometric and linear-algebra calculations that keep the pupil image on the optical axis of the camera. The system allows the subject's head to move 90 cm horizontally and 60 cm vertically, with spatial resolutions of about 6° and 7°, respectively, and estimates gaze points at 10-15 frames/sec. As the gaze mapping function, we used hierarchical generalized regression neural networks (H-GRNN) based on a two-pass GRNN. Gaze accuracy was 94% with H-GRNN, a 9-point improvement over the 85% of plain GRNN, even when the head or face was slightly rotated. The system does not have a high spatial gaze resolution, but it allows natural head movements while providing robust and accurate gaze tracking, and it does not need to be re-calibrated when subjects change.
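A GRNN is essentially Gaussian-kernel-weighted regression over stored calibration samples (the Nadaraya-Watson estimator). A minimal single-stage sketch, assuming pupil-feature inputs and screen-coordinate outputs (the paper's two-pass hierarchical variant is not detailed in the abstract):

```python
import math

def grnn_predict(train_x, train_y, x, sigma=1.0):
    """Generalized regression neural network: output is the Gaussian-kernel
    weighted average of the training outputs, weighted by input distance."""
    weights = [math.exp(-sum((a - b) ** 2 for a, b in zip(tx, x)) / (2 * sigma ** 2))
               for tx in train_x]
    total = sum(weights)
    dims = len(train_y[0])
    return tuple(sum(w * ty[d] for w, ty in zip(weights, train_y)) / total
                 for d in range(dims))
```

Because the estimator interpolates smoothly between stored samples, a new user's calibration reduces to collecting sample pairs, which fits the paper's claim that no system re-calibration is needed when subjects change.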

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions: Part B, v.16B no.3, pp.203-214, 2009
  • In this paper, we compute the user's gaze position simply from the 2D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without modeling the complex 3D relations among the camera, monitor, and pupil coordinates. The objectives are therefore to detect the pupil center and the four corneal specular reflections accurately and to compensate for the error factors that affect gaze accuracy. Our method compensates for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector, using a one-time user calibration performed at system startup. We also robustly detect the four corneal specular reflections, which are essential for calculating the gaze position, using a Kalman filter that tolerates abrupt eye movements. Experimental results showed a gaze detection error of about 1.0 degree even under abrupt eye movements.
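The core 2D idea is that the four glints are images of the monitor corners, so the pupil's normalized position inside the glint quadrilateral maps directly to a screen position. A simplified sketch, assuming the glints form an approximately axis-aligned rectangle in the eye image and ignoring the kappa compensation the paper adds:

```python
def gaze_from_glints(pupil, glints, screen_w, screen_h):
    """Map the pupil center to a screen position using the four corneal
    glints (images of IR LEDs at the monitor corners)."""
    xs = [g[0] for g in glints]
    ys = [g[1] for g in glints]
    # normalized pupil position within the glint rectangle, clamped to [0, 1]
    u = (pupil[0] - min(xs)) / (max(xs) - min(xs))
    v = (pupil[1] - min(ys)) / (max(ys) - min(ys))
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return (u * screen_w, v * screen_h)
```

The Kalman filter in the paper keeps the four glint positions stable frame to frame, so this mapping stays valid even during abrupt eye movements.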

A Study on the Hangul Input Methodology for Eye-gaze Interface (시선 입력 장치에 의한 한글 입력 시스템 설계에 관한 연구)

  • Seo Han-Sok;Kim Chee-Yong
    • Journal of Digital Contents Society, v.5 no.3, pp.239-244, 2004
  • New developments in IT already affect wide segments of a young and mobile population, and applications of information technology can be of equal benefit to the aged and the disabled. 'Eye-Gaze' (EGI) technology was designed for people with paralysis in the upper body, and there is a compelling need for a dedicated Korean-language interface for this system. The purpose of this study is to research 'barrier-free' software, using a control group of mobility-impaired users to assess the Eye-Gaze Interface against more conventional input methods. The EGI in this study uses the Quick Glance System from Eye Tech Digital Systems. The study is evaluated on criteria based on the needs of those with specific disabilities and mobility problems associated with aging. We also intend to explore applications of the Eye-Gaze Interface for English and Japanese devices, based on our study using Hangul phonology.

3D First Person Shooting Game by Using Eye Gaze Tracking (눈동자 시선 추적에 의한 3차원 1인칭 슈팅 게임)

  • Lee, Eui-Chul;Park, Kang-Ryoung
    • The KIPS Transactions: Part B, v.12B no.4 s.100, pp.465-472, 2005
  • In this paper, we propose a method for controlling the gaze direction of a 3D FPS game character by detecting eye gaze from successive images captured by a USB camera attached beneath an HMD. The proposed method consists of three parts. First, we detect the user's pupil center with a real-time image processing algorithm on the successive input images. Second, in a calibration step, the geometric relationship between the gazed position on the monitor and the detected pupil center is determined while the user gazes at the monitor plane. Finally, the gaze position on the HMD monitor is tracked, and the 3D view in the game is controlled by that gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands, and that synchronizing the player's gaze direction with the game character's view direction can increase interest and immersion.

Adaptive Zoom-based Gaze Tracking for Enhanced Accuracy and Precision (정확도 및 정밀도 향상을 위한 적응형 확대 기반의 시선 추적 기법)

  • Song, Hyunjoo;Jo, Jaemin;Kim, Bohyoung;Seo, Jinwook
    • KIISE Transactions on Computing Practices, v.21 no.9, pp.610-615, 2015
  • The accuracy and precision of video-based remote gaze trackers are affected by numerous factors (e.g., head movement of the participant). However, it is challenging to control all influencing factors, and doing so (e.g., using a chin-rest to fix the geometry) can sacrifice the main benefit of gaze trackers: the ecological validity of their unobtrusive nature. We propose an adaptive zoom-based gaze tracking technique, ZoomTrack, that addresses this problem by improving the resolution of the gaze tracking results. Our approach magnifies a region of interest (ROI) and retrieves gaze points at a higher resolution under two zooming modes: only when the gaze reaches the ROI (temporary) or whenever the participant stares at the stimuli (omnipresent). We compared these against a base case without magnification in a user study and use the results to summarize the advantages and limitations of our technique.
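The gain from zooming comes from a simple coordinate transform: the tracker's angular error is fixed on the physical screen, so gaze points measured on a magnified ROI view shrink by the zoom factor when mapped back to stimulus space. A sketch of that mapping (names and the screen-aligned ROI are assumptions):

```python
def unzoom_gaze(gaze_zoomed, roi_origin, zoom):
    """Map a gaze point measured on a magnified ROI view back into original
    stimulus coordinates. Since the tracker's error is fixed in screen
    space, dividing by the zoom factor shrinks the error in stimulus space
    by the same factor."""
    return (roi_origin[0] + gaze_zoomed[0] / zoom,
            roi_origin[1] + gaze_zoomed[1] / zoom)
```

For example, with a 4x zoom, a 40-pixel on-screen error corresponds to only a 10-pixel error in the original stimulus.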