• Title/Summary/Keyword: 실시간 시선추적 (real-time eye tracking)


Development of Eye Protection App using Realtime Eye Tracking and Distance Measurement Method (실시간 시선 추적과 거리 측정 기법을 활용한 눈 보호 앱 개발)

  • Lee, Hye-Ran;Lee, Jun Pyo
    • Proceedings of the Korean Society of Computer Information Conference / 2019.07a / pp.223-224 / 2019
  • In this paper, we propose an application called "i-eye" that collects and analyzes data obtained from a camera's real-time video and provides general users with information on their actual smartphone usage, optimal screen presentation, and dry-eye risk, thereby enabling eye-health management. The proposed app runs on modern smartphones and is built around three core techniques: eye-gaze tracking, image distance measurement, and eye data analysis.
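
The abstract does not specify how the image-distance measurement works; one common single-camera approach is the pinhole model combined with an assumed average inter-pupillary distance. The sketch below illustrates only that idea; the focal length and the 63 mm IPD constant are assumptions for illustration, not values from the paper.

```python
# Minimal sketch: approximate viewer-to-camera distance from the front camera
# using the pinhole model. NOT the paper's method; the focal length and the
# 63 mm average inter-pupillary distance (IPD) are illustrative assumptions.

AVG_IPD_MM = 63.0          # assumed average adult inter-pupillary distance
FOCAL_LENGTH_PX = 1400.0   # assumed camera focal length in pixels (device-specific)

def estimate_distance_mm(left_eye_px, right_eye_px):
    """Estimate face-to-camera distance from detected eye centers (pixels)."""
    dx = right_eye_px[0] - left_eye_px[0]
    dy = right_eye_px[1] - left_eye_px[1]
    ipd_px = (dx * dx + dy * dy) ** 0.5
    if ipd_px <= 0:
        raise ValueError("eye centers must be distinct")
    # Pinhole model: distance = focal_length * real_size / image_size
    return FOCAL_LENGTH_PX * AVG_IPD_MM / ipd_px

if __name__ == "__main__":
    # Eyes detected about 180 px apart -> roughly 49 cm from the camera
    print(round(estimate_distance_mm((400, 520), (580, 516))))
```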


Robust Gaze-Fixing of an Active Vision System under Variation of System Parameters (시스템 파라미터의 변동 하에서도 강건한 능동적인 비전의 시선 고정)

  • Han, Youngmo
    • KIPS Transactions on Software and Data Engineering / v.1 no.3 / pp.195-200 / 2012
  • Camera steering is performed based on the system parameters of the vision system. However, the parameter values at the time the system is used may differ from those at the time they were measured. To compensate for this problem, this paper proposes a gaze-steering method based on LMI (Linear Matrix Inequality) that is robust to variations in the system parameters of the vision system. Simulation results show that the proposed method produces smaller gaze-tracking error than a contemporary linear method and more stable gaze-tracking error than a contemporary nonlinear method. Moreover, the proposed method is fast enough for real-time processing.
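
The abstract does not give the LMI formulation itself. As a generic illustration of the LMI machinery such a method relies on, the sketch below uses CVXPY to search for a common Lyapunov matrix P over the vertices of a hypothetical parameter-uncertainty polytope; it is not the paper's gaze-steering LMI, and the vertex matrices are made up.

```python
# Illustrative only: a generic LMI feasibility problem solved with CVXPY.
# The paper's gaze-steering LMI is not given in the abstract; this sketch shows
# the kind of computation involved: find a common Lyapunov matrix P for a family
# of system matrices A_i (vertices of an assumed parameter-uncertainty polytope).
import numpy as np
import cvxpy as cp

# Hypothetical vertex matrices of an uncertain 2-state camera-steering model.
A_vertices = [np.array([[0.0, 1.0], [-2.0, -1.0]]),
              np.array([[0.0, 1.0], [-3.0, -1.5]])]

n = 2
P = cp.Variable((n, n), symmetric=True)
eps = 1e-3
constraints = [P >> eps * np.eye(n)]
for A in A_vertices:
    # A^T P + P A << 0 at every vertex -> quadratic stability under parameter variation
    constraints.append(A.T @ P + P @ A << -eps * np.eye(n))

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status)
print(P.value)
```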

Technical Survey on the Real Time Eye-tracking Pointing Device as a Smart Medical Equipment (실시간 시선 추적기반 스마트 의료기기 고찰)

  • Park, Junghoon;Yim, Kangbin
    • Smart Media Journal / v.10 no.1 / pp.9-15 / 2021
  • The eye tracking system designed in this paper is an eye-based computer input device intended to give easy access to people with Lou Gehrig's disease or other muscle-related conditions. Combining the roughly 1,700 Lou Gehrig's patients estimated in Korea with those who cannot move their bodies because of various accidents or diseases, the potential demand for such a device alone amounts to about 30,000 users. Because these eye-input devices target such a small group of users, the commercial devices available on the market are expensive and difficult to use for these potential users, making them less accessible. In addition, each individual's economic situation and experience with smart devices differ, so a commercial eye tracking system is hard to reach in terms of both cost and usability. Attempts to improve accessibility to IT devices through low-cost yet easy-to-use technology are therefore essential. This paper proposes an eye tracking system with complementary, superior performance that can be used conveniently by far more people and patients by improving the deficiencies of existing systems. Based on voluntary VoCs (Voice of Customers) from users who have tried different kinds of eye tracking systems and on various usability tests, we propose a reduced system that cuts the amount of computation to 1/15 and keeps the eye-gaze tracking error within 0.5 to 1 degree.

A Study on the Characteristics of Consumer Visual-Perceptional Information Acquisition in Commercial Facilities in Regard to its Construction of Space from Real-Time Eye Gaze Tracking (상업시설 공간구성의 실시간 시선추적에 나타난 소비자 시지각 정보획득 특성 연구)

  • Park, Sunmyung
    • Science of Emotion and Sensibility / v.21 no.2 / pp.3-14 / 2018
  • To satisfy consumer needs, commercial facilities require a variety of sales-related spatial expressions and eye-catching product arrangements; space composition itself can serve as a direct marketing strategy. The human eye is the sensory organ that acquires the largest amount of information, and an analysis of visual information helps in understanding these visual relationships. However, existing studies have mostly focused on still frames of experimental images, and there is a lack of studies analyzing gaze information based on moving images of commercial spaces. Therefore, this study analyzed the emotional responses of space users through their gaze information, using a video of a movement route through a commercial facility. The analysis targeted the straight sections of the route; based on the acquired data, the sectional characteristics of five gaze-intensity ranges were examined. Section A, the starting point of the route, showed low gaze intensity, while section B showed the highest gaze intensity, indicating that the subjects needed time to adapt to the experimental video at first and explored the space in a stable manner from section B onward. Regarding the spatial characteristics of the gaze-concentrated areas, the display formats of the stores on the right received greater attention in four of the six sections. Consumers' gaze was mostly focused on props, and a large amount of gaze information was concentrated on the showcase display formats of the stores. In conclusion, this analysis method can provide highly useful design data on merchandise display and the arrangement of merchandise components based on consumers' visual preferences.
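
The analysis described above aggregates gaze samples per route section and classifies sections into five intensity ranges. A minimal sketch of how such an aggregation could be computed from raw eye-tracker samples is shown below; the column names, the fixation flag, and the five equal-width bins are illustrative assumptions, not the study's actual processing.

```python
# Minimal sketch of gaze-intensity aggregation per route section, in the spirit
# of the analysis described above. Column names, section assignments, and the
# five intensity bins are hypothetical placeholders, not the study's values.
import pandas as pd

# Each row: one gaze sample with a timestamp (s) and the route section it falls in.
samples = pd.DataFrame({
    "t":        [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9],
    "section":  ["A", "A", "A", "B", "B", "B", "B", "C", "C", "C"],
    "fixation": [0,   0,   1,   1,   1,   1,   1,   1,   0,   1],   # 1 = fixation sample
})

# Gaze intensity here = share of fixation samples per section (one possible proxy).
intensity = samples.groupby("section")["fixation"].mean()

# Classify each section into five equal-width intensity ranges (grade 1 = lowest).
grades = pd.cut(intensity, bins=5, labels=[1, 2, 3, 4, 5])
print(pd.DataFrame({"intensity": intensity, "grade": grades}))
```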

User-Calibration Free Gaze Tracking System Model (사용자 캘리브레이션이 필요 없는 시선 추적 모델 연구)

  • Ko, Eun-Ji;Kim, Myoung-Jun
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.5 / pp.1096-1102 / 2014
  • In remote gaze tracking systems using infrared LEDs, calibrating the positions of the reflected glints is essential for computing the pupil position in the captured images. However, there are limits to reducing the error because the varying head position and the unknown corneal radius enter the calibration process as constants. This study proposes a gaze tracking method based on pupil-corneal reflection that does not require user calibration. Our goal is to eliminate the glint-position correction process, which requires prior calibration, so that the gaze computation is simplified.
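
For context, pupil-corneal-reflection methods estimate gaze from the vector between the corneal glint produced by the IR LEDs and the pupil center, and then map that vector to screen coordinates; the paper's contribution is removing the per-user calibration of that mapping. The sketch below only illustrates the basic vector computation and an affine mapping with placeholder coefficients; it is not the calibration-free model proposed in the paper.

```python
# Minimal sketch of the pupil-corneal-reflection (PCCR) idea the abstract builds on:
# gaze is estimated from the vector between the corneal glint (IR reflection) and
# the pupil center. The mapping coefficients below are hypothetical placeholders;
# the paper's goal is precisely to avoid fitting such coefficients per user.
import numpy as np

def pupil_glint_vector(pupil_center, glint_center):
    """Vector from the glint to the pupil center, in image pixels."""
    return np.asarray(pupil_center, float) - np.asarray(glint_center, float)

def gaze_point(pg_vector, coeffs):
    """Map the pupil-glint vector to screen coordinates with an affine model."""
    vx, vy = pg_vector
    a, b, c, d, e, f = coeffs   # placeholder calibration coefficients
    return (a * vx + b * vy + c, d * vx + e * vy + f)

if __name__ == "__main__":
    # Example pupil and glint centers detected in an eye image (pixels).
    v = pupil_glint_vector(pupil_center=(312.4, 201.8), glint_center=(306.0, 210.5))
    print(gaze_point(v, coeffs=(60.0, 0.0, 960.0, 0.0, 60.0, 540.0)))
```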

A Study on Manipulating Method of 3D Game in HMD Environment by using Eye Tracking (HMD(Head Mounted Display)에서 시선 추적을 통한 3차원 게임 조작 방법 연구)

  • Park, Kang-Ryoung;Lee, Eui-Chul
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.2 / pp.49-64 / 2008
  • Recently, much research on building more comfortable input devices based on gaze detection technology has been carried out in the field of human-computer interfaces. However, system cost becomes high because of complicated hardware, and the complicated user-calibration procedure makes gaze detection systems difficult to use. In this paper, we propose a new gaze detection method based on 2D analysis and a simple user calibration. Our method uses a small USB (Universal Serial Bus) camera attached to an HMD (Head-Mounted Display), a hot mirror, and an IR (Infra-Red) illuminator. Because the HMD moves together with the user's face, we can implement a gaze detection system whose performance is not affected by facial movement. In addition, we apply our gaze detection system to a 3D first-person shooting game: the gaze direction of the game character is controlled by our gaze detection method, so the player can target and shoot enemy characters by gaze, which increases the immersion and interest of the game. Experimental results show that the game and the gaze detection system run at real-time speed on a single desktop computer with a gaze detection accuracy of 0.88 degrees, and that our gaze detection technology can replace the conventional mouse in the 3D first-person shooting game.
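
The "2D analysis with a simple user calibration" family of methods typically fits a low-order polynomial from pupil image coordinates to screen coordinates using a small set of calibration targets. The sketch below shows such a least-squares fit with made-up calibration data; it is a generic illustration, not the paper's exact procedure.

```python
# Generic sketch of a 2D polynomial calibration of the kind used by many
# image-based gaze systems: fit screen coordinates as a second-order polynomial
# of pupil image coordinates from a few calibration points. The sample data are
# made up; the paper's exact calibration procedure may differ.
import numpy as np

def design_matrix(px, py):
    """Second-order polynomial terms of the pupil position."""
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

# Hypothetical calibration data: pupil positions vs. known target positions.
pupil = np.array([[300, 200], [340, 200], [380, 200],
                  [300, 240], [340, 240], [380, 240],
                  [300, 280], [340, 280], [380, 280]], float)
screen = np.array([[160, 120], [960, 120], [1760, 120],
                   [160, 540], [960, 540], [1760, 540],
                   [160, 960], [960, 960], [1760, 960]], float)

A = design_matrix(pupil[:, 0], pupil[:, 1])
coef, *_ = np.linalg.lstsq(A, screen, rcond=None)   # one coefficient column per screen axis

def gaze_to_screen(px, py):
    return design_matrix(np.atleast_1d(float(px)), np.atleast_1d(float(py))) @ coef

print(gaze_to_screen(350, 230))   # pupil sample inside the calibration grid
```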

Effective real-time identification using Bayesian statistical methods gaze Network (베이지안 통계적 방안 네트워크를 이용한 효과적인 실시간 시선 식별)

  • Kim, Sung-Hong;Seok, Gyeong-Hyu
    • The Journal of the Korea institute of electronic communication sciences / v.11 no.3 / pp.331-338 / 2016
  • In this paper, we propose a GRNN (Generalized Regression Neural Network)-based eye and face recognition system for gaze identification, addressing the existing problem that facial movements make it difficult to identify the user's gaze. A Kalman filter that exploits structural information about facial features is used to verify that a detected face is genuine and to estimate its future location from the current head position, and the horizontal and vertical elements of the face are detected with a histogram analysis whose processing time is relatively fast. An infrared illuminator is configured so that the resulting glint allows the pupil to be detected and tracked in real time, from which feature vectors are extracted.
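
For context, the head-position prediction step the abstract mentions is commonly realized with a constant-velocity Kalman filter. The sketch below shows such a filter predicting the next face position from noisy per-frame measurements; the motion model and noise covariances are illustrative assumptions, not the paper's actual settings.

```python
# Minimal constant-velocity Kalman filter sketch for predicting the next head
# position from the current one, as the abstract describes. State = [x, y, vx, vy].
# All noise covariances are illustrative, not taken from the paper.
import numpy as np

dt = 1 / 30.0                                   # assumed 30 fps camera
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)             # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)             # only (x, y) is measured
Q = np.eye(4) * 1e-2                            # process noise (assumed)
R = np.eye(2) * 4.0                             # measurement noise (assumed)

x = np.zeros(4)                                 # initial state
P = np.eye(4) * 100.0                           # initial uncertainty

def kalman_step(x, P, z):
    # Predict where the head will be at the next frame.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Correct the prediction with the measured face position z = (x, y).
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (np.asarray(z, float) - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

for z in [(320, 240), (324, 241), (328, 243)]:  # measured face centers per frame
    x, P = kalman_step(x, P, z)
print("predicted next position:", (F @ x)[:2])
```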

Real Time Gaze Discrimination for Human Computer Interaction (휴먼 컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Park Ho sik;Bae Cheol soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.3C / pp.125-132 / 2005
  • This paper describes a computer vision system based on active IR illumination for a real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.
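
A GRNN is essentially Nadaraya-Watson kernel regression over stored calibration samples: the predicted screen point is a Gaussian-weighted average of the stored screen points. The sketch below shows that mapping from pupil parameters to screen coordinates with made-up training data and bandwidth; it illustrates the regression form only, not the paper's trained model or its head-movement compensation.

```python
# Compact sketch of a GRNN (generalized regression neural network), i.e.
# Nadaraya-Watson kernel regression over stored training samples: the predicted
# screen point is a Gaussian-weighted average of the training screen points.
# The training data and the bandwidth sigma are made up for illustration.
import numpy as np

def grnn_predict(x, train_X, train_Y, sigma=0.5):
    """Predict screen coordinates for a pupil-parameter vector x."""
    d2 = np.sum((train_X - x) ** 2, axis=1)      # squared distances to stored samples
    w = np.exp(-d2 / (2.0 * sigma ** 2))         # Gaussian kernel weights
    return w @ train_Y / np.sum(w)               # weighted average of stored outputs

# Hypothetical training set: pupil-glint parameters -> screen coordinates.
train_X = np.array([[0.1, 0.2], [0.8, 0.2], [0.1, 0.9], [0.8, 0.9]])
train_Y = np.array([[100, 100], [1800, 100], [100, 1000], [1800, 1000]], float)

print(grnn_predict(np.array([0.45, 0.55]), train_X, train_Y))
```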

Real Time Gaze Discrimination for Computer Interface (컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Hwang, Suen-Ki;Kim, Moon-Hwan
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.3 no.1 / pp.38-46 / 2010
  • This paper describes a computer vision system based on active IR illumination for a real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.


Method for Automatic Switching Screen of OST-HMD using Gaze Depth Estimation (시선 깊이 추정 기법을 이용한 OST-HMD 자동 스위칭 방법)

  • Lee, Youngho;Shin, Choonsung
    • Smart Media Journal / v.7 no.1 / pp.31-36 / 2018
  • In this paper, we propose an automatic on/off switching method for the OST-HMD screen using a gaze depth estimation technique. The proposed method uses a multi-layer perceptron (MLP) to learn the relation between the user's gaze information and the distance of the object being looked at, and then estimates that distance from gaze information alone. In the learning phase, eye-related features are obtained using a wearable eye tracker and entered into the MLP for training and model generation. In the inference step, eye-related features obtained from the eye tracker in real time are input to the MLP to obtain the estimated depth value. Finally, we use the result of this calculation to determine whether to turn the display of the HMD on or off. A prototype was implemented and experiments were conducted to evaluate the feasibility of the proposed method.
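
A minimal sketch of the described pipeline, training an MLP regressor on gaze-related features against gaze depth and thresholding the estimate to switch the display, is given below. The feature choice (vergence angle and pupil diameter), the synthetic labels, the network size, and the 1 m threshold are all assumptions for illustration; the paper's actual features and configuration may differ.

```python
# Minimal sketch of the described pipeline: learn a gaze-feature -> depth mapping
# with an MLP, then switch the OST-HMD display on/off by thresholding the
# estimated depth. Features, labels, network size, and the 1.0 m threshold are
# illustrative assumptions, not the paper's actual configuration.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: (vergence angle, pupil diameter) -> gaze depth (m).
vergence = rng.uniform(0.5, 6.0, 500)                       # degrees
pupil = rng.uniform(2.0, 6.0, 500)                          # mm (unused by the label)
X = np.column_stack([vergence, pupil])
depth = 0.032 / np.tan(np.radians(vergence) / 2)            # synthetic depth label (m)
depth += rng.normal(0, 0.05, 500)                           # measurement noise

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
model.fit(X, depth)

def display_on(features, threshold_m=1.0):
    """Turn the HMD screen on only when the estimated gaze depth exceeds the threshold."""
    return float(model.predict(np.atleast_2d(features))[0]) > threshold_m

# Query with a small vergence angle (far gaze) and a large one (near gaze).
print(display_on([1.0, 3.5]), display_on([5.0, 3.5]))
```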