• Title/Summary/Keyword: eye tracking

Resolution Estimation Technique in Gaze Tracking System for HCI (HCI를 위한 시선추적 시스템에서 분해능의 추정기법)

  • Kim, Ki-Bong; Choi, Hyun-Ho
    • Journal of Convergence for Information Technology, v.11 no.1, pp.20-27, 2021
  • Eye tracking is one of the NUI (natural user interface) technologies; it determines where the user is gazing. It allows users to enter text or control a GUI, and analysis of the user's gaze can further be applied to commercial advertising. In an eye tracking system, the allowable range varies with image quality and with how freely the user may move, so a method is needed for estimating the accuracy of eye tracking in advance. That accuracy is strongly affected by how the eye tracking algorithm is implemented, in addition to hardware variables. Accordingly, in this paper we propose a method that estimates how many degrees the gaze direction changes when the pupil center moves by one pixel, by estimating the maximum possible movement distance of the pupil center in the image.
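As a rough illustration of the idea, the angular resolution per pixel can be approximated by dividing the gaze range the eye actually sweeps by the maximum pupil-center displacement observed in the image. The sketch below is a minimal back-of-the-envelope calculation, not the paper's procedure; the gaze range and pixel span are assumed example values.

```python
def gaze_resolution_deg_per_px(gaze_range_deg: float, max_pupil_shift_px: float) -> float:
    """Approximate how many degrees the gaze changes per one-pixel
    movement of the pupil center (assumes a roughly linear mapping)."""
    return gaze_range_deg / max_pupil_shift_px

# Assumed example values: the gaze sweeps ~40 degrees horizontally while the
# pupil center moves across at most ~80 pixels in the camera image.
print(gaze_resolution_deg_per_px(40.0, 80.0))  # -> 0.5 deg of gaze per pixel
```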

Robust Eye Region Discrimination and Eye Tracking to the Environmental Changes (환경변화에 강인한 눈 영역 분리 및 안구 추적에 관한 연구)

  • Kim, Byoung-Kyun; Lee, Wang-Heon
    • Journal of the Korea Institute of Information and Communication Engineering, v.18 no.5, pp.1171-1176, 2014
  • Eye tracking (ET) is used in human-computer interaction (HCI) to analyze eye-movement behavior and to find the gaze direction by tracking the pupil's movement on a human face. ET is now widely used not only for market analysis that exploits pupil tracking but also for grasping user intention, and it has been studied extensively. Although vision-based ET is convenient from an application point of view, it is not robust to environmental changes such as illumination, geometric rotation, occlusion, and scale changes. This paper proposes a two-step ET method: first, face and eye regions are discriminated by a Haar classifier, and then the pupils in the discriminated eye regions are tracked by CAMShift combined with template matching. We demonstrate the usefulness of the proposed algorithm through extensive real-world experiments under changing illumination, rotation, and scale.
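The detect-then-track split described above maps naturally onto standard OpenCV building blocks. The sketch below is a minimal illustration assuming a webcam and the stock Haar cascade files shipped with OpenCV; it is not the authors' implementation and omits the template-matching refinement.

```python
import cv2
import numpy as np

# Step 1 components: Haar cascades for face and eye region discrimination.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)
track_window, roi_hist = None, None
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    if track_window is None:
        # Discriminate the face, then an eye region inside it.
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        for (fx, fy, fw, fh) in faces[:1]:
            eyes = eye_cascade.detectMultiScale(gray[fy:fy + fh, fx:fx + fw])
            if len(eyes):
                ex, ey, ew, eh = eyes[0]
                track_window = (fx + ex, fy + ey, ew, eh)
                roi = hsv[fy + ey:fy + ey + eh, fx + ex:fx + ex + ew]
                roi_hist = cv2.calcHist([roi], [0], None, [180], [0, 180])
                cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
    else:
        # Step 2: track the eye region with CAMShift on the hue back-projection.
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
        pts = np.int32(cv2.boxPoints(rot_rect))
        cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
    cv2.imshow("eye tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```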

Comparison of Eye Movement and Fit Rating Criteria in Judging Pants Fit Between Experts and Novices - Using Eye Tracking Technology - (바지 맞음새 평가 시 전문가와 초보자의 시선추적 및 맞음새 평가 항목의 중요도 비교분석 - Eye Tracking 기법을 이용하여 -)

  • Kim, Youngsook; Song, Hwa Kyung; Jang, Hyowoong
    • Fashion & Textile Research Journal, v.19 no.2, pp.230-239, 2017
  • In the clothing industry there is a lack of experts, including technical designers, who can analyze the fit of garments. This study provides practical data for fit-analysis education by distinguishing, through eye-tracking technology that quantifies the sense of fit, how experts and novices differ in their standards for garment fit and in the aspects they attend to. Two groups were organized: one of 7 experts with over 15 years of experience, including technical designers and patternmakers, and one of 7 novices who were students majoring in clothing. Wearing the goggle-type eye tracker Tobii Pro Glasses 2, the participants conducted fit analyses of a pair of pants on a live model. Afterwards they checked the items for fit analysis and rated their importance on a questionnaire. The differences between the two groups in the ratios of total visit count and total visit duration for each AOI (Area of Interest) on the garment were analyzed with non-parametric statistical tests. The eye-tracking results showed that experts focused on the center front and back lines, the crotch area, and the side seams, whereas the novices' fixation points were dispersed across the pants. The survey results showed that the experts placed importance on the center-line position and its verticality, the front-back proportion of the side seam line, and the front-back proportion of the waistline: 71.4-100% of the experts checked these items, compared with 14.3% of the novices.
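The group comparison described above is a standard two-independent-samples, non-parametric setup. A minimal sketch using SciPy's Mann-Whitney U test is shown below; the numbers are made-up AOI visit-duration ratios for illustration, not data from the study.

```python
from scipy.stats import mannwhitneyu

# Hypothetical total-visit-duration ratios (fraction of viewing time spent
# on the crotch-area AOI) for 7 experts and 7 novices.
experts = [0.31, 0.28, 0.35, 0.30, 0.27, 0.33, 0.29]
novices = [0.12, 0.18, 0.15, 0.22, 0.10, 0.19, 0.14]

stat, p = mannwhitneyu(experts, novices, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")  # small p -> groups differ on this AOI
```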

Technology Development for Non-Contact Interface of Multi-Region Classifier based on Context-Aware (상황 인식 기반 다중 영역 분류기 비접촉 인터페이스기술 개발)

  • Jin, Songguo; Rhee, Phill-Kyu
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.20 no.6, pp.175-182, 2020
  • Non-contact eye tracking is a non-intrusive human-computer interface that provides hands-free communication for people with severe disabilities, and it is expected to play an important role in non-contact systems prompted by the recent COVID-19 coronavirus outbreak. This paper proposes a novel approach to an eye mouse that uses an eye tracking method based on a context-aware AdaBoost multi-region classifier and an ASSL (active and semi-supervised learning) algorithm. The conventional AdaBoost algorithm cannot provide sufficiently reliable face-tracking performance for eye-cursor pointing estimation because it cannot exploit the spatial context relations among facial features. We therefore propose an eye-region-context-based AdaBoost multiple classifier for efficient non-contact gaze tracking and mouse control. The proposed method detects, tracks, and aggregates various eye features to estimate the gaze, and adjusts active and semi-supervised learning based on the on-screen cursor. The system has been successfully employed for eye localization and can also be used to detect and track eye features. It moves the computer cursor along the user's gaze, and the output is post-processed with a Kalman filter and Gaussian modeling to prevent shaking during real-time tracking. Target objects were generated at random positions and the eye tracking performance was analyzed in real time according to Fitts' law. Non-contact interfaces such as the one proposed are expected to be widely utilized.
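The abstract mentions Kalman-filter post-processing to keep the gaze-driven cursor from jittering. Below is a minimal constant-velocity Kalman filter sketch using OpenCV, assuming the gaze estimator already yields raw (x, y) cursor measurements; it illustrates only the smoothing step, not the AdaBoost multi-region classifier itself.

```python
import numpy as np
import cv2

# Constant-velocity model: state = [x, y, vx, vy], measurement = [x, y].
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 5.0  # distrust noisy gaze samples

def smooth_cursor(raw_xy):
    """Feed one noisy gaze-derived cursor position, get a smoothed one back."""
    kf.predict()
    est = kf.correct(np.array(raw_xy, dtype=np.float32).reshape(2, 1))
    return float(est[0, 0]), float(est[1, 0])

# Hypothetical noisy gaze samples drifting to the right.
for raw in [(100, 200), (108, 197), (99, 205), (115, 201)]:
    print(smooth_cursor(raw))
```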

Research on Virtual Simulator Sickness Using Field of View Restrictor According to Human Factor levels (FOV Restrictor를 활용한 가상 멀미 저감 요소 기술연구)

  • Kim, Chang-seop; Kim, So-Yeon; Kim, Kwanguk
    • Journal of the Korea Computer Graphics Society, v.24 no.3, pp.49-59, 2018
  • Simulator sickness is one of the important side effects of virtual reality. It is influenced by various factors, one of which is the field of view (FOV), the viewing angle limited by the screen: reducing the FOV reduces simulator sickness but also lowers presence. A previous study developed a Dynamic FOV Restrictor (Center-fixed FOV Restrictor) to reduce simulator sickness while maintaining presence; it limits the FOV dynamically according to the speed and angular velocity of the avatar. We also developed an Eye-tracking Based Dynamic FOV Restrictor (Eye-tracking FOV Restrictor) that additionally takes head rotation and eye movements into account. This study compares simulator sickness and presence across three conditions: no FOV restrictor, Center-fixed FOV Restrictor, and Eye-tracking FOV Restrictor. The results showed that simulator sickness in the Center-fixed FOV Restrictor condition was significantly lower than in the other two conditions, while there were no significant differences in presence among the three conditions. The interpretations and limitations of this study are discussed in the paper.
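The abstract describes the restrictor as limiting the FOV dynamically from the avatar's linear and angular velocity. A minimal sketch of one plausible mapping is given below; the gain constants and FOV bounds are assumptions for illustration, not the values used in the study.

```python
def restricted_fov(speed: float, angular_speed: float,
                   max_fov: float = 110.0, min_fov: float = 60.0,
                   k_lin: float = 5.0, k_ang: float = 0.3) -> float:
    """Shrink the field of view (degrees) as the avatar moves or turns faster.

    speed: linear speed in m/s; angular_speed: turn rate in deg/s.
    Faster motion narrows the visible cone, clamped to [min_fov, max_fov]."""
    fov = max_fov - k_lin * speed - k_ang * angular_speed
    return max(min_fov, min(max_fov, fov))

print(restricted_fov(0.0, 0.0))    # standing still -> full 110 deg
print(restricted_fov(4.0, 90.0))   # running and turning -> narrowed to 63 deg
```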

A Study on Eye Tracking Techniques using Wearable Devices (웨어러블향(向) 시선추적 기법에 관한 연구)

  • Jaehyuck Jang; Jiu Jung; Junghoon Park
    • Smart Media Journal, v.12 no.3, pp.19-29, 2023
  • Eye tracking technology is now in widespread use and demonstrates strong performance in both precision and convenience, opening up the possibility of operating an interface without touching a screen. It can become a new means of communication for people such as patients with Lou Gehrig's disease, whose bodies become paralyzed part by part until, in the end, only the eyes can move. Formerly such patients could do little but wait for death, unable even to communicate with their families. An interface that harnesses the eyes as a means of communication, however difficult to build, can help them. Eye tracking systems and equipment for such dedicated uses do exist on the market, but obstacles such as operational complexity and prices above 12 million won (about $9,300) hinder universal supply and coverage for these patients. This paper therefore proposes a wearable eye tracking device that can support minorities and vulnerable people at low cost, studies eye tracking methods in order to maximize the potential for future development, and presents a way to design and develop a reduced-cost eye tracking system based on an efficient wearable device.

Real Time Eye and Gaze Tracking (트래킹 Gaze와 실시간 Eye)

  • Min Jin-Kyoung; Cho Hyeon-Seob
    • Proceedings of the KAIS Fall Conference, 2004.11a, pp.234-239, 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system, based on active IR illumination, for real-time gaze tracking for interactive graphic display. Unlike most existing gaze tracking techniques, which often assume a static head and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without per-user calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to individuals not used in training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments involving gaze-contingent interactive graphic display.
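A GRNN is essentially kernel-weighted regression over stored training exemplars, which is why the pupil-to-screen mapping need not be an analytical function. The sketch below is a minimal NumPy version of that idea with made-up pupil-parameter vectors and screen targets; it is not the authors' trained network.

```python
import numpy as np

def grnn_predict(x, train_x, train_y, sigma=0.5):
    """Generalized Regression Neural Network prediction: a Gaussian-kernel
    weighted average of the stored training outputs (Nadaraya-Watson form)."""
    d2 = np.sum((train_x - x) ** 2, axis=1)   # squared distances to exemplars
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # pattern-layer activations
    return (w @ train_y) / np.sum(w)          # weighted average of screen points

# Hypothetical training set: pupil parameters (e.g. normalized pupil-glint vector)
# paired with the screen coordinates gazed at during the calibration procedure.
train_x = np.array([[0.1, 0.2], [0.4, 0.1], [0.8, 0.3], [0.5, 0.7]])
train_y = np.array([[100, 150], [600, 120], [1200, 200], [700, 800]], dtype=float)

print(grnn_predict(np.array([0.45, 0.15]), train_x, train_y))  # interpolated gaze point
```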

A Study on the Differences in Cognition of Design Associated with Changes in Fashion Model Type - Exploratory Analysis Using Eye Tracking - (패션 모델 유형 변화에 따른 디자인 인지 차이에 관한 연구 - 시선추적을 활용한 탐색적 분석 -)

  • Lee, Shin-Young
    • Fashion & Textile Research Journal, v.20 no.2, pp.167-176, 2018
  • In this study, an eye-tracking program that can trace the design cognition process was developed with the aim of presenting strategic methods for creating fashion images, and the program was used to identify what effects fashion models' external characteristics have on the cognition of design. The data were collected through an eye-movement tracking experiment and a survey, focused on the research question of whether differences in models' external uniformity lead to differences in the eye movements used to perceive models and designs as well as in image sensibility. The results of the analysis are as follows. First, the uniformity of model types and the simplicity or complexity of the design led to differences in the eye movements directed at the design and the models and in the gaze ratio; consequently, models should be selected in consideration of the characteristics of the design and the intent of the plan when creating fashion images. Second, in terms of design cognition, the external conditions of the models affected design sensibility: a change of model led to a subtle difference in sensibility cognition even when the design did not change. Thus not only the design but also the model's attributes are factors that should be weighted heavily in fashion planning.

Eye Tracking Research on Cinemagraph e-Magazine

  • Park, Ji Seob; Bae, Jin Hwa; Cho, Kwang Su
    • Agribusiness and Information Management, v.7 no.2, pp.1-11, 2015
  • This study compared eye-tracking behavior on an e-magazine produced with cinemagraph images against the same e-magazine produced with regular images, using the eye-tracking indicators Time To First Fixation, Fixation Duration, Fixation Count, and Total Visit Duration to measure visual attention. The experimental material was a nine-page e-magazine, and AOIs (areas of interest) were set up on each page by classifying image and text regions. Thirty people took part, randomly assigned 15 to the experiment group (cinemagraph version) and 15 to the control group (regular-image version). According to the results, the experiment group recorded a shorter Time To First Fixation than the control group on the cinemagraph e-magazine. No significant difference was found between the experiment and control groups in Fixation Duration, but substantial differences did appear in Fixation Count and Total Visit Duration.
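The four indicators named above can all be derived from a time-stamped fixation log labelled by AOI. The sketch below shows one simple way to compute them in plain Python; the record layout and sample values are assumptions, not the study's data, and Total Visit Duration is approximated here by summing fixation durations per AOI.

```python
from collections import defaultdict

# Hypothetical fixation log: (start_ms, duration_ms, aoi) for one participant, one page.
fixations = [
    (120, 260, "image"),
    (400, 180, "text"),
    (610, 340, "image"),
    (980, 150, "text"),
]

metrics = defaultdict(lambda: {"ttff": None, "count": 0, "total_duration": 0})
for start, dur, aoi in fixations:
    m = metrics[aoi]
    if m["ttff"] is None:          # Time To First Fixation on this AOI
        m["ttff"] = start
    m["count"] += 1                # Fixation Count
    m["total_duration"] += dur     # summed dwell time (Total Visit Duration proxy)

for aoi, m in metrics.items():
    print(aoi, m)
```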

Real-Time Eye Detection and Tracking Under Various Light Conditions (다양한 조명하에서 실시간 눈 검출 및 추적)

  • 박호식; 배철수
    • Journal of the Korea Institute of Information and Communication Engineering, v.8 no.2, pp.456-463, 2004
  • Non-intrusive methods based on active remote IR illumination for eye tracking are important for many applications of vision-based man-machine interaction. One problem that has plagued these methods is their sensitivity to changes in lighting conditions, which tends to significantly limit their scope of application. In this paper, we present a new real-time eye detection and tracking method that works under variable and realistic lighting conditions. By combining the bright-pupil effect produced by IR light with a conventional appearance-based object recognition technique, our method can robustly track eyes even when the pupils are not very bright due to significant external illumination interference. The appearance model is incorporated in both eye detection and eye tracking through the use of a support vector machine and mean shift tracking. Additional improvement is achieved by modifying the image acquisition apparatus, including the illuminator and the camera.
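The appearance model mentioned above pairs a support vector machine, used to verify that a candidate region really is an eye, with mean shift to follow it between frames. A minimal sketch of the SVM verification half is shown below, assuming eye and non-eye patches have already been cropped and resized to a fixed size; it is an illustration under those assumptions, not the paper's trained model.

```python
import numpy as np
from sklearn.svm import SVC

PATCH = (20, 20)  # assumed fixed size for candidate eye patches

def to_feature(patch: np.ndarray) -> np.ndarray:
    """Flatten a grayscale patch and normalize it to reduce lighting sensitivity."""
    v = patch.astype(np.float32).ravel()
    return (v - v.mean()) / (v.std() + 1e-6)

# Hypothetical training data: stacks of cropped eye and non-eye patches
# (random arrays here stand in for real crops from labelled frames).
rng = np.random.default_rng(0)
eye_patches = rng.random((50, *PATCH))
non_eye_patches = rng.random((50, *PATCH))

X = np.array([to_feature(p) for p in list(eye_patches) + list(non_eye_patches)])
y = np.array([1] * len(eye_patches) + [0] * len(non_eye_patches))

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)

# At run time: verify a bright-pupil candidate before handing it to the tracker.
candidate = rng.random(PATCH)
is_eye = clf.predict([to_feature(candidate)])[0] == 1
print("accept candidate as eye:", is_eye)
```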