• Title/Abstract/Keyword: eye-tracking system

Search results: 172

Development of Real-Time Vision-based Eye-tracker System for Head Mounted Display (영상정보를 이용한 HMD용 실시간 아이트랙커 시스템)

  • Roh, Eun-Jung;Hong, Jin-Sung;Bang, Hyo-Choong
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.35 no.6 / pp.539-547 / 2007
  • In this paper, the development and testing of a real-time eye-tracker system are discussed. The system tracks a user's gaze point through eye movement by means of vision-based pupil detection, which has the advantage of locating the user's eyes precisely. An infrared camera and an LED are used to acquire the user's pupil image and to extract the pupil region, which is difficult to isolate with software alone. We develop a pupil-tracking algorithm based on a Kalman filter and grab the pupil images with a DSP (Digital Signal Processing) system for real-time image processing. The resulting eye-tracker follows the movements of the user's pupils and projects the gaze point onto a background image.
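
  The Kalman-filter pupil tracking mentioned in this abstract can be sketched as a constant-velocity filter over the detected pupil center. This is only an illustrative sketch, not the authors' DSP implementation; the state model and the noise parameters `q` and `r` are assumed values.

  ```python
  import numpy as np

  class PupilKalman:
      """Constant-velocity Kalman filter for a 2-D pupil center (sketch)."""
      def __init__(self, x0, y0, dt=1.0, q=1e-2, r=1.0):
          self.x = np.array([x0, y0, 0.0, 0.0])     # state: [x, y, vx, vy]
          self.P = np.eye(4)                        # state covariance
          self.F = np.array([[1, 0, dt, 0],
                             [0, 1, 0, dt],
                             [0, 0, 1,  0],
                             [0, 0, 0,  1]], float) # constant-velocity model
          self.H = np.array([[1, 0, 0, 0],
                             [0, 1, 0, 0]], float)  # only (x, y) is observed
          self.Q = q * np.eye(4)                    # process noise
          self.R = r * np.eye(2)                    # measurement noise

      def step(self, z):
          # Predict forward one frame.
          self.x = self.F @ self.x
          self.P = self.F @ self.P @ self.F.T + self.Q
          # Correct with the detected pupil position z = (x, y).
          y = np.asarray(z, float) - self.H @ self.x
          S = self.H @ self.P @ self.H.T + self.R
          K = self.P @ self.H.T @ np.linalg.inv(S)
          self.x = self.x + K @ y
          self.P = (np.eye(4) - K @ self.H) @ self.P
          return self.x[:2]                         # smoothed pupil center

  kf = PupilKalman(100.0, 120.0)
  for z in [(101, 121), (103, 122), (106, 124)]:
      est = kf.step(z)
  ```

  In a real pipeline the measurement `z` would come from infrared-image pupil detection each frame; the filter smooths jitter and bridges short detection drop-outs.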

A Computer Access System for the Physically Disabled Using Eye-Tracking and Speech Recognition (아이트래킹 및 음성인식 기술을 활용한 지체장애인 컴퓨터 접근 시스템)

  • Kwak, Seongeun;Kim, Isaac;Sim, Debora;Lee, Seung Hwan;Hwang, Sung Soo
    • Journal of the HCI Society of Korea / v.12 no.4 / pp.5-15 / 2017
  • Alternative computer access devices are one way for the physically disabled to meet their desire to participate in social activities. Most such devices provide access to computers through the user's feet or head, but for people with physical disabilities it is not easy to control a mouse this way. In this paper, we propose a computer access system for the physically disabled. The proposed system moves the mouse using only the user's gaze, via eye-tracking technology. The mouse is clicked through an external button that is relatively easy to press, and text can be entered easily and quickly through speech recognition. The system also provides detailed functions such as right-click, double-click, dragging, an on-screen keyboard, internet access, and scrolling.
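
  The gaze-mouse-plus-button control loop described here can be sketched as a small state machine: gaze samples move the cursor (smoothed to reduce jitter), and the external button fires a click at the current position. The class and callback names are hypothetical; a real system would receive samples from an eye tracker and a hardware button driver.

  ```python
  class GazeMouse:
      """Sketch of a gaze-driven mouse with an external click button."""
      def __init__(self, smooth_n=5):
          self.history = []          # recent raw gaze samples
          self.smooth_n = smooth_n   # moving-average window length
          self.actions = []          # emitted mouse actions

      def on_gaze(self, x, y):
          # Smooth raw gaze with a short moving average to reduce jitter.
          self.history.append((x, y))
          self.history = self.history[-self.smooth_n:]
          sx = sum(p[0] for p in self.history) / len(self.history)
          sy = sum(p[1] for p in self.history) / len(self.history)
          self.actions.append(("move", round(sx), round(sy)))

      def on_button(self, double=False):
          # The external button is easier to press than a gaze gesture;
          # it clicks at the current (smoothed) cursor position.
          kind = "double_click" if double else "click"
          _, x, y = self.actions[-1]
          self.actions.append((kind, x, y))

  m = GazeMouse()
  for x, y in [(400, 300), (404, 302), (398, 299)]:
      m.on_gaze(x, y)
  m.on_button()
  ```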

Tracking of eyes based on the iterated spatial moment using weighted gray level (명암 가중치를 이용한 반복 수렴 공간 모멘트기반 눈동자의 시선 추적)

  • Choi, Woo-Sung;Lee, Kyu-Won
    • Journal of the Korea Institute of Information and Communication Engineering / v.14 no.5 / pp.1240-1250 / 2010
  • In this paper, an eye-tracking method is presented that uses an iterated spatial moment with gray-level weighting to accurately detect and track the user's eyes against a complicated background. The face region is detected using Haar-like features before extracting the eye regions, to minimize the region of interest in the input image from a CCD camera. The eye regions are then detected using an eigen-eye approach based on the eigenfaces of principal component analysis, and the eye feature points are located at the darkest parts of the eye regions. The eyes are tracked correctly by the iterated spatial moment with gray-level weighting.
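
  The iterated spatial-moment step can be sketched as follows: within a window around the current estimate, the gray-level-weighted centroid (first moment divided by zeroth moment) becomes the new estimate, and this is repeated until convergence. Dark pupil pixels are weighted by inverted intensity. The window size and iteration limits below are assumed values, not those from the paper.

  ```python
  import numpy as np

  def track_pupil(gray, start, half=5, iters=10, tol=0.5):
      """Iterated weighted spatial moment around `start` (row, col)."""
      cy, cx = start
      h, w = gray.shape
      weight = gray.max() - gray.astype(float)   # darker pixels weigh more
      for _ in range(iters):
          y0, y1 = max(0, int(cy) - half), min(h, int(cy) + half + 1)
          x0, x1 = max(0, int(cx) - half), min(w, int(cx) + half + 1)
          win = weight[y0:y1, x0:x1]
          m00 = win.sum()                        # zeroth moment
          if m00 == 0:
              break
          ys, xs = np.mgrid[y0:y1, x0:x1]
          ny = (ys * win).sum() / m00            # weighted first moments
          nx = (xs * win).sum() / m00
          done = abs(ny - cy) < tol and abs(nx - cx) < tol
          cy, cx = ny, nx
          if done:
              break
      return cy, cx

  # Synthetic frame: bright background with a dark "pupil" blob at (20, 30).
  img = np.full((40, 60), 200, np.uint8)
  img[18:23, 28:33] = 20
  center = track_pupil(img, start=(16, 26))
  ```

  Even when the starting estimate is a few pixels off, the window's weighted centroid converges onto the dark blob within a couple of iterations.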

Tracking of eyes based on the spatial moment using weighted gray level (명암 가중치를 이용한 공간 모멘트기반 눈동자 추적)

  • Choi, Woo-Sung;Lee, Kyu-Won;Kim, Kwan-Seop
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2009.10a / pp.198-201 / 2009
  • In this paper, an eye-tracking method is presented that uses an iterated spatial moment with gray-level weighting to accurately detect and track the user's eyes against a complicated background. The face region is detected using Haar-like features before extracting the eye regions, to minimize the region of interest in the input image from a CCD camera. The eye regions are then detected using an eigen-eye approach based on the eigenfaces of principal component analysis, and the eye feature points are detected at the darkest parts of the eye regions. The eyes are tracked correctly by the iterated spatial moment with gray-level weighting.


Analysis of Eye Movement by the Science Achievement Level of the Elementary Students on Observation Test (관찰 문제에서 초등학생의 과학 학업성취도에 따른 안구운동 분석)

  • Shin, Won-Sub;Shin, Donghoon
    • Journal of Korean Elementary Science Education / v.32 no.2 / pp.185-197 / 2013
  • The purpose of this study was to analyze differences in eye movements according to the science achievement of elementary school students in an observation situation. Science achievement was based on the results of the national achievement test conducted in 2012, with classes sampled randomly. As the assessment tool for the observation test, two observation problems from the TSPS (Test of Science Process Skills, developed in 1994) suitable for an eye-tracking system were adopted. The subjects of this study were twenty sixth-grade students who agreed to participate in the research. SMI (SensoMotoric Instruments) iView X™ RED was used to collect eye-movement data, and the Experiment 3.1 and BeGaze 3.1 programs were used to design and analyze the experiment. As a result, eye movements in the observation test varied greatly in fixation duration, fixation frequency, saccade, saccade velocity, and eye blink according to the students' science achievement. Based on these results, heuristic-search eye movement was discussed as a way to improve underachievers' science achievement.
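
  Fixation metrics like those reported above (fixation duration, frequency) are typically derived from raw gaze samples; a simple dispersion-threshold (I-DT) scheme is sketched below. The thresholds are assumed illustrative values, not those used by the BeGaze software in the study.

  ```python
  def detect_fixations(samples, max_disp=25.0, min_dur=100.0):
      """samples: list of (t_ms, x, y); returns list of (start, end) fixations."""
      fixations, window = [], []
      for s in samples:
          window.append(s)
          xs = [p[1] for p in window]
          ys = [p[2] for p in window]
          dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
          if dispersion > max_disp:
              # Gaze moved too far: close any fixation that lasted long enough.
              if window[-2][0] - window[0][0] >= min_dur:
                  fixations.append((window[0][0], window[-2][0]))
                  window = [s]
              else:
                  window.pop(0)
      # Flush a trailing fixation at the end of the recording.
      if len(window) > 1 and window[-1][0] - window[0][0] >= min_dur:
          fixations.append((window[0][0], window[-1][0]))
      return fixations

  # 0-200 ms: stable gaze; a large saccade at 220 ms; then a second fixation.
  data = [(t, 100, 100) for t in range(0, 201, 20)]
  data += [(t, 400, 300) for t in range(220, 401, 20)]
  fixes = detect_fixations(data)
  ```

  Fixation count, mean fixation duration, and saccade amplitude then fall out directly from the detected intervals and the jumps between them.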

Dynamic tracking control of robot manipulators using vision system (비전 시스템을 이용한 로봇 머니퓰레이터의 동력학 추적 제어)

  • 한웅기;국태용
    • Proceedings of the Institute of Control, Robotics and Systems Conference (제어로봇시스템학회 학술대회논문집) / 1997.10a / pp.1816-1819 / 1997
  • Using a vision system, robotic tasks can be accomplished in unstructured environments, which greatly reduces the cost and setup time needed to fit the robotic system to well-defined, structured working environments. This paper proposes a dynamic control scheme for a robot manipulator with an eye-in-hand camera configuration. To perform tasks defined in the image plane, the camera motion Jacobian (image Jacobian) matrix is used to transform camera motion into object position change. In addition, a dynamic learning controller is designed to improve the tracking performance of the robotic system. The proposed control scheme is implemented for tasks of tracking moving objects and is shown to outperform the conventional visual servo system in convergence and in robustness to parameter uncertainty, disturbances, low sampling rate, etc.

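
  The image-Jacobian step this abstract refers to can be sketched with the classic interaction matrix of a normalized image point: the camera twist is computed from the image-plane error. This is a generic image-based visual-servo sketch, not the paper's learning controller; the gain and the depth estimate `Z` are assumed values.

  ```python
  import numpy as np

  def interaction_matrix(x, y, Z):
      # Interaction (image Jacobian) matrix of a normalized point (x, y)
      # at depth Z, relating the 6-DOF camera twist to the image velocity.
      return np.array([
          [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
          [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
      ])

  def visual_servo_step(s, s_star, Z, lam=0.5):
      # v = -lambda * pinv(L) * (s - s_star): camera twist reducing the error.
      L = interaction_matrix(s[0], s[1], Z)
      e = np.asarray(s, float) - np.asarray(s_star, float)
      return -lam * np.linalg.pinv(L) @ e

  # Feature observed at (0.1, -0.05), desired at the image center.
  v = visual_servo_step(s=(0.1, -0.05), s_star=(0.0, 0.0), Z=1.0)
  ```

  Because the matrix has full row rank, the resulting image velocity `L @ v` equals `-lam * e`, i.e. the feature error decays exponentially toward zero under this law.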

Development of a Non-contact Input System Based on User's Gaze-Tracking and Analysis of Input Factors

  • Jiyoung LIM;Seonjae LEE;Junbeom KIM;Yunseo KIM;Hae-Duck Joshua JEONG
    • Korean Journal of Artificial Intelligence / v.11 no.1 / pp.9-15 / 2023
  • As mobile devices such as smartphones, tablets, and kiosks become increasingly prevalent, there is growing interest in developing alternative input systems in addition to traditional tools such as keyboards and mice. Many people use their own bodies as a pointer to enter simple information on a mobile device. However, methods using the body have limitations: psychological factors make the contact method unappealing, especially during a pandemic, and it carries a risk of shoulder-surfing attacks. To overcome these limitations, we propose a simple information input system that utilizes gaze-tracking technology to input passwords and control web surfing using only non-contact gaze. Our proposed system is designed to recognize information input when the user stares at a specific location on the screen in real time, using intelligent gaze-tracking technology. We present an analysis of the relationship between the gaze input box, gaze time, and average input time, and report experimental results on the effects of varying the size of the gaze input box and the gaze time required to achieve 100% accuracy in inputting information. Through this paper, we demonstrate the effectiveness of our system in mitigating the challenges of contact-based input methods and in providing a non-contact alternative that is both secure and convenient.
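
  The gaze-input-box mechanism described above is essentially dwell-based selection: a box fires once the gaze has stayed inside it for a fixed gaze time. A minimal sketch follows; the box geometry and the 800 ms dwell threshold are illustrative assumptions, not the paper's measured parameters.

  ```python
  class GazeBox:
      """A screen region selected by dwelling the gaze inside it."""
      def __init__(self, x, y, w, h, dwell_ms=800):
          self.rect = (x, y, w, h)
          self.dwell_ms = dwell_ms
          self.enter_t = None        # timestamp when the gaze entered the box

      def contains(self, gx, gy):
          x, y, w, h = self.rect
          return x <= gx < x + w and y <= gy < y + h

      def update(self, t_ms, gx, gy):
          """Feed one gaze sample; returns True when the box is selected."""
          if not self.contains(gx, gy):
              self.enter_t = None    # gaze left the box: reset the timer
              return False
          if self.enter_t is None:
              self.enter_t = t_ms    # gaze just entered the box
          if t_ms - self.enter_t >= self.dwell_ms:
              self.enter_t = None    # fire once, then re-arm
              return True
          return False

  box = GazeBox(100, 100, 200, 120)
  events = [box.update(t, 150, 160) for t in range(0, 1001, 100)]
  ```

  The trade-off the paper analyzes shows up directly here: a larger box or longer dwell time raises accuracy but slows the average input time.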

Human Spatial Cognition Using Visual and Auditory Stimulation

  • Yu, Mi;Piao, Yong-Jun;Kim, Yong-Yook;Kwon, Tae-Kyu;Hong, Chul-Un;Kim, Nam-Gyun
    • International Journal of Precision Engineering and Manufacturing / v.7 no.2 / pp.41-45 / 2006
  • This paper deals with human spatial cognition using visual and auditory stimulation. More specifically, this investigation observes the relationship between the head and eye motor systems in localizing the direction of a visual target in space and tries to describe the respective roles of the right-side and left-side pinna. In the visual-stimulation experiment, nineteen red LEDs (light-emitting diodes; brightness: 210 cd/m²) arrayed in the horizontal plane of the surrounding panel were used, with the LEDs located 10 degrees apart from each other. Physiological parameters such as EOG (electro-oculography), head movement, and their synergic control were measured by a BIOPAC system and 3SPACE FASTRAK. In the auditory-stimulation experiment, the function of one pinna was distorted intentionally by inserting a short tube into the ear canal, and the localization error caused by right- and left-side pinna distortion was investigated. Since a laser pointer showed much less error (0.5%) in localizing the target position than the FASTRAK (30%) that has generally been used, a laser pointer was used for the pointing task. It was found that harmonic components were not essential for auditory target localization; however, non-harmonic nearby frequency components were found to be more important in localizing the direction of a sound. We found that the right pinna carries out one of the most important functions in localizing target direction, and that a pure tone with only one frequency component is difficult to localize. It was also found that latency time is shorter in self-moved tracking (SMT) than in eye-alone tracking (EAT) and eye-hand tracking (EHT). These results can be used in further study on the characterization of human spatial cognition.

Investigation of the visual search patterns of the cockpit displays for the ergonomic cockpit design (인간공학적 조종실 설계를 위한 계기 탐색 형태에 관한 연구)

  • Song Young-Woong;Lee Jong-Seon
    • Journal of the Korea Safety Management & Science / v.8 no.2 / pp.71-80 / 2006
  • There are many display panels in a flight cockpit, and pilots get various flight information from those displays. The ergonomic layout of the displays must be determined based on frequency of use and sequence of use. This study investigated the visual search patterns over six display groups (one head-up display: HUD; two multi-function displays: MFDs; one engine group: EG; one flight display group: FD; and others) in a fighter aircraft. Four expert pilots conducted a simulated flight in a physical mock-up, and their eye movements were collected using an eye-tracking system. Data on dwell time, frequency of use, and eye-movement path were collected. Pilots spent most of their time on the HUD (55.2%), followed in descending order by others (21.6%), FD (14.2%), right MFD (4.7%), EG (3.2%), and left MFD (1.1%). Similarly, the HUD (42.8%) and others (30.0%) were the most frequently visited displays. These data can be used in the layout of cockpit displays and in determining the optimal visual search pattern.
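
  Dwell-time percentages of the kind reported above are computed by summing fixation time per area of interest and dividing by the total. A minimal sketch with invented sample data:

  ```python
  def dwell_share(fixations):
      """fixations: list of (area, duration_ms) -> {area: share in %}."""
      total = sum(d for _, d in fixations)
      shares = {}
      for area, d in fixations:
          shares[area] = shares.get(area, 0) + d
      # Normalize each area's accumulated dwell time to a percentage.
      return {a: round(100.0 * t / total, 1) for a, t in shares.items()}

  # Invented fixation log: (display area, fixation duration in ms).
  shares = dwell_share([("HUD", 550), ("FD", 140), ("HUD", 160), ("MFD_R", 150)])
  ```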

Method for Inference of Operators' Thoughts from Eye Movement Data in Nuclear Power Plants

  • Ha, Jun Su;Byon, Young-Ji;Baek, Joonsang;Seong, Poong Hyun
    • Nuclear Engineering and Technology / v.48 no.1 / pp.129-143 / 2016
  • Sometimes we need, or try, to figure out somebody's thoughts from behaviors such as eye movement, facial expression, gestures, and motions. In safety-critical and complex systems such as nuclear power plants, inferring operators' thoughts (their understanding or diagnosis of the current situation) could provide many opportunities for useful applications, such as improved operator training programs, a new type of operator support system, and human-performance measures for human-factors validation. In this experimental study, a novel method for inferring an operator's thoughts from eye-movement data is proposed and evaluated with a nuclear power plant simulator. In the experiments, about 80% of operators' thoughts were inferred correctly using the proposed method.