• Title/Abstract/Keyword: Eye Tracking system

영상정보를 이용한 HMD용 실시간 아이트랙커 시스템 (Development of Real-Time Vision-based Eye-tracker System for Head Mounted Display)

  • 노은정;홍진성;방효충
    • 한국항공우주학회지 / Vol. 35, No. 6 / pp.539-547 / 2007
  • This paper presents the development of a real-time eye-tracker system that uses image information to track the user's gaze point from the movement of the eyes. The developed system tracks eye movement with an optics-based pupil-tracking technique, which has the advantage of measuring eye position very accurately without causing any discomfort to the user's eyes. An infrared camera is used to acquire pupil images, and infrared LEDs are used to extract the exact pupil region from the acquired images. To make real-time image processing possible, a pupil-tracking algorithm based on a Kalman filter is developed, and the pupil images are acquired with a DSP (Digital Signal Processing) system. The real-time eye-tracker system tracks the user's pupil movement in real time and marks the user's gaze point on the background scene the user is viewing.
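
The abstract does not give the filter's exact design; the following is a minimal sketch of the kind of constant-velocity Kalman filter commonly used for pupil-centre tracking, with the frame rate and noise levels assumed for illustration (NumPy):

```python
import numpy as np

class PupilKalman:
    """Constant-velocity Kalman filter for a 2-D pupil centre.
    State [x, y, vx, vy]; measurement [x, y]. Noise levels are assumed."""
    def __init__(self, dt=1.0 / 30.0):
        self.F = np.eye(4)                       # motion model
        self.F[0, 2] = self.F[1, 3] = dt         # position += velocity * dt
        self.H = np.zeros((2, 4))                # measurement model
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = 1e-3 * np.eye(4)                # process noise (assumed)
        self.R = 2.0 * np.eye(2)                 # pixel noise (assumed)
        self.x = np.zeros(4)
        self.P = 100.0 * np.eye(4)               # large initial uncertainty

    def step(self, z):
        # predict with the motion model
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # correct with the measured pupil centre z = [x, y]
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                        # filtered pupil centre
```

The prediction step is what allows the system to keep a gaze estimate between frames and reject single-frame detection noise.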

아이트래킹 및 음성인식 기술을 활용한 지체장애인 컴퓨터 접근 시스템 (A Computer Access System for the Physically Disabled Using Eye-Tracking and Speech Recognition)

  • 곽성은;김이삭;심드보라;이승환;황성수
    • 한국HCI학회논문지 / Vol. 12, No. 4 / pp.5-15 / 2017
  • Alternative computer-access devices are one way to meet the desire of people with physical disabilities to participate in social activities, and their importance is growing along with advances in information and communication technology. Most such devices let users access a computer with their feet or head, but given the characteristics of physical disabilities, controlling a mouse with the feet or head is not easy and is limited in speed and accuracy. This paper proposes a computer-access system for the physically disabled that overcomes the limitations of existing alternative-access devices. The proposed system moves the mouse pointer with the user's gaze alone using eye-tracking technology, allows mouse clicks through an external button that is relatively easy to press, and enables fast, easy text entry through speech recognition. It also provides detailed functions such as right-click, double-click, drag, an on-screen keyboard, Internet access, and scrolling, so that most tasks a computer offers can be performed.

명암 가중치를 이용한 반복 수렴 공간 모멘트기반 눈동자의 시선 추적 (Tracking of eyes based on the iterated spatial moment using weighted gray level)

  • 최우성;이규원
    • 한국정보통신학회논문지 / Vol. 14, No. 5 / pp.1240-1250 / 2010
  • This paper proposes an eye-tracking system that accurately extracts and tracks the user's eyes against complex backgrounds using an iterated spatial moment with gray-level weighting. Before searching for the eye region in images captured with a CCD camera, the face region is detected with Haar-like features to minimize the region of interest. The eye region is then detected using eigen-eyes, based on the eigenfaces of principal component analysis. Finally, feature points, the left, right, top, and bottom extremes of the eye, are located from the darkest part of the eye region, and accurate pupil gaze tracking is verified using the iterated, converging spatial moment with gray-level weighting.
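
The iterated, gray-level-weighted spatial moment described above can be sketched as a mean-shift-style loop: darker pixels get heavier weight, and the window is re-centred on the weighted centroid until it stops moving. The window size, iteration limit, and tolerance here are illustrative assumptions, not the paper's values:

```python
import numpy as np

def pupil_center(gray, start, half=15, iters=20, tol=0.5):
    """Iterated gray-level-weighted spatial moment (parameters assumed).
    gray: 2-D intensity image; start: (row, col) initial guess."""
    cy, cx = float(start[0]), float(start[1])
    for _ in range(iters):
        y0 = max(int(cy) - half, 0)
        x0 = max(int(cx) - half, 0)
        win = gray[y0:int(cy) + half + 1, x0:int(cx) + half + 1].astype(float)
        w = win.max() - win                 # invert intensity: dark -> heavy
        m00 = w.sum()                       # zeroth moment
        if m00 == 0:                        # uniform window: nothing to track
            break
        ys, xs = np.indices(win.shape)
        ny = y0 + (w * ys).sum() / m00      # first moments / m00 = centroid
        nx = x0 + (w * xs).sum() / m00
        if abs(ny - cy) < tol and abs(nx - cx) < tol:
            break                           # converged
        cy, cx = ny, nx                     # re-centre window and repeat
    return cy, cx
```

Because the pupil is the darkest structure in the eye region, the weighted centroid is pulled toward it even when the initial window is off-centre.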

명암 가중치를 이용한 공간 모멘트기반 눈동자 추적 (Tracking of eyes based on the spatial moment using weighted gray level)

  • 최우성;이규원;김관섭
    • 한국정보통신학회:학술대회논문집 / 한국해양정보통신학회 2009 Fall Conference / pp.198-201 / 2009
  • This paper proposes an eye-tracking system that accurately extracts and tracks the user's eyes against complex backgrounds using an iterated spatial moment with gray-level weighting. Before searching for the eye region in images captured with a CCD camera, the face region is detected with Haar-like features to minimize the region of interest. The eye region is then detected using eigen-eyes, based on the eigenfaces of principal component analysis. Finally, the eye's feature points are located from the darkest part of the eye region, and accurate pupil tracking is verified using the iterated, converging spatial moment with gray-level weighting.

관찰 문제에서 초등학생의 과학 학업성취도에 따른 안구운동 분석 (Analysis of Eye Movement by the Science Achievement Level of the Elementary Students on Observation Test)

  • 신원섭;신동훈
    • 한국초등과학교육학회지:초등과학교육 / Vol. 32, No. 2 / pp.185-197 / 2013
  • The purpose of this study was to analyze the differences in eye movements according to the science achievement of elementary school students in an observation situation. Science achievement was based on the results of the national achievement test conducted in 2012 with a random sampling of classes. As the assessment tool, two observation problems from the TSPS (Test of Science Process Skill, developed in 1994) suitable for an eye-tracking system were adopted. The subjects of this study were twenty sixth-grade students who agreed to participate in the research. SMI (SensoMotoric Instruments) iView X™ RED was used to collect eye-movement data, and the Experiment 3.1 and BeGaze 3.1 programs were used to design and analyze the experiment. As a result, eye movements in the observation test varied greatly in fixation duration, fixation frequency, saccade, saccade velocity, and eye blink according to the students' science achievement. Based on the results of the eye-movement analysis, heuristic-search eye movement is discussed as an alternative for improving underachievers' science achievement.
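
Metrics such as fixation duration and frequency are typically derived from raw gaze samples by a fixation-detection step; a common choice (not necessarily what BeGaze uses internally) is the dispersion-threshold algorithm (I-DT), sketched here with assumed thresholds:

```python
def fixations_idt(samples, max_disp=30.0, min_dur=0.1):
    """Dispersion-threshold (I-DT) fixation detection.
    samples: list of (t_seconds, x, y) gaze points, time-ordered.
    Returns a list of (start_time, duration, cx, cy) fixations.
    max_disp (px) and min_dur (s) are illustrative assumptions."""
    out, i, n = [], 0, len(samples)
    while i < n:
        j = i
        # grow the window while its bounding-box dispersion stays small
        while j + 1 < n:
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_disp:
                break
            j += 1
        dur = samples[j][0] - samples[i][0]
        if dur >= min_dur:                      # long enough to be a fixation
            pts = samples[i:j + 1]
            cx = sum(p[1] for p in pts) / len(pts)
            cy = sum(p[2] for p in pts) / len(pts)
            out.append((samples[i][0], dur, cx, cy))
            i = j + 1
        else:                                   # too short: advance one sample
            i += 1
    return out
```

Fixation frequency is then just the count of detected fixations, and the gaps between consecutive fixations correspond to saccades.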

비전 시스템을 이용한 로봇 머니퓰레이터의 동력학 추적 제어 (Dynamic tracking control of robot manipulators using vision system)

  • 한웅기;국태용
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 1997 한국자동제어학술회의논문집; 한국전력공사 서울연수원; 17-18 Oct. 1997 / pp.1816-1819 / 1997
  • Using a vision system, robotic tasks in unstructured environments can be accomplished, which greatly reduces the cost and setup time needed to fit the robotic system to well-defined, structured working environments. This paper proposes a dynamic control scheme for a robot manipulator with an eye-in-hand camera configuration. To perform tasks defined in the image plane, the camera motion Jacobian (image Jacobian) matrix is used to transform camera motion into the change of the object's position. In addition, a dynamic learning controller is designed to improve the tracking performance of the robotic system. The proposed control scheme is implemented for tasks of tracking moving objects and is shown to outperform the conventional visual servo system in convergence and in robustness to parameter uncertainty, disturbances, low sampling rate, etc.
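
The image Jacobian mentioned above has a standard closed form for a single point feature. A sketch in normalized camera coordinates, together with the classical velocity law it is usually paired with (the paper's learning controller is not reproduced here):

```python
import numpy as np

def image_jacobian(x, y, Z, f=1.0):
    """Interaction (image Jacobian) matrix of one point feature.
    Maps the camera twist [vx, vy, vz, wx, wy, wz] to the image-plane
    velocity [x_dot, y_dot]. x, y: normalized image coordinates;
    Z: feature depth, assumed known or estimated."""
    return np.array([
        [-f / Z,    0.0, x / Z, x * y,          -(f + x * x / f),  y],
        [   0.0, -f / Z, y / Z, f + y * y / f,  -x * y,           -x],
    ])

def ibvs_velocity(L, e, lam=0.5):
    """Classical image-based visual servoing law: drive the feature error
    e toward zero with camera twist v = -lambda * pinv(L) @ e."""
    return -lam * np.linalg.pinv(L) @ e
```

Stacking one such 2x6 block per tracked feature gives the full Jacobian; the pseudo-inverse then distributes the image error across the camera's six degrees of freedom.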

Development of a Non-contact Input System Based on User's Gaze-Tracking and Analysis of Input Factors

  • Jiyoung LIM;Seonjae LEE;Junbeom KIM;Yunseo KIM;Hae-Duck Joshua JEONG
    • 한국인공지능학회지 / Vol. 11, No. 1 / pp.9-15 / 2023
  • As mobile devices such as smartphones, tablets, and kiosks become increasingly prevalent, there is growing interest in developing alternative input systems in addition to traditional tools such as keyboards and mice. Many people use their own bodies as a pointer to enter simple information on a mobile device. However, methods using the body have limitations: psychological factors make the contact method unstable, especially during a pandemic, and there is a risk of shoulder-surfing attacks. To overcome these limitations, we propose a simple information input system that utilizes gaze-tracking technology to input passwords and control web surfing using only non-contact gaze. Our proposed system is designed to recognize an input when the user stares at a specific location on the screen in real time, using intelligent gaze-tracking technology. We present an analysis of the relationship among the gaze input box, gaze time, and average input time, and report experimental results on the effects of varying the size of the gaze input box and the gaze time required to achieve 100% accuracy in inputting information. Through this paper, we demonstrate the effectiveness of our system in mitigating the challenges of contact-based input methods and providing a non-contact alternative that is both secure and convenient.
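
Dwell-based gaze input of the kind described, where staring at a screen region for a fixed time registers an input, can be sketched as a small state machine; the box layout and dwell time below are illustrative assumptions, not the paper's measured values:

```python
class DwellSelector:
    """Fires a selection when gaze dwells inside one box long enough."""
    def __init__(self, boxes, dwell=1.0):
        self.boxes = boxes            # name -> (x0, y0, x1, y1), assumed layout
        self.dwell = dwell            # required dwell time in seconds (assumed)
        self.current = None           # box the gaze is currently inside
        self.since = 0.0              # time the gaze entered that box

    def update(self, t, gx, gy):
        """Feed one gaze sample (time t, point gx, gy).
        Returns the selected box name, or None if nothing fired."""
        hit = None
        for name, (x0, y0, x1, y1) in self.boxes.items():
            if x0 <= gx <= x1 and y0 <= gy <= y1:
                hit = name
                break
        if hit != self.current:       # gaze moved to a new box (or left all)
            self.current, self.since = hit, t
            return None
        if hit is not None and t - self.since >= self.dwell:
            self.current = None       # reset so one dwell fires only once
            return hit
        return None
```

The enter/leave reset is what makes dwell time the trade-off studied in the paper: a short dwell speeds up input but risks accidental selections ("Midas touch"), a long dwell does the reverse.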

Human Spatial Cognition Using Visual and Auditory Stimulation

  • Yu, Mi;Piao, Yong-Jun;Kim, Yong-Yook;Kwon, Tae-Kyu;Hong, Chul-Un;Kim, Nam-Gyun
    • International Journal of Precision Engineering and Manufacturing / Vol. 7, No. 2 / pp.41-45 / 2006
  • This paper deals with human spatial cognition using visual and auditory stimulation. More specifically, this investigation observes the relationship between the head and the eye motor systems in localizing visual target direction in space, and tries to describe the role of the right-side versus the left-side pinna. In the visual-stimulation experiment, nineteen red LEDs (luminescent diodes, brightness: 210 cd/m²) arrayed in the horizontal plane of the surrounding panel were used, with the LEDs located 10 degrees apart from each other. Physiological parameters such as EOG (electro-oculography), head movement, and their synergic control were measured with a BIOPAC system and 3SPACE FASTRAK. In the auditory-stimulation experiment, the function of one pinna was intentionally distorted by inserting a short tube into the ear canal, and the localization error caused by right- and left-side pinna distortion was investigated. Since a laser pointer showed much less error (0.5%) in localizing target position than the FASTRAK (30%) that has generally been used, the laser pointer was used for the pointing task. It was found that harmonic components are not essential for auditory target localization; rather, non-harmonic nearby frequency components are more important in localizing the target direction of a sound. The right pinna carries out one of the most important functions in localizing target direction, and a pure tone with only one frequency component is confusing to localize. It was also found that the latency time is shorter in self-moved tracking (SMT) than in eye-alone tracking (EAT) and eye-hand tracking (EHT). These results can be used in further study of the characterization of human spatial cognition.

인간공학적 조종실 설계를 위한 계기 탐색 형태에 관한 연구 (Investigation of the visual search patterns of the cockpit displays for the ergonomic cockpit design)

  • 송영웅;이종선
    • 대한안전경영과학회지 / Vol. 8, No. 2 / pp.71-80 / 2006
  • There are many display panels in a flight cockpit, and pilots get various flight information from those displays. The ergonomic layout of the displays must be determined based on frequency of use and sequence of use. This study investigated the visual search patterns over six display groups (one head-up display: HUD; two multi-function displays: MFDs; one engine group: EG; one flight display group: FD; and others) in a fighter aircraft. Four expert pilots conducted an imaginary flight in a physical mock-up, and their eye movements were collected using an eye-tracking system. Data on dwell time, frequency of use, and eye-movement path were collected. Pilots spent most of their time on the HUD (55.2%), followed by others (21.6%), FD (14.2%), right MFD (4.7%), EG (3.2%), and left MFD (1.1%) in descending order. Similarly, the HUD (42.8%) and others (30.0%) were the most frequently visited displays. These data can be used in the layout of cockpit displays and the determination of optimal visual search patterns.
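
The reported dwell-time percentages are a simple aggregation over fixations labelled by display group; a sketch, assuming the data has been reduced to (display name, fixation duration) records:

```python
from collections import defaultdict

def dwell_percentages(fixations):
    """fixations: iterable of (display_name, duration_seconds) records.
    Returns each display's share of total dwell time as a percentage."""
    records = list(fixations)
    total = sum(d for _, d in records)          # total dwell time
    acc = defaultdict(float)
    for name, d in records:
        acc[name] += d                          # accumulate per display group
    return {name: 100.0 * t / total for name, t in acc.items()}
```

Visit frequency is the analogous count-based aggregation, and together the two rankings drive the frequency-of-use layout principle the study applies.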

Method for Inference of Operators' Thoughts from Eye Movement Data in Nuclear Power Plants

  • Ha, Jun Su;Byon, Young-Ji;Baek, Joonsang;Seong, Poong Hyun
    • Nuclear Engineering and Technology / Vol. 48, No. 1 / pp.129-143 / 2016
  • Sometimes we need, or try, to figure out somebody's thoughts from his or her behaviors, such as eye movement, facial expression, gestures, and motions. In safety-critical and complex systems such as nuclear power plants, inferring operators' thoughts (their understanding or diagnosis of the current situation) could enable many useful applications, such as improved operator training programs, a new type of operator support system, and human-performance measures for human-factors validation. In this experimental study, a novel method for inferring an operator's thoughts from his or her eye-movement data is proposed and evaluated with a nuclear power plant simulator. In the experiments, about 80% of operators' thoughts could be inferred correctly using the proposed method.