• Title/Abstract/Keyword: eye tracking method

Search results: 98

Pupil Center Detection Method using Boundary Distortion Correction (경계선 왜곡 보정을 통한 동공중심 검출 방법)

  • Lee, Injae;Cho, Chul Woo;Lee, Hyeon Chang;Lee, Heekyung;Park, Kang Ryoung;Cha, Jihun
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2013.06a
    • /
    • pp.70-72
    • /
    • 2013
  • Because a gaze-tracking interface responds faster than interfaces based on other sensory channels, it can serve as an effective means of interaction and improve the user experience. Gaze tracking can therefore be applied in diverse areas such as eye-controlled mice for people with disabilities, driver gaze analysis, advertisement-effect monitoring, and next-generation games. This paper introduces a technique for improving the accuracy of a gaze-tracking interface. When tracking the gaze of actual users, the pupil is frequently occluded by the eyelid, and the pupil boundary is sometimes distorted by corneal specular reflections. These effects distort the detected pupil center and introduce errors into the gaze position. To address this problem, we propose a pupil-center detection method that corrects the distortions caused by the eyelid and corneal reflections.


Gaze Recognition Interface Development for Smart Wheelchair (지능형 휠체어를 위한 시선 인식 인터페이스 개발)

  • Park, S.H.
    • Journal of rehabilitation welfare engineering & assistive technology
    • /
    • v.5 no.1
    • /
    • pp.103-110
    • /
    • 2011
  • In this paper, we propose a gaze recognition interface for a smart wheelchair. The interface recognizes commands through gaze recognition and, while driving, avoids detected obstacles by sensing distances with range sensors. The smart wheelchair is composed of a gaze recognition and tracking module, a user interface module, an obstacle detector, a motor control module, and a range sensor module. The interface uses a camera with a built-in infrared filter and two LED light sources to determine which direction the pupils are turned and sends command codes to control the system, so it requires no per-user calibration step. Experimental results showed that the proposed interface controls the system accurately by recognizing the user's gaze direction.
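As a loose illustration of mapping pupil direction to wheelchair commands (my own simplification, not the paper's two-LED method), the snippet below thresholds a grayscale eye image under a dark-pupil assumption and classifies the pupil centroid into coarse commands; the threshold value and the three-way split are illustrative assumptions:

```python
import numpy as np

def gaze_command(eye_img, thresh=40):
    """Classify coarse gaze direction from a grayscale eye image.
    Dark-pupil assumption: pixels darker than `thresh` belong to the pupil."""
    ys, xs = np.nonzero(eye_img < thresh)
    if len(xs) == 0:
        return "none"                      # blink or no pupil visible
    cx = xs.mean() / eye_img.shape[1]      # normalized horizontal centroid
    if cx < 1 / 3:
        return "left"
    if cx > 2 / 3:
        return "right"
    return "forward"
```

A real system would debounce these commands over several frames before actuating the motors.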

Gaze Detection Based on Facial Features and Linear Interpolation on Mobile Devices (모바일 기기에서의 얼굴 특징점 및 선형 보간법 기반 시선 추적)

  • Ko, You-Jin;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society
    • /
    • v.12 no.8
    • /
    • pp.1089-1098
    • /
    • 2009
  • Recently, much research on building more comfortable input devices based on gaze detection has been performed in human-computer interfaces. Previous work targeted desktop environments with large monitors. With the recent growth of mobile devices, the need for gaze-based interfaces in mobile environments has also increased. In this paper, we study a gaze detection method using a UMPC (Ultra-Mobile PC) and its embedded camera, based on face and facial feature detection with an AAM (Active Appearance Model). This paper has three novelties. First, unlike previous research, we propose a method for tracking the user's gaze position on a mobile device with a small screen. Second, we use an AAM to detect facial feature points. Third, gaze detection accuracy does not degrade with Z distance, owing to the normalization of input features using features obtained in an initial user calibration stage. Experimental results showed a gaze detection error of 1.77 degrees, which was further reduced by mouse dragging based on additional facial movement.
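The linear interpolation named in the title can be sketched roughly as follows (an assumed per-axis scheme, not necessarily the authors' exact formulation): eye-feature coordinates recorded while the user fixates the four screen corners during calibration define the interpolation range, and a runtime measurement is mapped proportionally onto the screen:

```python
def gaze_to_screen(eye_xy, eye_corners, screen_w, screen_h):
    """Per-axis linear interpolation from an eye-feature coordinate to a
    screen coordinate. `eye_corners` are the eye-feature (x, y) values
    measured while the user fixated the four screen corners, ordered
    top-left, top-right, bottom-left, bottom-right."""
    tl, tr, bl, br = eye_corners
    left_x   = (tl[0] + bl[0]) / 2     # average the two left-edge samples
    right_x  = (tr[0] + br[0]) / 2
    top_y    = (tl[1] + tr[1]) / 2
    bottom_y = (bl[1] + br[1]) / 2
    u = (eye_xy[0] - left_x) / (right_x - left_x)
    v = (eye_xy[1] - top_y) / (bottom_y - top_y)
    return u * screen_w, v * screen_h
```

Averaging the edge samples is one simple way to tolerate a slightly non-rectangular calibration quadrilateral; a full bilinear or homography mapping would handle stronger distortion.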


A study on the relationship between gaze guidance and cybersickness using Eyetracking (시선 추적기법을 활용한 시선 유도와 사이버 멀미 관계 연구)

  • Lee, TaeGu;Ahn, ChanJe
    • The Journal of the Convergence on Culture Technology
    • /
    • v.8 no.3
    • /
    • pp.167-173
    • /
    • 2022
  • The virtual reality market grows every year, but the cybersickness that occurs in virtual reality has not yet been resolved. In this paper, the relationship between cybersickness and gaze guidance in virtual reality content was examined experimentally. Using eye tracking, we identified how gaze movement relates to cybersickness. The experiment was divided into two groups to determine whether gaze guidance affects cybersickness, and the results were further analyzed by gender within each group. Cybersickness was measured with the SSQ questionnaire. Through these two methods we sought to understand the relationship between gaze guidance and cybersickness. The experiment showed that clear gaze guidance concentrated the viewer's gaze and was effective against the cybersickness induced by camera rotation. We confirmed that concentrating the viewer's gaze through gaze-guided production helps alleviate cybersickness. We hope this result will serve producers who create virtual reality content as a way to mitigate cybersickness.

A Study on Eye Tracking Techniques using Wearable Devices (웨어러블향(向) 시선추적 기법에 관한 연구)

  • Jaehyuck Jang;Jiu Jung;Junghoon Park
    • Smart Media Journal
    • /
    • v.12 no.3
    • /
    • pp.19-29
    • /
    • 2023
  • Eye tracking technology is now widespread and demonstrates strong performance in both precision and convenience, suggesting the possibility of interfaces operated without touching a screen. It can become a new means of communication for people such as patients with Lou Gehrig's disease, who become paralyzed part by part until they can move only their eyes. Formerly such patients could do little but await death, unable even to communicate with their families. An interface that harnesses the eyes as a means of communication, although difficult to build, can help them. Dedicated eye tracking systems and equipment do exist on the market; nevertheless, obstacles such as operational complexity and prices above 12 million won ($9,300) hinder universal supply and coverage for these patients. Therefore, this paper proposes a wearable eye tracking device that can support minorities and vulnerable people and be obtained inexpensively, studies eye tracking methods to maximize the potential for future development, and finally presents a way to design and develop a low-cost eye tracking system based on a highly efficient wearable device.

Robust Gaze-Fixing of an Active Vision System under Variation of System Parameters (시스템 파라미터의 변동 하에서도 강건한 능동적인 비전의 시선 고정)

  • Han, Youngmo
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.1 no.3
    • /
    • pp.195-200
    • /
    • 2012
  • Camera steering is performed using the system parameters of the vision system. However, the parameters at the time of use may differ from those at the time they were measured. To compensate for this problem, this research proposes a gaze-steering method based on LMI (Linear Matrix Inequality) that is robust to variations in the system parameters of the vision system. Simulation results show that the proposed method produces smaller gaze-tracking error than a contemporary linear method and more stable gaze-tracking error than a contemporary nonlinear method. Moreover, the proposed method is fast enough for real-time processing.

Gaze Detection System using Real-time Active Vision Camera (실시간 능동 비전 카메라를 이용한 시선 위치 추적 시스템)

  • Park, Kang-Ryoung
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.12
    • /
    • pp.1228-1238
    • /
    • 2003
  • This paper presents a new, practical computer-vision method for detecting the monitor position at which the user is looking. In general, the user moves both the face and the eyes to gaze at a certain monitor position. Previous research used only one wide-view camera capturing the whole face; in that case the image resolution is too low, and the fine movements of the user's eyes cannot be detected exactly. We therefore implement a gaze detection system with dual cameras (a wide-view and a narrow-view camera). To locate the user's eye position accurately, the narrow-view camera provides auto focusing and auto panning/tilting based on the 3D facial feature positions detected by the wide-view camera. In addition, we use dual IR-LED illuminators to detect facial features, especially eye features. Experimental results show that the system runs in real time, with an RMS error of about 3.44 cm between the computed gaze positions and the real ones.
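A minimal sketch of the auto panning/tilting step (assumed geometry, not the paper's implementation): given a 3D eye position expressed in the pan/tilt unit's frame, the angles that bring it onto the narrow-view camera's optical axis (+Z, with pan about Y applied before tilt) follow from simple trigonometry:

```python
import math

def pan_tilt_to(point_xyz):
    """Pan and tilt angles (degrees) that aim the narrow-view camera's
    optical axis (+Z) at a 3D point given in the pan/tilt-unit frame."""
    x, y, z = point_xyz
    pan = math.degrees(math.atan2(x, z))                  # rotation about Y
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))  # then about X
    return pan, tilt
```

A real controller would also clamp the angles to the unit's mechanical range and rate-limit the commands.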

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions:PartB
    • /
    • v.16B no.3
    • /
    • pp.203-214
    • /
    • 2009
  • In this paper, the user's gaze position is computed simply from the 2D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without modeling the complex 3D relations among the camera, monitor, and pupil coordinates. The objectives of our paper are therefore to detect the pupil center and the four corneal specular reflections exactly and to compensate for the error factors that affect gaze accuracy. Our method compensates for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector, using a one-time user calibration performed when the system starts. Also, based on a Kalman filter, we robustly detect the four corneal specular reflections, which are essential for calculating the gaze position, even under abrupt changes of eye movement. Experimental results showed that the gaze detection error was about 1.0 degree despite abrupt eye movements.
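The Kalman filtering used to keep tracking the reflections can be illustrated with a generic constant-velocity filter for a single glint (a textbook sketch, not the paper's tuned filter; the noise parameters `q` and `r` are placeholder assumptions):

```python
import numpy as np

class Kalman2D:
    """Constant-velocity Kalman filter for one corneal reflection (glint).
    State: [x, y, vx, vy]; measurement: [x, y]."""
    def __init__(self, q=1e-2, r=1.0):
        self.x = np.zeros(4)                  # state estimate
        self.P = np.eye(4) * 100.0            # large initial uncertainty
        self.F = np.eye(4)                    # transition: x += vx, y += vy
        self.F[0, 2] = self.F[1, 3] = 1.0
        self.H = np.zeros((2, 4))             # observe position only
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * q                # process noise
        self.R = np.eye(2) * r                # measurement noise

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with measurement z = [x, y]
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, dtype=float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

The velocity states are what let the filter predict a glint's position through a saccade-like jump, so detection can resume in a small search window; the paper tracks four such reflections.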

Digital Library Interface Research Based on EEG, Eye-Tracking, and Artificial Intelligence Technologies: Focusing on the Utilization of Implicit Relevance Feedback (뇌파, 시선추적 및 인공지능 기술에 기반한 디지털 도서관 인터페이스 연구: 암묵적 적합성 피드백 활용을 중심으로)

  • Hyun-Hee Kim;Yong-Ho Kim
    • Journal of the Korean Society for information Management
    • /
    • v.41 no.1
    • /
    • pp.261-282
    • /
    • 2024
  • This study proposed and evaluated electroencephalography (EEG)-based and eye-tracking-based methods to determine relevance by utilizing users' implicit relevance feedback while navigating content in a digital library. For this, EEG/eye-tracking experiments were conducted on 32 participants using video, image, and text data. To assess the usefulness of the proposed methods, deep learning-based artificial intelligence (AI) techniques were used as a competitive benchmark. The evaluation results showed that EEG component-based methods (av_P600 and f_P3b components) demonstrated high classification accuracy in selecting relevant videos and images (faces/emotions). In contrast, AI-based methods, specifically object recognition and natural language processing, showed high classification accuracy for selecting images (objects) and texts (newspaper articles). Finally, guidelines for implementing a digital library interface based on EEG, eye-tracking, and artificial intelligence technologies have been proposed. Specifically, a system model based on implicit relevance feedback has been presented. Moreover, to enhance classification accuracy, methods suitable for each media type have been suggested, including EEG-based, eye-tracking-based, and AI-based approaches.

Gaze Detection System by IR-LED based Camera (적외선 조명 카메라를 이용한 시선 위치 추적 시스템)

  • Park, Kang-Ryoung
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.29 no.4C
    • /
    • pp.494-504
    • /
    • 2004
  • Research on gaze detection has advanced considerably, with many applications. Most previous work relies solely on image-processing algorithms, which take much processing time and impose many constraints. In our work, we implement gaze detection as a computer vision system with a single IR-LED-based camera. To detect the gaze position, we locate facial features, which is performed effectively with the IR-LED-based camera and an SVM (Support Vector Machine). When a user gazes at a monitor position, we compute the 3D positions of those features based on 3D rotation and translation estimation and an affine transform. Finally, the gaze position due to facial movement is computed from the normal vector of the plane determined by the computed 3D feature positions. In addition, we use a trained neural network to detect the gaze position due to eye movement. Experimental results give the facial and eye gaze positions on the monitor, with an RMS error of about 4.2 cm between the computed positions and the real ones.
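The step of computing the facial gaze direction as the normal of the plane determined by the 3D feature positions reduces to a cross product; a minimal sketch, assuming three non-collinear feature points are available:

```python
import numpy as np

def face_normal(p1, p2, p3):
    """Unit normal of the plane through three 3D facial feature points,
    usable as a coarse facial gaze direction."""
    n = np.cross(np.subtract(p2, p1), np.subtract(p3, p1))
    return n / np.linalg.norm(n)
```

With more than three features, a least-squares plane fit would be the natural generalization; the sign of the normal must be disambiguated so it points from the face toward the monitor.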