• Title/Summary/Keyword: Eye gaze information


Gaze Detection in Head Mounted Camera environment (Head Mounted Camera 환경에서 응시위치 추적)

  • 이철한;이정준;김재희
    • Proceedings of the IEEK Conference / 2000.11d / pp.25-28 / 2000
  • Gaze detection finds the position on a monitor screen where a user is looking, using computer vision. Such a system can help the handicapped use a computer, substitute for an expensive touch screen, and support navigation in virtual reality. There are two main approaches to gaze detection: locating the gaze position from face movement, and from eye movement. Gaze detection by eye movement uses either special devices or image processing. In this paper, we detect not the iris but the pupil from images captured by a head-mounted camera with infrared light, and accurately locate the position the user is looking at by an affine transform.

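The affine mapping from pupil coordinates to screen coordinates described in the abstract above can be sketched as follows; the matrix and offset values are hypothetical stand-ins for what a per-user calibration would actually produce.

```python
import numpy as np

# Hypothetical affine transform from pupil coordinates (pixels in the
# head-mounted camera image) to monitor coordinates. A and b would be
# estimated during calibration; these values are illustrative only.
A = np.array([[12.0, 0.5],
              [0.3, 15.0]])        # 2x2 linear part
b = np.array([-3000.0, -4200.0])   # translation part

def pupil_to_screen(pupil_xy):
    """Map a detected pupil center to a monitor position: s = A @ p + b."""
    return A @ np.asarray(pupil_xy, dtype=float) + b

screen = pupil_to_screen([260.0, 300.0])
```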

Compensation for Fast Head Movements on Non-intrusive Eye Gaze Tracking System Using Kalman Filter (Kalman filter를 이용한 비접촉식 응시점 추정 시스템에서의 빠른 머리 이동의 보정)

  • Kim, Soo-Chan;Yoo, Jae-Ha;Kim, Deok-Won
    • Journal of the Institute of Electronics Engineers of Korea SC / v.44 no.6 / pp.35-41 / 2007
  • We propose an eye gaze tracking system that works under natural head movements. The system consists of one CCD (charge-coupled device) camera and two front-surface mirrors. The mirrors rotate to follow head movements in order to keep the eye within the view of the camera. However, the mirror controller cannot keep up with fast head movements, because the frame rate is generally 30 Hz. To overcome this problem, we apply a Kalman filter to estimate the next eye position from the current eye image. In our results, the system allowed the subject's head to move 60 cm horizontally and 40 cm vertically, with head movement speeds of about 55 cm/sec and 45 cm/sec, respectively. The spatial gaze resolutions were about 4.5 and 5.0 degrees, respectively, and the gaze estimation accuracy was 92% under natural head movements.
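The Kalman filter step described above, predicting the next eye position so the mirror controller can stay ahead of the head, can be sketched with a constant-velocity state model; the noise covariances below are illustrative tuning assumptions, not the paper's values.

```python
import numpy as np

dt = 1.0 / 30.0  # 30 Hz frame rate, as in the paper

# Constant-velocity state: [x, y, vx, vy]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # only position is measured
Q = np.eye(4) * 0.01    # process noise (hypothetical tuning)
R = np.eye(2) * 1.0     # measurement noise (hypothetical tuning)

x = np.zeros(4)         # initial state
P = np.eye(4) * 100.0   # initial uncertainty

def kalman_step(x, P, z):
    """One predict/update cycle; also returns the predicted next position."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the measured eye position z
    y = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(4) - K @ H) @ P_pred
    # Where the eye is expected next frame (used to steer the mirrors)
    next_pos = (F @ x_new)[:2]
    return x_new, P_new, next_pos

# Feed a few frames of an eye moving right at ~55 cm/s
for t in range(5):
    z = np.array([55.0 * dt * t, 0.0])
    x, P, next_pos = kalman_step(x, P, z)
```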

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • Hwang, Suen-Ki;Kim, Moon-Hwan;Cha, Sam;Cho, Eun-Seuk;Bae, Cheol-Soo
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.2 no.3 / pp.61-69 / 2009
  • In this paper, we propose a new approach to real-time eye gaze tracking. Existing methods produce poor results even for small head movements and require a calibration process for each user. The proposed method uses infrared illumination and a Generalized Regression Neural Network (GRNN). With GRNN, reliable and accurate gaze tracking is possible even under large head movements; the per-user calibration process can be omitted because the mapping function generalizes, so gaze tracking is also possible for users who did not participate in training. Experimental results show an average gaze tracking accuracy of 90% under facial movements and 85% for users not in the training set.


Eye Contact System Using Depth Fusion for Immersive Videoconferencing (실감형 화상 회의를 위해 깊이정보 혼합을 사용한 시선 맞춤 시스템)

  • Jang, Woo-Seok;Lee, Mi Suk;Ho, Yo-Sung
    • Journal of the Institute of Electronics and Information Engineers / v.52 no.7 / pp.93-99 / 2015
  • In this paper, we propose a gaze correction method for realistic video teleconferencing. Typically, cameras used in teleconferencing are installed at the side of the display monitor, not at its center, which makes it difficult for users to make eye contact; eye contact is essential for immersive videoconferencing. In the proposed method, we use a stereo camera and a depth camera to correct the gaze. The depth camera is a Kinect, which is relatively cheap and estimates depth information efficiently; however, it has some inherent disadvantages. Therefore, we fuse the Kinect with the stereo camera to compensate for those disadvantages. Then, to generate the gaze-corrected image, view synthesis is performed by 3D warping according to the depth information. Experimental results verify that the proposed system is effective in generating natural gaze-corrected images.
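The depth-based view synthesis (3D warping) step can be illustrated in a simplified, rectified form: each pixel is shifted horizontally by a disparity proportional to focal length and baseline and inversely proportional to depth. The focal length and baseline below are hypothetical values, and hole filling is omitted.

```python
import numpy as np

def warp_to_virtual_view(image, depth, f, baseline):
    """Forward-warp pixels horizontally by disparity = f * baseline / depth,
    a simplified 1-D version of depth-image-based 3D warping."""
    h, w = depth.shape
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            d = f * baseline / depth[y, x]      # disparity in pixels
            nx = int(round(x + d))
            if 0 <= nx < w:
                out[y, nx] = image[y, x]        # uncovered pixels stay 0 (holes)
    return out

img = np.arange(16, dtype=float).reshape(4, 4)
dep = np.full((4, 4), 100.0)          # constant depth -> uniform 1-pixel shift
virt = warp_to_virtual_view(img, dep, f=500.0, baseline=0.2)
```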

3D First Person Shooting Game by Using Eye Gaze Tracking (눈동자 시선 추적에 의한 3차원 1인칭 슈팅 게임)

  • Lee, Eui-Chul;Park, Kang-Ryoung
    • The KIPS Transactions:PartB / v.12B no.4 s.100 / pp.465-472 / 2005
  • In this paper, we propose a method of controlling the gaze direction of a 3D FPS game character by eye gaze detection from successive images captured by a USB camera attached beneath an HMD. The proposed method consists of three parts. First, we detect the user's pupil center with a real-time image processing algorithm applied to the successive input images. Second, in a calibration step, the geometric relationship between gaze positions on the monitor and the detected pupil centers is determined while the user gazes at the monitor plane. Third, the final gaze position on the HMD monitor is tracked, and the 3D view in the game is controlled by that gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with the view direction of the game character.
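The calibration step described above, relating recorded pupil centers to known monitor targets, can be sketched as a least-squares fit of an affine mapping; the calibration points below are hypothetical.

```python
import numpy as np

# Hypothetical calibration data: pupil centers recorded while the user
# gazed at known monitor positions (e.g. a grid of targets).
pupil = np.array([[100.0, 100.0],
                  [200.0, 100.0],
                  [100.0, 200.0],
                  [200.0, 200.0]])
screen = np.array([[0.0, 0.0],
                   [1280.0, 0.0],
                   [0.0, 1024.0],
                   [1280.0, 1024.0]])

# Fit an affine map screen = [px, py, 1] @ M by linear least squares.
X = np.hstack([pupil, np.ones((len(pupil), 1))])   # homogeneous coordinates
M, *_ = np.linalg.lstsq(X, screen, rcond=None)

def gaze_position(pupil_xy):
    """Apply the calibrated mapping to a new pupil center."""
    px, py = pupil_xy
    return np.array([px, py, 1.0]) @ M

pos = gaze_position([150.0, 150.0])
```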

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • Cho, Hyeon-Seob;Kim, Hee-Sook
    • Journal of the Korea Academia-Industrial cooperation Society / v.6 no.2 / pp.195-201 / 2005
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

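The GRNN used for the gaze mapping above is, in essence, Nadaraya-Watson kernel regression: a prediction is a Gaussian-weighted average of the stored training targets. The toy pupil features and screen targets below are made up for illustration; the paper's actual input features differ.

```python
import numpy as np

class GRNN:
    """Generalized Regression Neural Network: a Gaussian-kernel weighted
    average of the training targets (Nadaraya-Watson regression)."""
    def __init__(self, sigma=0.5):
        self.sigma = sigma
    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y, dtype=float)
        return self
    def predict(self, x):
        # Squared distances to every training sample
        d2 = np.sum((self.X - np.asarray(x, dtype=float)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))
        return w @ self.y / w.sum()

# Hypothetical training set: pupil feature vectors -> screen coordinates.
X = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]]
net = GRNN(sigma=0.5).fit(X, y)
pred = net.predict([0.5, 0.5])
```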

Improving Eye-gaze Mouse System Using Mouth Open Detection and Pop Up Menu (입 벌림 인식과 팝업 메뉴를 이용한 시선추적 마우스 시스템 성능 개선)

  • Byeon, Ju Yeong;Jung, Keechul
    • Journal of Korea Multimedia Society / v.23 no.12 / pp.1454-1463 / 2020
  • An important factor in an eye-tracking PC interface for patients with general paralysis is the implementation of a mouse interface for manipulating the GUI. With a successfully implemented mouse interface, users can generate mouse events exactly at the point of their choosing. However, this interaction is difficult to define in an eye-tracking interface. This issue, known as the Midas touch problem, has been a major focus of eye-tracking research. There have been many attempts to solve it using blinking, voice input, etc., but these are not suitable for patients with general paralysis, some of whom cannot wink or speak. In this paper, we propose a mouth-pop-up eye-tracking mouse interface that solves the Midas touch problem and suits such patients, using a common RGB camera. The proposed interface detects the opening and closing of the mouth to activate a pop-up menu from which the user selects a mouse event. After implementation, a performance experiment showed that the number of malfunctions and the time to perform tasks were reduced compared with the existing method.
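The mouth open/close trigger can be sketched with a mouth aspect ratio (MAR) over lip landmarks, i.e. the ratio of vertical lip opening to mouth width. This is a common approach, though the abstract does not specify the exact features used; the threshold below is a hypothetical value that would be tuned per camera.

```python
import numpy as np

def mouth_open(landmarks, threshold=0.5):
    """Decide whether the mouth is open from four lip landmarks.
    landmarks: dict with 'left', 'right', 'top', 'bottom' (x, y) points."""
    left, right = np.array(landmarks['left']), np.array(landmarks['right'])
    top, bottom = np.array(landmarks['top']), np.array(landmarks['bottom'])
    mar = np.linalg.norm(top - bottom) / np.linalg.norm(left - right)
    return mar > threshold

# Hypothetical landmark positions (pixels) for a closed and an open mouth
closed = {'left': (0, 0), 'right': (60, 0), 'top': (30, -5), 'bottom': (30, 5)}
opened = {'left': (0, 0), 'right': (60, 0), 'top': (30, -20), 'bottom': (30, 20)}
```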

The Effect of Gaze Angle on Muscle Activity and Kinematic Variables during Treadmill Walking

  • Kim, Bo-Suk;Jung, Jae-Hu;Chae, Woen-Sik
    • Korean Journal of Applied Biomechanics / v.27 no.1 / pp.35-43 / 2017
  • Objective: The purpose of this study was to determine how gaze angle affects muscle activity and kinematic variables during treadmill walking, and to provide scientific information for an effective and safe treadmill training environment. Method: Ten male subjects with no musculoskeletal disorder were recruited. Eight pairs of surface electrodes were attached to the right side of the body to monitor the upper trapezius (UT), rectus abdominis (RA), erector spinae (ES), rectus femoris (RF), biceps femoris (BF), tibialis anterior (TA), medial gastrocnemius (MG), and lateral gastrocnemius (LG). Two digital camcorders were used to obtain 3-D kinematics of the lower extremity. Each subject walked on a treadmill at 5.0 km/h with a TV monitor at three different heights (eye level, EL; 20% above eye level, AE; 20% below eye level, BE). For each trial analyzed, five critical instants and four phases were identified from the video recording. For each dependent variable, one-way ANOVA with repeated measures was used to determine whether there were significant differences among the three conditions (p<.05); when a significant difference was found, post hoc analyses were performed using the contrast procedure. Results: Average and peak IEMG values for EL were generally smaller than the corresponding values for AE and BE, but the differences were not statistically significant. There were also no significant changes in kinematic variables among the three gaze angles. Conclusion: Based on these results, gaze angle does not affect muscle activity or kinematic variables during treadmill walking. However, walking with BE may increase the muscle activity of the trapezius and the lower extremity, and may hinder proper dorsiflexion during the landing phase. Thus, it seems reasonable to suggest that inappropriate gaze angles should be avoided during treadmill walking. Increased walking speed would likely cause significant changes in the biomechanical parameters used in this study; future studies similar to the present investigation but using different walking speeds are recommended.
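The average and peak IEMG values compared in the study are typically obtained by full-wave rectifying the raw EMG and integrating it over the analysis window; a minimal sketch with a synthetic signal, under that assumption:

```python
import numpy as np

def iemg(emg, fs):
    """Integrated EMG: rectify the signal and integrate it over the
    analysis window (rectangular rule)."""
    return np.sum(np.abs(emg)) / fs

fs = 1000.0                            # sampling rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)
emg = np.sin(2 * np.pi * 50 * t)       # synthetic 50 Hz signal, 1 s long
value = iemg(emg, fs)                  # approx 2/pi for a unit sinusoid
```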

Gaze Detection by Computing Facial and Eye Movement (얼굴 및 눈동자 움직임에 의한 시선 위치 추적)

  • Park, Kang-Ryoung
    • Journal of the Institute of Electronics Engineers of Korea SP / v.41 no.2 / pp.79-88 / 2004
  • Gaze detection locates the position on a monitor screen where a user is looking, by computer vision. Gaze detection systems have numerous fields of application: man-machine interfaces that help the handicapped use computers, and view control in three-dimensional simulation programs. In our work, we implement such a system with a single IR-LED based camera. To detect the gaze position, we locate facial features, which is performed effectively with the IR-LED based camera and an SVM (Support Vector Machine). When a user gazes at a position on the monitor, we compute the 3D positions of those features based on 3D rotation and translation estimation and an affine transform. The gaze position due to facial movement is then computed from the normal vector of the plane determined by the computed 3D feature positions. In addition, we use a trained neural network to detect the gaze position due to eye movement. Experimental results show that we can obtain the facial and eye gaze position on a monitor, with an accuracy of about 4.8 cm RMS error between the computed and real positions.
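The facial gaze direction described above, the normal vector of the plane through the computed 3D feature positions, reduces to a cross product of two edge vectors; the feature coordinates below are hypothetical.

```python
import numpy as np

def face_normal(p1, p2, p3):
    """Unit normal of the plane through three 3-D facial feature points,
    computed with a cross product."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

# Hypothetical 3-D positions (cm) of two eye corners and the nose tip,
# all at the same depth, so the face looks straight at the camera.
n = face_normal([-3.0, 0.0, 50.0], [3.0, 0.0, 50.0], [0.0, -4.0, 50.0])
```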

Gaze Detection System by IR-LED based Camera (적외선 조명 카메라를 이용한 시선 위치 추적 시스템)

  • Park, Kang-Ryoung
    • The Journal of Korean Institute of Communications and Information Sciences / v.29 no.4C / pp.494-504 / 2004
  • Research on gaze detection has advanced considerably and has many applications. Most previous approaches rely only on image processing algorithms, so they require much processing time and impose many constraints. In our work, we implement a gaze detection system with a single IR-LED based camera. To detect the gaze position, we locate facial features, which is performed effectively with the IR-LED based camera and an SVM (Support Vector Machine). When a user gazes at a position on the monitor, we compute the 3D positions of those features based on 3D rotation and translation estimation and an affine transform. The gaze position due to facial movement is then computed from the normal vector of the plane determined by the computed 3D feature positions. In addition, we use a trained neural network to detect the gaze position due to eye movement. Experimental results show that we can obtain the facial and eye gaze position on a monitor, with an accuracy of about 4.2 cm RMS error between the computed and real positions.