• Title/Abstract/Keyword: eye-position tracking


Center Position Tracking Enhancement of Eyes and Iris on the Facial Image

  • Chai Duck-hyun;Ryu Kwang-ryol
    • Journal of information and communication convergence engineering
    • /
    • Vol. 3, No. 2
    • /
    • pp.110-113
    • /
    • 2005
  • An enhancement of the tracking capability for the center positions of the eye and iris on a facial image is presented. A facial image is acquired with a CCD camera and converted into a binary image. The eye region, characterized by a specific brightness and shape, is located with the FRM method using five neighboring mask areas, and the iris within the eye is tracked with the FPDP method. The experimental results show that the proposed methods enhance the center-position tracking capability compared with the pixel average coordinate values method.
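
For reference, the baseline the authors compare against, the pixel average coordinate values method, amounts to taking the centroid of the dark pixels in the binarized eye region. The sketch below illustrates that baseline only (the FRM and FPDP methods themselves are not specified in the abstract); the threshold value is an assumption.

```python
import numpy as np
import cv2

def centroid_of_dark_pixels(gray, thresh=60):
    """Pixel-average (x, y) position of pixels darker than `thresh`."""
    # Binarize: dark (eye/iris) pixels become foreground.
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        raise ValueError("no pixels below threshold")
    return float(xs.mean()), float(ys.mean())
```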

An Efficient Camera Calibration Method for Head Pose Tracking

  • 박경수;임창주;이경태
    • 대한인간공학회지
    • /
    • Vol. 19, No. 1
    • /
    • pp.77-90
    • /
    • 2000
  • The aim of this study is to develop and evaluate an efficient camera calibration method for vision-based head tracking. Tracking head movements is important in the design of an eye-controlled human/computer interface, and a vision-based head tracking system was proposed to allow for the user's head movements in such an interface. We propose an efficient camera calibration method to track the 3D position and orientation of the user's head accurately, and we evaluate its performance. The experimental error analysis shows that the proposed method provides a more accurate and stable camera pose (i.e., position and orientation) than the conventional direct linear transformation (DLT) method that has been used for camera calibration. The results of this study can be applied to tracking head movements for eye-controlled human/computer interfaces and virtual reality technology.
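
For context, the conventional DLT baseline mentioned in the abstract estimates the 3x4 projection matrix from known 3D-2D point correspondences. A minimal sketch of that standard procedure follows; it is not the paper's proposed method, and the variable names are illustrative.

```python
import numpy as np

def dlt_projection_matrix(pts3d, pts2d):
    """Estimate the 3x4 projection matrix P from n >= 6 correspondences.

    pts3d: (n, 3) world points; pts2d: (n, 2) image points.
    """
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # Least-squares solution (up to scale): the right singular vector
    # of A with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)
```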


A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 7, No. 4
    • /
    • pp.834-848
    • /
    • 2013
  • To use a smartphone's many functions effectively, several kinds of human-phone interfaces are employed, such as touch, voice, and gesture. However, the touch interface, the most important of these, cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of front-facing cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for smartphone-based gaze tracking. Second, facial movement is allowed as long as one eye region is included in the image. Third, the method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflection rectangle and the screen. Fifth, a prototype mock-up module was built to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of 480×800 and that the average hit ratio on a 5×4 icon grid was 94.6%.
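
One plausible reading of the fourth point, sketched below under our own assumptions, is that the pupil center is normalized within the rectangle spanned by the two LED reflections (glints) and scaled to screen pixels. The paper's exact geometric relation is not given in the abstract.

```python
import numpy as np

def gaze_on_screen(pupil, glint_a, glint_b, screen_w=480, screen_h=800):
    """Map an image-space pupil center to screen pixels.

    Assumes (our assumption) that the two glints mark opposite corners
    of the reflection rectangle that corresponds to the screen.
    """
    pupil, a, b = (np.asarray(p, dtype=float) for p in (pupil, glint_a, glint_b))
    lo, hi = np.minimum(a, b), np.maximum(a, b)
    rel = np.clip((pupil - lo) / np.maximum(hi - lo, 1e-6), 0.0, 1.0)
    return float(rel[0] * screen_w), float(rel[1] * screen_h)
```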

Development of 3D Display System for Video-guide Operation

  • Honda, Toshio;Suzuki, Kou;Kuboshima, Yasuhito;Shiina, Tatsuo
    • Korean Information Display Society: Conference Proceedings
    • /
    • Korean Information Display Society, 7th International Meeting on Information Display (2007), Vol. 7, No. 2
    • /
    • pp.1799-1802
    • /
    • 2007
  • In the constructed auto-stereoscopic display system for one observer: (1) stereoscopic images displayed on a special LCD are formed on a large concave mirror; (2) a view-zone-limiting aperture is set between the projection lens and the concave mirror; (3) the concave mirror forms a real image of the aperture at the observer's eye position; (4) the view zone thereby tracks the observer's eye position; and (5) at the same time, the stereoscopic image changes automatically according to the observer's eye position.


Developing Head/Eye Tracking System and Sync Verification

  • 김정호;이대우;허세종;박찬국;백광열;방효충
    • 제어로봇시스템학회논문지
    • /
    • Vol. 16, No. 1
    • /
    • pp.90-95
    • /
    • 2010
  • This paper describes the development of an integrated head and eye tracker system. The vision-based head tracker achieves a 7 mm error over a 300 mm translation. The epipolar method and point matching are used to determine the head position and rotation angle. High-brightness LEDs are installed on the helmet, and their installation pattern is critical for matching points across the stereo system. The eye tracker also uses LEDs for constant illumination. The position of a gazed object at a 3 m distance is determined by pupil tracking, with an eye tracker error of 1-5 pixels. Integrating the output data of the two tracking systems is important: RS-232C communication links the integrated system, and a triggering signal is used for synchronization.
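
Since the head tracker matches LED points across a stereo pair, a standard linear triangulation step, sketched below, recovers each marker's 3D position from its matched image points. This is generic stereo geometry, not the paper's exact implementation, and it assumes calibrated 3x4 projection matrices P1 and P2.

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Linearly triangulate one 3D point from a matched stereo pair.

    P1, P2: (3, 4) camera projection matrices; pt1, pt2: (x, y) image points.
    """
    (u1, v1), (u2, v2) = pt1, pt2
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize to (x, y, z)
```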

Real Time System Realization for Binocular Eyeball Tracking Screen Cursor

  • Ryu Kwang-Ryol;Chai Duck-Hyun;Sclabassi Robert J.
    • Korea Institute of Information and Communication Engineering: Conference Proceedings
    • /
    • Korean Institute of Maritime Information and Communication Sciences, 2006 Spring Conference
    • /
    • pp.841-846
    • /
    • 2006
  • A real-time system for controlling a cursor on a computer monitor screen by binocular eyeball tracking is presented in this paper. To find the irises and track the cursor, a facial image is acquired with a small CCD camera and converted into a binary image; the two eyes are located using the five-region mask method around the eye area, and each iris is found with the four-point diagonal positioning method applied at its sides. The cursor tracks the measured movement of the iris center, and cursor control is achieved by comparing the maximum iris movement with the corresponding cursor movement to relate the gazing position to a distance on the screen. Experimental results were obtained by testing the system on several adults.
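
The cursor mapping described here appears to be proportional: the iris displacement is scaled by the ratio of the screen range to the maximum iris excursion. A hedged sketch under that assumption (calibration values are placeholders):

```python
def iris_to_cursor(iris_xy, rest_xy, max_dx, max_dy,
                   screen_w=1920, screen_h=1080):
    """Map an iris-center offset (image pixels) to a cursor position.

    rest_xy: iris center when looking at the screen center;
    max_dx, max_dy: maximum iris excursion measured during calibration.
    """
    dx, dy = iris_xy[0] - rest_xy[0], iris_xy[1] - rest_xy[1]
    x = screen_w / 2 + dx * (screen_w / 2) / max_dx
    y = screen_h / 2 + dy * (screen_h / 2) / max_dy
    # Clamp so the cursor stays on screen.
    return (min(max(x, 0), screen_w - 1), min(max(y, 0), screen_h - 1))
```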


Autonomous Traveling of Unmanned Golf-Car using GPS and Vision System

  • 정병묵;여인주;조지승
    • 한국정밀공학회지
    • /
    • Vol. 26, No. 6
    • /
    • pp.74-80
    • /
    • 2009
  • Path tracking is the basis of autonomous driving and navigation for an unmanned vehicle. For path tracking, it is very important to determine the exact position of the vehicle. GPS is used to obtain the vehicle's position, and a direction sensor and a velocity sensor are used to compensate for the GPS position error. To detect path lines in a road image, the bird's-eye-view transform is employed, which makes it easier to design a lateral control algorithm than working from the perspective view of the image. Because the vehicle's speed should be reduced in curved lanes and at crossroads, we also propose a speed control algorithm that uses GPS and image data. The control algorithms were simulated and tested on the basis of an expert driver's knowledge data. The experimental results show that the bird's-eye-view transform works well for steering control and that the speed control algorithm is stable in real driving.
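
The bird's-eye-view step is a standard perspective warp: four points on the road plane in the camera image are mapped to a rectangle in the top-down view. A minimal sketch with illustrative (assumed) corner coordinates:

```python
import numpy as np
import cv2

def birds_eye_view(frame):
    """Warp a road image to a top-down view for lane detection."""
    h, w = frame.shape[:2]
    # Trapezoid on the road in the camera image -> rectangle in the
    # top-down view (corner positions are illustrative).
    src = np.float32([[w * 0.40, h * 0.60], [w * 0.60, h * 0.60],
                      [w * 0.95, h * 0.95], [w * 0.05, h * 0.95]])
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, M, (w, h))
```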

Viewing Angle-Improved 3D Integral Imaging Display with Eye Tracking Sensor

  • Hong, Seokmin;Shin, Donghak;Lee, Joon-Jae;Lee, Byung-Gook
    • Journal of information and communication convergence engineering
    • /
    • Vol. 12, No. 4
    • /
    • pp.208-214
    • /
    • 2014
  • In this paper, in order to solve the problems of a narrow viewing angle and the flip effect in a three-dimensional (3D) integral imaging display, we propose an improved system that uses an eye tracking method based on the Kinect sensor. The proposed method introduces two calibration processes. The first calibrates the two cameras within the Kinect sensor so that specific 3D information can be collected. The second performs a space calibration for the coordinate conversion between the Kinect sensor and the coordinate system of the display panel. These calibration processes improve the estimation of the 3D position of the observer's eyes and allow elemental images to be generated at real-time speed from the estimated position. To show the usefulness of the proposed method, we implemented an integral imaging display system using this eye tracking process and carried out preliminary experiments measuring the viewing angle and the flip effect for the reconstructed 3D images. The experimental results reveal that the proposed method extends the viewing angle and removes the flipped images compared with the conventional system.
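
The abstract only states that a space calibration converts coordinates between the Kinect and the display panel; one common way to realize such a conversion, sketched below as an assumption rather than the paper's method, is to estimate a rigid transform (R, t) from corresponding 3D points with the Kabsch algorithm and apply it to each tracked eye position.

```python
import numpy as np

def rigid_transform(src, dst):
    """Kabsch: least-squares R, t such that dst ~= R @ src + t.

    src, dst: (n, 3) corresponding 3D points in the two coordinate systems.
    """
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def kinect_to_display(eye_xyz, R, t):
    """Convert a tracked eye position from Kinect to display coordinates."""
    return R @ np.asarray(eye_xyz, dtype=float) + t
```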

Head Tracking System Using Image Processing

  • 박경수;임창주;반영환;장필식
    • 대한인간공학회지
    • /
    • Vol. 16, No. 3
    • /
    • pp.1-10
    • /
    • 1997
  • This paper is concerned with the development and evaluation of a camera calibration method for a real-time head tracking system. Tracking head movements is important in the design of eye-controlled human/computer interfaces and in virtual environments. We propose a video-based head tracking system in which a camera mounted on the subject's head captures the front view containing eight 3-dimensional reference points (passive retro-reflecting markers) fixed at known positions on a computer monitor. The reference points are captured by an image processing board and used to calculate the 3-dimensional position and orientation of the camera. A camera calibration method for providing accurate extrinsic camera parameters is proposed. The method has three steps. In the first step, the image center is calibrated using the method of varying focal length. In the second step, the focal length and the scale factor are calibrated from the Direct Linear Transformation (DLT) matrix obtained from the known position and orientation of the camera. In the third step, the position and orientation of the camera are calculated from the DLT matrix, using the calibrated intrinsic camera parameters. Experimental results showed that the average error of the 3-dimensional camera position is about 0.53 cm, the angular errors of the camera orientation are less than 0.55°, and the data acquisition rate is about 10 Hz. The results of this study can be applied to tracking head movements for eye-controlled human/computer interfaces and virtual environments.
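
The third step, recovering the camera pose from the DLT matrix, is standard projective geometry: with P = K[R | t], an RQ decomposition of P's left 3x3 block separates the intrinsics from the rotation, and the camera center is the point that P projects to zero. A minimal sketch (not the paper's exact procedure):

```python
import numpy as np
from scipy.linalg import rq

def pose_from_dlt(P):
    """Recover (camera_center, R) from a 3x4 DLT projection matrix."""
    M = P[:, :3]
    K, R = rq(M)                       # M = K @ R, K upper-triangular
    S = np.diag(np.sign(np.diag(K)))   # force K's diagonal positive
    K, R = K @ S, S @ R                # S @ S = I keeps K @ R unchanged
    center = -np.linalg.inv(M) @ P[:, 3]  # camera position in world frame
    return center, R
```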


Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • 한국컴퓨터정보학회논문지
    • /
    • Vol. 21, No. 10
    • /
    • pp.11-19
    • /
    • 2016
  • The size of a display is large, The form becoming various of that do not apply to previous methods of gaze tracking and if setup gaze-track-camera above display, can solve the problem of size or height of display. However, This method can not use of infrared illumination information of reflected cornea using previous methods. In this paper, Robust pupil detecting method for eye's occlusion, corner point of inner eye and center of pupil, and using the face pose information proposes a method for calculating the simply position of the gaze. In the proposed method, capture the frame for gaze tracking that according to position of person transform camera mode of wide or narrow angle. If detect the face exist in field of view(FOV) in wide mode of camera, transform narrow mode of camera calculating position of face. The frame captured in narrow mode of camera include gaze direction information of person in long distance. The method for calculating the gaze direction consist of face pose estimation and gaze direction calculating step. Face pose estimation is estimated by mapping between feature point of detected face and 3D model. To calculate gaze direction the first, perform ellipse detect using splitting from iris edge information of pupil and if occlusion of pupil, estimate position of pupil with deformable template. Then using center of pupil and corner point of inner eye, face pose information calculate gaze position at display. In the experiment, proposed gaze tracking algorithm in this paper solve the constraints that form of a display, to calculate effectively gaze direction of person in the long distance using single camera, demonstrate in experiments by distance.