• Title/Abstract/Keyword: Head-Tracker

Search results: 44 items (processing time: 0.025 s)

STEREOSCOPIC EYE-TRACKING SYSTEM BASED ON A MOVING PARALLAX BARRIER

  • Chae, Ho-Byung; Lee, Gang-Sung; Lee, Seung-Hyun
    • 한국방송∙미디어공학회:학술대회논문집 / 한국방송공학회 2009 IWAIT / pp.189-192 / 2009
  • We present a novel head-tracking system for stereoscopic displays that allows the viewer a high degree of movement. The tracker can segment the viewer from background objects using their relative distance, and a depth camera is used to generate a key signal for the head-tracking application. A moving parallax barrier method is also introduced to overcome a disadvantage of the fixed parallax barrier, which permits observation only at specific locations.

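The abstract above describes generating a key signal by segmenting the viewer from the background with a depth camera. The sketch below is a minimal illustration of that idea, assuming a millimetre-scale depth frame and a fixed distance threshold; the threshold, the head-band height, and the function names are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of depth-based viewer segmentation for a head-tracking key signal.
# The distance threshold and head-band height are illustrative assumptions.
from typing import Optional, Tuple
import numpy as np

def viewer_key_signal(depth_mm: np.ndarray, max_viewer_depth_mm: float = 1500.0) -> np.ndarray:
    """Binary key mask: 1 where a pixel is valid and closer than the assumed viewer distance."""
    valid = depth_mm > 0                          # many depth cameras report 0 for invalid pixels
    return np.logical_and(valid, depth_mm < max_viewer_depth_mm).astype(np.uint8)

def head_position(mask: np.ndarray, head_rows: int = 40) -> Optional[Tuple[float, float]]:
    """Rough head position: centroid of the topmost rows of the key mask."""
    rows = np.where(mask.any(axis=1))[0]
    if rows.size == 0:
        return None                               # no viewer within range
    top = rows[0]
    ys, xs = np.nonzero(mask[top:top + head_rows])
    return float(xs.mean()), float(top + ys.mean())
```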

Real Time Eye and Gaze Tracking

  • 조현섭; 민진경
    • 한국산학기술학회:학술대회논문집 / 한국산학기술학회 2004 Fall Conference / pp.234-239 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

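A GRNN of the kind referred to above is essentially Nadaraya-Watson kernel regression over the calibration samples. The sketch below shows such a mapping from a pupil-parameter vector to screen coordinates; the bandwidth parameter, feature layout, and class name are assumptions rather than details from the paper.

```python
# Sketch of a GRNN (Nadaraya-Watson kernel regression) mapping pupil parameters
# to screen coordinates. The bandwidth sigma and feature layout are assumptions.
import numpy as np

class GRNNGazeMapper:
    def __init__(self, sigma: float = 0.5):
        self.sigma = sigma
        self.features = None      # (N, D) pupil-parameter vectors from training sessions
        self.targets = None       # (N, 2) corresponding screen coordinates

    def fit(self, features, targets) -> None:
        self.features = np.asarray(features, dtype=float)
        self.targets = np.asarray(targets, dtype=float)

    def predict(self, x) -> np.ndarray:
        """Screen point as a Gaussian-weighted average of the training targets."""
        d2 = np.sum((self.features - np.asarray(x, dtype=float)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))
        return (w @ self.targets) / (w.sum() + 1e-12)
```

Because the prediction is a weighted average over samples that can be pooled from several users, the mapping can also be applied to a viewer who was not in the training set, which is the generalization property the abstract highlights.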

Implementation of Virtual Reality Engine Using Patriot Tracking Device

  • 김은주; 이용욱; 송창근
    • 한국정보처리학회:학술대회논문집 / 한국정보처리학회 2006 Spring Conference / pp.143-146 / 2006
  • This study designs and implements a low-cost virtual reality game engine that can be installed on a personal PC. In building a virtual reality engine, attaching the main input/output devices is essential: a tracker, an HMD (Head Mounted Display), a joystick, and a mouse. We designed input/output classes for interfacing with the virtual reality engine, attached a mouse and a joystick as input devices and an HMD as the output device, and implemented the tracker using a commercial product, the Polhemus Patriot tracker.

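The abstract above mentions input/output classes that connect a mouse, a joystick, an HMD, and the Polhemus Patriot tracker to the engine. The sketch below shows one way such an abstraction could look; all class and method names, and the serial-port detail, are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of the device abstraction the abstract describes.
# Class names, methods, and the serial-port detail are illustrative only.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Pose:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0

class InputDevice(ABC):
    @abstractmethod
    def poll(self) -> dict:
        """Return the latest device state."""

class PatriotTracker(InputDevice):
    def __init__(self, port: str = "COM1"):       # serial port is an assumption
        self.port = port

    def poll(self) -> dict:
        # A real driver would read and parse a Patriot position/orientation record here.
        return {"pose": Pose()}

class VREngine:
    def __init__(self, tracker: InputDevice, inputs: list):
        self.tracker = tracker                    # head tracker driving the HMD viewpoint
        self.inputs = inputs                      # e.g. mouse and joystick wrappers

    def update(self) -> None:
        head_pose = self.tracker.poll()["pose"]   # use the tracked pose as the camera pose
        states = [dev.poll() for dev in self.inputs]
        # ...render the scene for the HMD from head_pose and apply the input states...
```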

Real Time Eye and Gaze Tracking

  • Park Ho Sik; Nam Kee Hwan; Cho Hyeon Seob; Ra Sang Dong; Bae Cheol Soo
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 2004 Conference Proceedings / pp.857-861 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

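Active-IR gaze trackers like the one described above often localize the pupil by differencing a frame lit on the camera axis (bright pupil) against one lit off-axis (dark pupil). The abstract does not spell out this step, so the sketch below should be read as a common-practice assumption rather than the authors' method.

```python
# Sketch of pupil localization from two active-IR frames using the common
# bright-pupil / dark-pupil difference scheme (an assumption; the abstract only
# states that active IR illumination is used).
from typing import Optional, Tuple
import numpy as np

def pupil_center(bright: np.ndarray, dark: np.ndarray, thresh: int = 40) -> Optional[Tuple[float, float]]:
    """Difference the two frames and return the centroid of the bright-pupil response."""
    diff = bright.astype(np.int16) - dark.astype(np.int16)
    mask = diff > thresh                          # the retro-reflecting pupil dominates the difference
    if not mask.any():
        return None                               # pupil not found (e.g. during a blink)
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())     # (x, y) pupil centre in pixels
```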

Development of Real-Time Vision-based Eye-tracker System for Head Mounted Display

  • 노은정; 홍진성; 방효충
    • 한국항공우주학회지 / Vol. 35, No. 6 / pp.539-547 / 2007
  • This paper describes the development of a real-time eye-tracker system that uses image information to track the user's gaze point from eye movement. The developed system tracks the user's eye movement with an optics-based pupil-tracking technique, which has the advantage of measuring the eye position very accurately without placing any obstruction on the user's eyes. An infrared camera is used to acquire pupil images, and infrared LEDs are used to extract an accurate pupil region from the acquired images. To make real-time image processing possible, a pupil-tracking algorithm based on a Kalman filter is developed, and a DSP (Digital Signal Processing) system is used to acquire the pupil images. The real-time eye-tracker system tracks the user's pupil movement in real time and marks the user's gaze point on the background image the user is viewing.
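
The abstract mentions a Kalman-filter-based pupil-tracking algorithm for real-time processing. A minimal constant-velocity Kalman filter over the pupil centre is sketched below; the state layout and noise covariances are assumptions, not values from the paper.

```python
# Minimal constant-velocity Kalman filter over the pupil centre (x, y).
# State layout and noise covariances are assumptions, not values from the paper.
import numpy as np

class PupilKalman:
    def __init__(self, dt: float = 1 / 30):
        self.x = np.zeros(4)                         # state: [px, py, vx, vy]
        self.P = np.eye(4) * 1e3                     # large initial uncertainty
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 1e-2                    # process noise
        self.R = np.eye(2) * 2.0                     # measurement noise (pixels^2)

    def step(self, z):
        # Predict with the constant-velocity model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        if z is not None:                            # update only when a pupil was detected
            y = np.asarray(z, dtype=float) - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                            # filtered pupil centre
```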

A Time-multiplexed 3D Display Using Steered Exit Pupils

  • Brar, Rajwinder Singh; Surman, Phil; Sexton, Ian; Hopf, Klaus
    • Journal of Information Display / Vol. 11, No. 2 / pp.76-83 / 2010
  • This paper presents a multi-user autostereoscopic 3D display system constructed and operated by the authors using the time-multiplexing approach. This prototype has three main advantages over the previous versions developed by the authors: its hardware is simplified, as only one optical array is used to create viewing regions in space; a lenticular multiplexing screen is not necessary, as images can be produced sequentially on a fast 120 Hz LCD at full resolution; and the holographic projector was replaced with a high-frame-rate digital micromirror device (DMD) projector. The whole system in this prototype consists of four major parts: a 120 Hz high-frame-rate DMD projector, a 49-element optical array, a 120 Hz screen assembly, and a multi-user head tracker. The display images for the left and right eyes are produced alternately on a 120 Hz direct-view LCD and are synchronized with the output of the projector, which acts as the backlight of the LCD. The novel steering optics, controlled by the multi-user head tracker system, directs the projector output to regions referred to as exit pupils, which are located at the viewers' eyes. The display can be built in a "hang-on-the-wall" form.
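
The steering optics above direct the projector output toward exit pupils at the positions reported by the head tracker. The toy sketch below maps a tracked eye position to one element of the 49-element optical array; the angular pitch and the mapping itself are assumptions, since the paper's optical design is not reproduced here.

```python
# Toy sketch: map a tracked eye position to one element of a 49-element optical
# array. The angular pitch and the mapping are assumptions, not the paper's design.
import math

ARRAY_ELEMENTS = 49
ELEMENT_PITCH_DEG = 1.0          # assumed angular spacing between adjacent exit pupils

def element_for_eye(eye_x_mm: float, eye_z_mm: float) -> int:
    """Pick the array element whose exit pupil lies closest to the tracked eye direction."""
    angle_deg = math.degrees(math.atan2(eye_x_mm, eye_z_mm))   # lateral angle from the screen normal
    index = round(angle_deg / ELEMENT_PITCH_DEG) + ARRAY_ELEMENTS // 2
    return max(0, min(ARRAY_ELEMENTS - 1, index))              # clamp to the physical array
```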

Improvement of Upload Traffic through Negotiation in UCC Broadcasting System

  • 김지훈
    • 디지털산업정보학회논문지 / Vol. 10, No. 3 / pp.171-179 / 2014
  • Among P2P-based multimedia streaming architectures, the multiple-chain architecture has the advantage of adapting simply and rapidly to a dynamically changing network topology, so this architecture is used for the UCC broadcasting system. In a UCC broadcasting system, an ordinary peer attached to a DSLAM becomes the UCC server, unlike a broadcasting system that transfers data from ISP servers. The UCC data generated by UCC server peers is therefore transmitted to other peers through the DSLAM, and this transmission consumes the DSLAM's uplink bandwidth. In this paper, I propose an efficient method for managing DSLAM uplink bandwidth through negotiation between the tracker and the UCC server peers or the head peers of the DSLAM: the tracker restricts the uplink stream bitrate of UCC servers when the used uplink bandwidth of the DSLAM exceeds a certain fraction of the maximum uplink bandwidth. Numerical analysis and simulation show that the proposed scheme outperforms the conventional method with respect to DSLAM uplink bandwidth.
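
The proposed negotiation lets the tracker restrict a UCC server peer's uplink bitrate once the DSLAM's used uplink bandwidth passes a set point. The sketch below is one simple way to express such a rule; the 80% threshold, the headroom-based cap, and the function name are assumptions, not the paper's exact policy.

```python
# Sketch of a tracker-side negotiation rule: cap a UCC server peer's upstream
# bitrate once DSLAM uplink usage crosses a threshold. The proportional policy
# and names are assumptions; the paper's exact rule may differ.

def negotiated_bitrate(requested_kbps: float,
                       used_uplink_kbps: float,
                       max_uplink_kbps: float,
                       threshold_ratio: float = 0.8) -> float:
    """Return the bitrate the tracker grants to a UCC server peer."""
    threshold = threshold_ratio * max_uplink_kbps
    if used_uplink_kbps <= threshold:
        return requested_kbps                      # the uplink has headroom; grant the request
    headroom = max(max_uplink_kbps - used_uplink_kbps, 0.0)
    return min(requested_kbps, headroom)           # restrict the stream to the remaining capacity
```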

Motion and Structure Estimation Using Fusion of Inertial and Vision Data for Helmet Tracker

  • Heo, Se-Jong; Shin, Ok-Shik; Park, Chan-Gook
    • International Journal of Aeronautical and Space Sciences / Vol. 11, No. 1 / pp.31-40 / 2010
  • For weapon cueing and Head-Mounted Displays (HMD), it is essential to continuously estimate the motion of the helmet. The problem of estimating and predicting the position and orientation of the helmet is approached by fusing measurements from inertial sensors and a stereo vision system. The sensor fusion approach in this paper is based on nonlinear filtering, specifically the extended Kalman filter (EKF). To reduce the computation time and improve the performance of the vision processing, structure estimation is separated from motion estimation: the structure estimation tracks features belonging to the helmet model structure in the scene, while the motion estimation filter estimates the position and orientation of the helmet. The algorithm is tested using both synthetic and real data, and the results show that the sensor fusion is successful.
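
The fusion filter referred to above is an extended Kalman filter that is propagated with inertial data and corrected with vision measurements. The skeleton below shows that predict/update structure in generic form; the state dimension and the process and measurement models are placeholders, not the paper's actual models.

```python
# Skeleton of the inertial/vision fusion loop: predict the helmet state with IMU
# data at a high rate, correct it with vision measurements when they arrive.
# The state vector, models, and Jacobians are simplified placeholders.
import numpy as np

class HelmetEKF:
    def __init__(self, n_states: int = 9):          # e.g. position, velocity, orientation errors
        self.x = np.zeros(n_states)
        self.P = np.eye(n_states)

    def predict(self, f, F, Q):
        """Propagate with the (nonlinear) IMU process model f and its Jacobian F."""
        self.x = f(self.x)
        self.P = F @ self.P @ F.T + Q

    def update(self, z, h, H, R):
        """Correct with a vision measurement z, measurement model h, and Jacobian H."""
        y = z - h(self.x)
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P
```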

Real Time Eye and Gaze Tracking

  • 조현섭; 김희숙
    • 한국산학기술학회논문지 / Vol. 6, No. 2 / pp.195-201 / 2005
  • This paper proposes a new real-time gaze tracking method. Existing gaze tracking methods can produce incorrect results when the user moves the head even slightly, and they require a calibration procedure for each user. The proposed method therefore uses infrared illumination and Generalized Regression Neural Networks (GRNN) to achieve robust and accurate gaze tracking, without calibration, even under large head movement. With GRNN the mapping works smoothly, head movement is properly reflected in the gaze mapping function so that tracking remains possible while the face is moving, and generalizing the mapping function removes the need for individual calibration, allowing gaze tracking for users who did not take part in the training. Experimental results show an average gaze tracking accuracy of 90% under facial movement and 85% for other users.
