• Title/Summary/Keyword: Optical Pose Tracking


Fast Natural Feature Tracking Using Optical Flow (광류를 사용한 빠른 자연특징 추적)

  • Bae, Byung-Jo; Park, Jong-Seung
    • The KIPS Transactions:PartB / v.17B no.5 / pp.345-354 / 2010
  • Visual tracking techniques for augmented reality are classified as either marker tracking or natural feature tracking approaches. Marker-based tracking algorithms can be implemented efficiently enough to run in real time on mobile devices. Natural feature tracking methods, on the other hand, require many computationally expensive procedures: most previous methods run heavy feature extraction and pattern matching on every input frame, which makes it difficult to implement real-time augmented reality applications with natural feature tracking on low-performance devices. The computational cost is also proportional to the number of patterns to be matched. To speed up the natural feature tracking process, we propose a novel fast tracking method based on optical flow. We implemented the proposed method on mobile devices so that it runs in real time and can be used appropriately in mobile augmented reality applications. Moreover, during tracking we maintain the total number of feature points by inserting new feature points in proportion to the number of vanished feature points. Experimental results show that the proposed method reduces the computational cost and also stabilizes the camera pose estimation results.
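The replenishment step described in the abstract (inserting new feature points in proportion to those lost between frames) can be sketched as follows. This is a generic illustration, not the authors' code; `detect_fn` is a hypothetical hook standing in for whatever corner detector supplies fresh points:

```python
def replenish_features(tracked, status, detect_fn, target_count):
    """Keep the tracked set near target_count points: drop points whose
    optical-flow status flag is 0 (lost between frames), then request as
    many fresh detections as were lost."""
    survivors = [p for p, ok in zip(tracked, status) if ok]
    n_lost = target_count - len(survivors)
    if n_lost > 0:
        survivors.extend(detect_fn(n_lost))  # hypothetical detector hook
    return survivors
```

In a real pipeline the status flags would come from the optical-flow tracker itself, and the detector would be told to avoid regions already covered by surviving points.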

3D Facial Animation with Head Motion Estimation and Facial Expression Cloning (얼굴 모션 추정과 표정 복제에 의한 3차원 얼굴 애니메이션)

  • Kwon, Oh-Ryun; Chun, Jun-Chul
    • The KIPS Transactions:PartB / v.14B no.4 / pp.311-320 / 2007
  • This paper presents a vision-based 3D facial expression animation technique and system that provide robust 3D head pose estimation and real-time facial expression control. Much research on 3D face animation has focused on facial expression control itself rather than on 3D head motion tracking; however, head motion tracking is one of the critical issues to be solved for developing realistic facial animation. In this research, we developed an integrated animation system that performs 3D head motion tracking and facial expression control at the same time. The proposed system consists of three major phases: face detection, 3D head motion tracking, and facial expression control. For face detection, a non-parametric HT skin color model and template matching let us detect the facial region efficiently in each video frame. For 3D head motion tracking, we exploit a cylindrical head model that is projected onto the initial head motion template. Given an initial reference template of the face image and the corresponding head motion, the cylindrical head model is created and the full head motion is tracked based on the optical flow method. For facial expression cloning we utilize a feature-based method: the major facial feature points are detected from the geometric information of the face with template matching and traced by optical flow. Since the locations of the varying feature points combine head motion and facial expression information, the animation parameters that describe the variation of the facial features are acquired from a geometrically transformed frontal head pose image. Finally, the facial expression cloning is done by a two-step fitting process: the control points of the 3D model are varied by applying the animation parameters to the face model, and the non-feature points around the control points are deformed using Radial Basis Functions (RBF). The experiments show that the developed vision-based animation system can create realistic facial animation with robust head pose estimation and facial variation from the input video.
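The final fitting step, propagating control-point displacements to the surrounding non-feature vertices with RBFs, can be sketched with a Gaussian kernel. The kernel choice and the `sigma` width are assumptions for illustration; the paper does not fix them here:

```python
import numpy as np

def rbf_deform(control_pts, control_disp, query_pts, sigma=1.0):
    """Propagate control-point displacements to nearby non-feature
    vertices by RBF interpolation: solve for one weight vector per
    coordinate so the field reproduces the control displacements
    exactly, then evaluate it at the query vertices."""
    d = np.linalg.norm(control_pts[:, None] - control_pts[None], axis=-1)
    Phi = np.exp(-(d / sigma) ** 2)          # (C, C) Gaussian kernel (assumed)
    W = np.linalg.solve(Phi, control_disp)   # RBF weights, one column per axis
    dq = np.linalg.norm(query_pts[:, None] - control_pts[None], axis=-1)
    return query_pts + np.exp(-(dq / sigma) ** 2) @ W
```

Because the system is solved exactly, a query vertex coinciding with a control point moves by exactly that control point's displacement, while vertices in between receive smoothly blended motion.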

Realtime Markerless 3D Object Tracking for Augmented Reality (증강현실을 위한 실시간 마커리스 3차원 객체 추적)

  • Min, Jae-Hong; Islam, Mohammad Khairul; Paul, Anjan Kumar; Baek, Joong-Hwan
    • Journal of Advanced Navigation Technology / v.14 no.2 / pp.272-277 / 2010
  • AR (Augmented Reality) needs a medium between the real and virtual worlds, and recognition techniques are necessary to track an object continuously. Optical tracking using markers is widely used, but attaching markers to target objects takes time and is inconvenient, so many researchers are now developing markerless tracking techniques. In this paper, we extract features and 3D positions from 3D objects and propose real-time tracking based on these features and positions, rather than relying only on coplanar features and 2D positions. We extract features using SURF, obtain the rotation matrix and translation vector of the 3D object with POSIT from these features, and track the object in real time. If the extracted features are not sufficient and tracking fails, new features are extracted and re-matched to recover the tracking.
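The pose-recovery step can be illustrated with a minimal POSIT iteration (DeMenthon-Davis), the algorithm the abstract names. This is a generic sketch, not the authors' implementation, and it assumes image coordinates are in pixels centered on the principal point:

```python
import numpy as np

def posit(object_pts, image_pts, focal, iters=15):
    """Minimal POSIT: recover rotation R and translation T from >= 4
    non-coplanar 3D model points and their 2D projections, by iterating
    a scaled-orthographic (SOP) pose and refining the perspective
    correction terms eps."""
    M = object_pts - object_pts[0]           # vectors from the reference point
    B = np.linalg.pinv(M[1:])                # (3, N-1) pseudo-inverse
    eps = np.zeros(len(object_pts))          # perspective corrections
    for _ in range(iters):
        xs = image_pts[:, 0] * (1.0 + eps)
        ys = image_pts[:, 1] * (1.0 + eps)
        I = B @ (xs[1:] - xs[0])
        J = B @ (ys[1:] - ys[0])
        s = np.sqrt(np.linalg.norm(I) * np.linalg.norm(J))  # SOP scale
        i = I / np.linalg.norm(I)            # first two rotation rows
        j = J / np.linalg.norm(J)
        k = np.cross(i, j)                   # third row from orthogonality
        Z0 = focal / s                       # depth of the reference point
        eps = M @ k / Z0                     # updated correction terms
    R = np.vstack([i, j, k])
    T = np.array([image_pts[0, 0] * Z0 / focal,
                  image_pts[0, 1] * Z0 / focal, Z0])
    return R, T
```

On exact synthetic data this converges in a few iterations; in a real SURF-based pipeline the 2D-3D correspondences would come from descriptor matching, with outlier rejection before the pose step.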

Design of a Background Image Based Multi-Degree-of-Freedom Pointing Device (배경영상 기반 다자유도 포인팅 디바이스의 설계)

  • Jang, Suk-Yoon; Kho, Jae-Won
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.6 / pp.133-141 / 2008
  • As interactive multimedia has come into wide use, user interfaces such as remote controllers or classical computer mice show several limitations that cause inconvenience. We propose a vision-based pointing device to resolve this problem. We analyze the moving image from a camera embedded in the pointing device and estimate the movement of the device; the position of the cursor can then be determined from this result. To process in real time, we used a low-resolution 288×208 pixel camera, and the corner points of the screen were tracked using a local optical flow method. The distance from the screen to the device was calculated from the size of the screen in the image. The proposed device has a simple configuration, low cost, easy use, and intuitive handheld operation like a traditional mouse. Moreover, it shows reliable performance even in dark conditions.
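The range-from-apparent-size computation described above reduces to the pinhole relation Z = f·W/w. A minimal sketch, with illustrative numbers that are not from the paper:

```python
def screen_distance(focal_px, screen_width_m, screen_width_px):
    """Pinhole-model range estimate: a screen of known physical width
    screen_width_m that appears screen_width_px pixels wide in an image
    taken with focal length focal_px lies at Z = f * W / w."""
    return focal_px * screen_width_m / screen_width_px
```

For example, with an assumed 500-pixel focal length, a 0.6 m wide screen spanning 150 pixels would put the device about 2 m away; as the user moves closer the apparent width grows and the estimate shrinks proportionally.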

Implementation of a Helmet Azimuth Tracking System in the Vehicle (이동체 내의 헬멧 방위각 추적 시스템 구현)

  • Lee, Ji-Hoon; Chung, Hae
    • Journal of the Korea Institute of Information and Communication Engineering / v.24 no.4 / pp.529-535 / 2020
  • It is important to secure the driver's external field of view in armored vehicles surrounded by iron armor in preparation for enemy fire. For this purpose, a 360-degree rotatable surveillance camera is mounted on the vehicle, and the key is to recognize the head direction of the driver wearing a helmet so that the external camera rotates in exactly the same direction. In this paper, we introduce a method that uses a MEMS-based AHRS sensor and an illuminance sensor to compensate for the disadvantages of the existing optical method, and implement it at low cost. The key idea is to set the direction of the camera using the difference between the Euler angles detected by the two sensors mounted on the camera and the helmet, and to adjust the direction with the illuminance sensor from time to time to remove the drift error of the sensors. The implemented prototype shows that the camera's direction matches the driver's exactly.
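The Euler-angle-difference idea can be sketched as a yaw comparison between the two AHRS sensors; the wrap-to-(-180°, 180°] convention is an assumption, chosen so the camera always takes the shorter rotation:

```python
def relative_yaw(helmet_yaw_deg, camera_yaw_deg):
    """Signed azimuth the external camera must rotate through to match
    the helmet's heading, wrapped to (-180, 180] degrees so the shorter
    direction of rotation is chosen."""
    diff = (helmet_yaw_deg - camera_yaw_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff
```

In the described system this difference would drive the camera servo each cycle, while the periodic illuminance-sensor check re-zeros the two headings to cancel accumulated gyro drift.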