• Title/Summary/Keyword: Inertial Pose Tracking


Pose Tracking of Moving Sensor using Monocular Camera and IMU Sensor

  • Jung, Sukwoo; Park, Seho; Lee, KyungTaek
    • KSII Transactions on Internet and Information Systems (TIIS), v.15 no.8, pp.3011-3024, 2021
  • Pose estimation of a moving sensor is an important issue in many applications such as robotics, navigation, tracking, and Augmented Reality. This paper proposes a visual-inertial integration system suited to dynamically moving conditions of the sensor. The orientation estimated from an Inertial Measurement Unit (IMU) is used, together with the intrinsic parameters of the camera, to calculate the essential matrix. Using epipolar geometry, outliers among the feature-point matches are eliminated across the image sequence; the IMU orientation thus helps eliminate erroneous point matches at an early stage in images of dynamic scenes. The remaining matches are then used to calculate a precise fundamental matrix, from which the pose of the sensor is finally estimated. The proposed procedure was implemented and tested against existing methods, and experimental results demonstrate its effectiveness.
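
The epipolar outlier test the abstract describes can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation: the relative rotation stands in for the IMU-derived orientation, the translation and the 1e-6 residual threshold are made-up values for the synthetic example, and points are given in normalized image coordinates (intrinsics already applied).

```python
import numpy as np

def skew(t):
    # cross-product matrix [t]_x so that skew(t) @ v == np.cross(t, v)
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_residual(E, x1, x2):
    # |x2^T E x1| for homogeneous normalized points; ~0 for a true match
    return abs(x2 @ E @ x1)

# synthetic two-view geometry: rotation (as if from the IMU) plus a baseline
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.2, 0.0, 0.0])
E = skew(t) @ R                      # essential matrix E = [t]_x R

# project 3D points into both views to create true correspondences
pts = np.array([[0.1, -0.2, 2.0], [0.3, 0.1, 3.0], [-0.2, 0.2, 2.5]])
matches = []
for X in pts:
    x1 = X / X[2]                    # first view (normalized, homogeneous)
    X2 = R @ X + t
    x2 = X2 / X2[2]                  # second view
    matches.append((x1, x2))

# a deliberately mismatched pair violates the epipolar constraint
bad = (matches[0][0], matches[1][1])

residuals = [epipolar_residual(E, x1, x2) for x1, x2 in matches]
inliers = [r < 1e-6 for r in residuals]
```

True matches produce residuals near zero, while the mismatched pair does not, which is how IMU-derived epipolar geometry can screen feature matches before the fundamental-matrix refinement.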

Hybrid Inertial and Vision-Based Tracking for VR applications (가상 현실 어플리케이션을 위한 관성과 시각기반 하이브리드 트래킹)

  • Gu, Jae-Pil; An, Sang-Cheol; Kim, Hyeong-Gon; Kim, Ik-Jae; Gu, Yeol-Hoe
    • Proceedings of the KIEE Conference, 2003.11b, pp.103-106, 2003
  • In this paper, we present a hybrid inertial and vision-based tracking system for VR applications. One of the most important aspects of VR (Virtual Reality) is providing a correspondence between the physical and virtual worlds; as a result, accurate, real-time tracking of an object's position and orientation is a prerequisite for many applications in virtual environments. Pure vision-based tracking has low jitter and high accuracy but cannot guarantee real-time pose recovery under all circumstances. Pure inertial tracking has high update rates and full 6DOF recovery but lacks long-term stability due to sensor noise. To overcome these individual drawbacks and build a better tracking system, we introduce the fusion of vision-based and inertial tracking. Sensor fusion makes the proposed tracking system robust, fast, and accurate, with low jitter and noise. Hybrid tracking is implemented with a Kalman filter that operates in a predictor-corrector manner. A Bluetooth serial communication module gives the system full mobility and makes it affordable, lightweight, energy-efficient, and practical. Full 6DOF recovery and the full mobility of the proposed system enable the user to interact with mobile devices such as PDAs through a natural interface.
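
The predictor-corrector structure mentioned above can be sketched in one dimension: high-rate inertial data drives the prediction step, and lower-rate vision measurements drive the correction. This is only an illustrative sketch, not the paper's filter; the noise values q and r, the gyro bias, and the 10:1 rate ratio are assumptions for the demo, and the real system fuses full 6DOF states.

```python
class HybridKF:
    """1-D predictor-corrector Kalman filter: gyro rate predicts, vision corrects."""
    def __init__(self, q=1e-4, r=1e-2):
        self.x = 0.0   # angle estimate (rad)
        self.p = 1.0   # estimate variance
        self.q = q     # process noise (inertial drift per step)
        self.r = r     # measurement noise (vision jitter)

    def predict(self, gyro_rate, dt):
        # high-rate inertial prediction: integrate the angular rate
        self.x += gyro_rate * dt
        self.p += self.q

    def correct(self, vision_angle):
        # lower-rate vision correction: standard Kalman update
        k = self.p / (self.p + self.r)
        self.x += k * (vision_angle - self.x)
        self.p *= 1.0 - k
        return self.x

kf = HybridKF()
true_angle = 0.0
for step in range(100):
    true_angle += 0.01                           # true motion per step (rad)
    kf.predict(gyro_rate=0.01 + 0.002, dt=1.0)   # biased gyro would drift alone
    if step % 10 == 9:                           # vision arrives at 1/10 the rate
        kf.correct(true_angle)
```

Dead reckoning on the biased gyro alone would end 0.2 rad off after 100 steps; the periodic vision corrections keep the fused estimate bounded, which is the point of the predictor-corrector fusion.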


Design and Implementation of Real-Time Helmet Pose Tracking System (실시간 헬멧자세 추적시스템의 설계 및 구현)

  • Hwang, Sang-Hyun; Chung, Chul-Ju; Kim, Dong-Sung
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.44 no.2, pp.123-130, 2016
  • This paper describes the design and implementation of an HTS (Helmet Tracking System) that keeps the LOS (Line of Sight) consistent between the aircraft and the HMD (Helmet Mounted Display), which presents flight and mission information on the pilot's helmet. The functionality and performance of an HMD system depend on the performance of the helmet tracking system. The HTS design targets real-time performance and reliability, achieved by predicting non-periodic latency, together with high tracking accuracy. To demonstrate the viability of the proposed approach, a robust hybrid scheme fusing optical and inertial tracking was tested on an implemented test-bed. Experimental results show real-time, reliable tracking control despite external errors.

Kalman Filter Based Pose Data Fusion with Optical Tracking System and Inertial Navigation System Networks for Image Guided Surgery (영상유도수술을 위한 광학추적 센서 및 관성항법 센서 네트웍의 칼만필터 기반 자세정보 융합)

  • Oh, Hyun Min;Kim, Min Young
    • The Transactions of The Korean Institute of Electrical Engineers, v.66 no.1, pp.121-126, 2017
  • A tracking system is essential for Image Guided Surgery (IGS). The Optical Tracking System (OTS) is widely used in IGS for its high accuracy and ease of use; however, an OTS fails when its markers are occluded. In this paper, sensor data fusion of an OTS and an Inertial Navigation System (INS) is proposed to solve this problem. The proposed system improves tracking accuracy by suppressing the Gaussian error of each sensor, and it compensates for the individual disadvantages of the OTS and the IMU through Kalman-filter-based sensor fusion. A sensor calibration method that further improves accuracy is also introduced. Experiments verify the effectiveness of the proposed algorithm.
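
The occlusion-bridging behavior described above can be sketched in one dimension: INS velocity drives the prediction at every step, and OTS position corrections are applied only when the marker is visible. This is an illustrative sketch, not the paper's filter; the 1-D model, the noise values q and r, the INS velocity bias, and the 20-step occlusion window are all assumptions for the demo.

```python
def fuse(ots_meas, ins_vel, dt=0.01, q=1e-5, r=1e-4):
    """1-D Kalman fusion: INS velocity drives prediction; OTS position corrects
    when the marker is visible (ots_meas entries are None during occlusion)."""
    x, p = 0.0, 1.0
    out = []
    for z, v in zip(ots_meas, ins_vel):
        x += v * dt          # INS-driven prediction (runs through occlusion)
        p += q
        if z is not None:    # OTS correction only when the marker is visible
            k = p / (p + r)
            x += k * (z - x)
            p *= 1.0 - k
        out.append(x)
    return out

# synthetic run: constant-velocity target, biased INS, marker lost for 20 steps
dt, steps = 0.01, 100
true_pos = [0.5 * dt * (i + 1) for i in range(steps)]
ins_vel = [0.5 + 0.05] * steps                   # INS velocity with a bias
ots = [None if 40 <= i < 60 else true_pos[i]     # occlusion at steps 40..59
       for i in range(steps)]
est = fuse(ots, ins_vel, dt=dt)
```

During the occlusion the estimate drifts only by the integrated INS bias instead of being lost outright, and the first OTS measurement after reacquisition pulls it back, which is the complementary behavior the fusion is designed for.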

Towards 3D Modeling of Buildings using Mobile Augmented Reality and Aerial Photographs (모바일 증강 현실 및 항공사진을 이용한 건물의 3차원 모델링)

  • Kim, Se-Hwan; Ventura, Jonathan; Chang, Jae-Sik; Lee, Tae-Hee; Hollerer, Tobias
    • Journal of the Institute of Electronics Engineers of Korea CI, v.46 no.2, pp.84-91, 2009
  • This paper presents an online partial 3D modeling methodology that uses a mobile augmented reality system and aerial photographs, and a tracking methodology that compares the 3D model with a video image. Instead of relying on models created in advance, the system generates a 3D model of a real building on the fly by combining frontal and aerial views. A user's initial pose is estimated using an aerial photograph, retrieved from a database according to the user's GPS coordinates, and an inertial sensor that measures pitch. We detect the edges of the rooftop using graph cut, and find the edges and a corner of the bottom by minimizing a proposed cost function. To track the user's position and orientation in real time, feature-based tracking is carried out on salient points along the edges and sides of the building the user keeps in view. We implemented camera pose estimators using both a least-squares estimator and an unscented Kalman filter (UKF), evaluated the speed and accuracy of both approaches, and demonstrated the usefulness of these computations as building blocks for an Anywhere Augmentation scenario.
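
The paper's full estimators (least squares and a UKF over the 6-DoF pose) are more involved than can be shown here. As a minimal illustration of the least-squares flavor, if the rotation is assumed known (e.g. from the inertial/aerial-photo initialization), the camera translation can be recovered by linear least squares from 3D-2D correspondences in normalized coordinates; the rotation, points, and translation below are synthetic values for the demo.

```python
import numpy as np

def estimate_translation(R, pts3d, pts2d):
    """Linear least squares for camera translation t with known rotation R.
    For normalized projection u = (R X + t)_x / (R X + t)_z (and similarly v),
    each point yields two equations linear in t:
        t_x - u * t_z = u * (R X)_z - (R X)_x
        t_y - v * t_z = v * (R X)_z - (R X)_y
    """
    rows, rhs = [], []
    for X, (u, v) in zip(pts3d, pts2d):
        rX = R @ X
        rows.append([1.0, 0.0, -u]); rhs.append(u * rX[2] - rX[0])
        rows.append([0.0, 1.0, -v]); rhs.append(v * rX[2] - rX[1])
    t, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return t

# synthetic check: project points with a known pose, then recover t
R = np.eye(3)
t_true = np.array([0.1, -0.05, 0.2])
pts3d = np.array([[0.0, 0.0, 2.0], [0.5, -0.3, 3.0],
                  [-0.4, 0.2, 2.5], [0.2, 0.4, 4.0]])
pts2d = []
for X in pts3d:
    Xc = R @ X + t_true
    pts2d.append((Xc[0] / Xc[2], Xc[1] / Xc[2]))
t_est = estimate_translation(R, pts3d, pts2d)
```

With four points the system is overdetermined (eight equations, three unknowns), so `lstsq` also averages out measurement noise when the projections are not exact.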