• Title/Summary/Keyword: Panoramic Navigation

The Design of Spatial-Temporal Prediction Filter for saving resources on the view navigation of a panoramic video service (파노라마 영상에서 효율적인 시점탐색을 위한 시공간 비디오 스트림 예측 필터 설계 방법에 관한 연구)

  • Seok, Joo-Myoung;Cho, Yong-Woo
    • Journal of Advanced Navigation Technology
    • /
    • v.17 no.6
    • /
    • pp.757-764
    • /
    • 2013
  • A panoramic video creates a sense of immersion by covering a field of view (FOV) wider than the human visual angle, but limited viewing environments and bandwidth make it difficult to display the whole panorama at once, so an interactive viewing method is needed in which the user selects a target view point from the many available view points. When a user navigates the view to select a view point, resources such as bandwidth are wasted on transmitting video data for unnecessary view points. This paper therefore proposes a spatial-temporal prediction filter (STPF), based on the direction and velocity of the view navigation, that transmits only the necessary video data. Simulation results show that STPF achieves bitrate savings of 6% to 37% over conventional methods in interactive panoramic video streaming services that require high bandwidth.
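
The entry above describes predicting which portion of the panorama to transmit from the direction and velocity of the user's view navigation. The sketch below is a minimal illustration of that idea under a constant-angular-velocity assumption; the tile layout, function name, and parameters are illustrative and not taken from the paper.

```python
import math

def predict_view_tiles(cur_center_deg, velocity_deg_per_s, lookahead_s,
                       viewport_deg=90.0, tile_deg=30.0, pano_deg=360.0):
    """Predict which panorama tiles to request, assuming the view keeps
    moving at its current angular velocity (constant-velocity model)."""
    # Extrapolate the viewport center one lookahead interval ahead.
    predicted_center = (cur_center_deg + velocity_deg_per_s * lookahead_s) % pano_deg

    # Request every tile overlapped by the predicted viewport.
    half = viewport_deg / 2.0
    first = math.floor((predicted_center - half) / tile_deg)
    last = math.floor((predicted_center + half) / tile_deg)
    n_tiles = int(pano_deg / tile_deg)
    return sorted({i % n_tiles for i in range(first, last + 1)})

# Example: viewport centered at 10 deg, panning right at 45 deg/s,
# predicting 0.5 s ahead -> tiles around 32.5 deg are requested.
print(predict_view_tiles(10.0, 45.0, 0.5))   # -> [0, 1, 2, 11]
```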

City-Scale Modeling for Street Navigation

  • Huang, Fay;Klette, Reinhard
    • Journal of information and communication convergence engineering
    • /
    • v.10 no.4
    • /
    • pp.411-419
    • /
    • 2012
  • This paper proposes a semi-automatic image-based approach for 3-dimensional (3D) modeling of buildings along streets. Image-based urban 3D modeling techniques are typically based on the use of aerial and ground-level images. The aerial image of the relevant area is extracted from publicly available sources in Google Maps by stitching together different patches of the map. Panoramic images are common for ground-level recording because they have advantages for 3D modeling. A panoramic video recorder is used in the proposed approach for recording sequences of ground-level spherical panoramic images. The proposed approach has two advantages. First, detected camera trajectories are more accurate and stable (compared to methods using multi-view planar images only) due to the use of spherical panoramic images. Second, we extract the texture of a facade of a building from a single panoramic image. Thus, there is no need to deal with color blending problems that typically occur when using overlapping textures.
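
The second advantage above, extracting a facade texture from a single panoramic image, amounts to back-projecting a planar facade onto the spherical panorama. The sketch below assumes an equirectangular panorama with the camera at the origin and a facade given by its lower-left corner and two edge directions; the names and nearest-neighbour sampling are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def extract_facade_texture(pano, p0, right, up, width_m, height_m,
                           out_w=512, out_h=512):
    """Sample a planar facade texture from one equirectangular panorama.

    pano   : HxWx3 spherical (equirectangular) image, camera at the origin
    p0     : 3D position of the facade's lower-left corner (camera frame)
    right  : unit vector along the facade's horizontal edge
    up     : unit vector along the facade's vertical edge
    """
    p0, right, up = (np.asarray(a, dtype=float) for a in (p0, right, up))
    H, W = pano.shape[:2]

    # 3D point on the facade plane for every output texel.
    u = np.linspace(0, width_m, out_w)
    v = np.linspace(height_m, 0, out_h)          # top row = top of facade
    uu, vv = np.meshgrid(u, v)
    pts = (p0[None, None, :] + uu[..., None] * right[None, None, :]
           + vv[..., None] * up[None, None, :])

    # Direction from the camera to each point -> spherical angles.
    d = pts / np.linalg.norm(pts, axis=-1, keepdims=True)
    lon = np.arctan2(d[..., 0], d[..., 2])       # azimuth in (-pi, pi]
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))   # elevation

    # Spherical angles -> equirectangular pixels (nearest neighbour).
    px = ((lon / (2 * np.pi) + 0.5) * (W - 1)).round().astype(int)
    py = ((0.5 - lat / np.pi) * (H - 1)).round().astype(int)
    return pano[py.clip(0, H - 1), px.clip(0, W - 1)]
```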

Development of 3D Car Navigation System Using Image-based Virtual Environment (실사기반 가상환경기술을 이용한 차량용 3차원 네비게이션 시스템 개발)

  • Kim Chang-Hyun;Lee Wan-Bok
    • Journal of Game and Entertainment
    • /
    • v.2 no.1
    • /
    • pp.35-44
    • /
    • 2006
  • The objective of this study is to develop a 3D car navigation system that shows the driving direction to a destination through real-time 3D panoramic views of the route. For this purpose, a new search process was established to find the optimal driving direction based on the driver's current location and the real-time traffic situation, and the TIP (tour into the picture) method was extended to build a wide virtual environment. The virtual environment was constructed by applying the extended TIP method to panoramic images taken at constant distances along a real road, and the resulting 3D navigation views are rendered nearly as clearly as the real images. The developed car navigation system provides the optimal driving direction and real-time traffic information through both a 2D navigation module and a 3D navigation module.
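
The abstract above mentions a search process that combines the driver's current location with the real-time traffic situation. One common way to realize such a search is a shortest-path query whose edge costs are current travel times; the sketch below uses Dijkstra's algorithm with a traffic-aware cost callback and is an assumption about the general approach, not the paper's actual algorithm.

```python
import heapq

def optimal_route(graph, travel_time, start, goal):
    """Find the minimum-travel-time route on a road graph.

    graph[u]          -> iterable of neighbouring nodes v
    travel_time(u, v) -> current traversal time of edge (u, v),
                         e.g. length / current traffic speed
    Assumes `goal` is reachable from `start`.
    """
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, u = heapq.heappop(queue)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v in graph[u]:
            nd = d + travel_time(u, v)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(queue, (nd, v))
    # Reconstruct the node sequence from goal back to start.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path))
```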

A Hybrid Positioning System for Indoor Navigation on Mobile Phones using Panoramic Images

  • Nguyen, Van Vinh;Lee, Jong-Weon
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.6 no.3
    • /
    • pp.835-854
    • /
    • 2012
  • In this paper, we propose a novel positioning system for indoor navigation which helps a user navigate easily to desired destinations in an unfamiliar indoor environment using his mobile phone. The system requires only the user's mobile phone with its basic built-in sensors, such as a camera and a compass. The system tracks the user's positions and orientations using a vision-based approach that utilizes 360° panoramic images captured in the environment. To improve the robustness of the vision-based method, we exploit the digital compass that is widely installed on modern mobile phones. This hybrid solution outperforms existing mobile phone positioning methods by reducing the error of position estimation to around 0.7 meters. In addition, to enable the proposed system to work independently on the mobile phone without additional hardware or external infrastructure, we employ a modified version of a fast and robust feature matching scheme using Histogrammed Intensity Patches. The experiments show that the proposed positioning system achieves good performance while running on a mobile phone, with a response time of around 1 second.
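
The hybrid positioning above combines a vision-based estimate obtained from the 360° panoramic images with the phone's digital compass. The abstract does not specify the fusion rule, so the sketch below shows only one simple possibility: blending the two heading estimates on the circle with a fixed weight. The names and weighting are assumptions, not the paper's method.

```python
import math

def fuse_headings(vision_heading_deg, compass_heading_deg, alpha=0.7):
    """Blend a vision-based heading with a digital-compass heading.

    alpha weights the vision estimate; the blend is done on the circle
    so that e.g. 350 deg and 10 deg fuse to roughly 0 deg, not 180 deg.
    """
    v = math.radians(vision_heading_deg)
    c = math.radians(compass_heading_deg)
    x = alpha * math.cos(v) + (1.0 - alpha) * math.cos(c)
    y = alpha * math.sin(v) + (1.0 - alpha) * math.sin(c)
    return math.degrees(math.atan2(y, x)) % 360.0

# Example: vision says 350 deg, compass says 10 deg -> fused near 356 deg.
print(round(fuse_headings(350.0, 10.0), 1))
```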

Panoramic Navigation using Orthogonal Cross Cylinder Mapping and Image-Segmentation Based Environment Modeling (직각 교차 실린더 매핑과 영상 분할 기반 환경 모델링을 이용한 파노라마 네비게이션)

  • 류승택;조청운;윤경현
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.30 no.3_4
    • /
    • pp.138-148
    • /
    • 2003
  • In this paper, Orthogonal Cross Cylinder mapping and segmentation-based modeling methods are implemented to construct an image-based navigation system. The Orthogonal Cross Cylinder (OCC) is the object formed by the intersection of two cylinders whose axes are orthogonal to each other. OCC mapping eliminates the singularities that arise in conventional environment maps and keeps the amount of environment covered by each texel nearly uniform. OCC mapping yields a full-view image from a fixed point of view, although it becomes difficult to render the scene once the point of view changes. For segmentation-based modeling, the OCC map is segmented according to the objects that make up the environment, and depth values are assigned based on the characteristics of the classified objects. This method is easy to apply to an environment map and simplifies environment modeling by extracting depth values through image segmentation. Together, these methods make it possible to build a full-view environment navigation system.
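
The abstract defines the OCC as the intersection of two orthogonal cylinders and uses its surface as an environment map with nearly uniform texel coverage. The sketch below shows how a viewing direction could be mapped to such a surface, assuming two unit-radius cylinders about the y and x axes; the paper's exact parameterization may differ.

```python
import math

def occ_lookup(direction):
    """Map a 3D viewing direction to an Orthogonal Cross Cylinder texel.

    Assumes the OCC is the boundary of the intersection of two unit-radius
    cylinders: one about the y (vertical) axis and one about the x axis.
    Returns (surface, angle, height) identifying the texel location.
    """
    x, y, z = direction
    n = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / n, y / n, z / n

    # Distance along the ray to each cylinder surface (infinite if parallel).
    t_vert = 1.0 / math.hypot(x, z) if (x or z) else math.inf   # axis = y
    t_horz = 1.0 / math.hypot(y, z) if (y or z) else math.inf   # axis = x

    # The intersection solid's boundary is whichever surface is hit first.
    if t_vert <= t_horz:
        return ("vertical", math.atan2(z, x), y * t_vert)
    return ("horizontal", math.atan2(z, y), x * t_horz)

# A roughly horizontal viewing direction lands on the vertical cylinder...
print(occ_lookup((1.0, 0.1, 0.2)))
# ...while a steep upward direction lands on the horizontal cylinder.
print(occ_lookup((0.1, 1.0, 0.2)))
```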

View Interpolation Algorithm for Continuously Changing Viewpoints in the Multi-panorama Based Navigation (다중 파노라마 영상기반 네비게이션에서 연속적인 시점이동을 위한 장면보간 방법)

  • 김대현;최종수
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.40 no.6
    • /
    • pp.141-148
    • /
    • 2003
  • This paper proposes a new algorithm that generates smooth and realistic transition views from one viewpoint to another in a multi-panorama based navigation system. The algorithm consists of two steps: prewarping, which aligns the viewing directions of two panoramic images, and bidirectional disparity morphing (BDM), which generates the intermediate scene from the aligned panoramic images. For prewarping, the phase correlation between the two images is computed to obtain the translation, rotation, and scaling between them, and this information is used to align the viewing directions of the two original images. After prewarping, block-based disparity vectors (DVs) are computed and smoothed using two occluding patterns. Applying these DVs in the BDM step produces detailed intermediate scenes. Experiments on real panoramic images show that the algorithm generates good-quality intermediate scenes.
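
Phase correlation, used in the prewarping step to recover the alignment between two panoramas, can be illustrated for the pure-translation case as follows; recovering rotation and scaling additionally requires a log-polar step that is omitted here. A minimal numpy sketch:

```python
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Estimate the integer (dy, dx) translation between two same-sized
    grayscale images from the peak of their phase correlation surface."""
    A = np.fft.fft2(img_a)
    B = np.fft.fft2(img_b)
    # Normalized cross-power spectrum: keeps phase, discards magnitude.
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12
    corr = np.real(np.fft.ifft2(cross))

    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative shifts (wrap-around).
    if dy > img_a.shape[0] // 2:
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return dy, dx

# Quick self-test: shift a random image by (5, -3) and recover the shift.
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (5, -3), axis=(0, 1))
print(phase_correlation_shift(b, a))   # -> (5, -3)
```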

Omni Camera Vision-Based Localization for Mobile Robots Navigation Using Omni-Directional Images (옴니 카메라의 전방향 영상을 이용한 이동 로봇의 위치 인식 시스템)

  • Kim, Jong-Rok;Lim, Mee-Seub;Lim, Joon-Hong
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.17 no.3
    • /
    • pp.206-210
    • /
    • 2011
  • Vision-based robot localization is challenging due to the vast amount of visual information available, which requires extensive storage and processing time. To deal with these challenges, we propose the use of features extracted from omni-directional panoramic images and present a method for localization of a mobile robot equipped with an omni-directional camera. The core of the proposed scheme may be summarized as follows. First, we utilize an omni-directional camera which can capture instantaneous 360° panoramic images around the robot. Second, nodes around the robot are identified by the correlation coefficients of the circular horizontal line between the landmark images and the currently captured image. Third, the robot position is determined from these locations by the proposed correlation-based landmark image matching. To accelerate computation, node candidates are assigned using color information, and the correlation values are calculated with Fast Fourier Transforms. Experiments show that the proposed method is effective for global localization of mobile robots and robust to lighting variations.
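
The circular-horizontal-line matching above compares a one-dimensional intensity profile sampled around the 360° panorama against stored landmark profiles, with the correlations computed via the FFT. The sketch below shows circular correlation over all rotations; the profile extraction and the color-based node selection are omitted, and the names are illustrative.

```python
import numpy as np

def circular_line_correlation(line_cur, line_ref):
    """Correlate two circular horizontal lines (1-D intensity profiles
    sampled around 360 degrees) over every cyclic rotation, via FFT.

    Returns (best_similarity, best_shift_in_samples).
    """
    a = (line_cur - line_cur.mean()) / (line_cur.std() + 1e-12)
    b = (line_ref - line_ref.mean()) / (line_ref.std() + 1e-12)
    # Circular cross-correlation for all shifts in O(n log n).
    corr = np.real(np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b)))) / len(a)
    shift = int(np.argmax(corr))
    return float(corr[shift]), shift

# Self-test: a rotated copy of the same line correlates near 1.0
# at the rotation offset.
rng = np.random.default_rng(1)
ref = rng.random(360)
cur = np.roll(ref, 40)
print(circular_line_correlation(cur, ref))   # -> (~1.0, 40)
```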

Real-time Omni-directional Distance Measurement with Active Panoramic Vision

  • Yi, Soo-Yeong;Choi, Byoung-Wook;Ahuja, Narendra
    • International Journal of Control, Automation, and Systems
    • /
    • v.5 no.2
    • /
    • pp.184-191
    • /
    • 2007
  • Autonomous navigation of a mobile robot requires a ranging system for measuring the distance to environmental objects, and wider and faster distance measurement gives a mobile robot more freedom in trajectory planning and control. The active omni-directional ranging system proposed in this paper is capable of obtaining distances in all 360° directions in real time thanks to an omni-directional mirror and structured light. The distance computation, including a sensitivity analysis, and experiments on omni-directional ranging are presented to verify the effectiveness of the proposed system.
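
The abstract above does not give the ranging geometry, so the sketch below uses a deliberately simplified structured-light model: a horizontal laser plane a known baseline above the viewpoint, ignoring the omni-directional mirror optics of the actual system. It only illustrates how the observed stripe elevation maps to distance.

```python
import math

def ranges_from_laser_angles(elevation_angles_rad, baseline_m=0.10):
    """Simplified structured-light triangulation for omni-directional
    ranging: a horizontal laser plane sits `baseline_m` above the
    viewpoint, and at each azimuth the camera observes the laser stripe
    at some elevation angle.  The horizontal distance to the lit object
    follows from the triangle formed by the baseline and the viewing ray.
    """
    ranges = []
    for alpha in elevation_angles_rad:
        if alpha <= 0.0:
            ranges.append(math.inf)       # stripe at/below horizon: no hit
        else:
            ranges.append(baseline_m / math.tan(alpha))
    return ranges

# Example: stripes seen 2, 5 and 10 degrees above the horizon;
# farther objects appear closer to the horizon, giving larger ranges.
angles = [math.radians(a) for a in (2.0, 5.0, 10.0)]
print([round(r, 2) for r in ranges_from_laser_angles(angles)])
```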

An algorithm of the natural view transition in the panoramic image based navigation using Fast Fourier Transform Techniques (파노라마 영상 기반 네비게이션에서 FFT 기술을 이용한 자연스러운 장면 전환 알고리즘)

  • Kim, Dae-Hyun;Choi, Jong-Soo;Kim, Tae-Eun
    • The KIPS Transactions:PartB
    • /
    • v.10B no.5
    • /
    • pp.561-566
    • /
    • 2003
  • This paper proposes a new algorithm that generates smooth and realistic transition views from one viewpoint to another in a panorama-based navigation system. The algorithm consists of two steps: prewarping, which aligns the viewing directions of two panoramic images, and bidirectional disparity morphing (BDM), which generates the intermediate scene from the aligned panoramic images. For prewarping, the phase correlation between the two images is computed to obtain the displacement, rotation, and scale, and this information is used to align the original images. After prewarping, block-based disparity vectors (DVs) are computed and smoothed using two occluding patterns. Applying these DVs in the BDM step produces detailed intermediate scenes. Experiments with real panoramic images give satisfactory results.
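
Since this entry describes the same prewarping/BDM pipeline as the earlier paper by the same authors, the sketch below illustrates the other half of the pipeline: generating an in-between view from matched pixels and their disparities. It is a one-dimensional, single-disparity-field simplification; the actual BDM works on block-based DVs from both directions and handles occlusion patterns.

```python
import numpy as np

def intermediate_scanline(left, right, disp, t):
    """Morph one scanline of an in-between view from two aligned views.

    left, right : 1-D intensity scanlines of the two aligned panoramas
    disp[x]     : disparity of left pixel x; its match in `right` is
                  assumed to be pixel x + disp[x]
    t           : 0.0 gives the left view, 1.0 the right view

    Each matched pair is placed at its interpolated position and the two
    intensities are blended; unfilled positions (occlusions) stay at -1.
    """
    n = len(left)
    out = np.full(n, -1.0)
    for x in range(n):
        xr = x + disp[x]
        if not (0 <= xr < n):
            continue                              # match falls outside view
        xi = int(round((1.0 - t) * x + t * xr))   # interpolated position
        if 0 <= xi < n:
            out[xi] = (1.0 - t) * left[x] + t * right[xr]
    return out

# Example: a bright bar with disparity 4 moves 2 pixels at t = 0.5.
left = np.zeros(16); left[5:8] = 1.0
right = np.zeros(16); right[9:12] = 1.0
disp = np.full(16, 4, dtype=int)
print(intermediate_scanline(left, right, disp, 0.5))
```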

Navigation based on Multi Cylindrical Environment Map

  • Park, Youngsup;Hyekyung Ko;Cheungwoon Cho;Kyunghyun Yoon
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2001.10a
    • /
    • pp.167.6-167
    • /
    • 2001
  • Cylindrical environment maps, an image-based representation, make high-quality, simple, and low-cost real-time navigation possible. In this paper, we propose a method for navigating from one viewpoint to the next in a virtual indoor space composed of several cylindrical environment maps. The system consists of two modules. The first is a panoramic image viewer that employs rotation and zoom-in/out to navigate the virtual indoor space, in the manner of QuickTime VR. The other provides smooth real-time navigation using cubic mesh interpolation when the viewpoint moves from one environment map to another in the virtual space.
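
The panoramic image viewer module maps screen pixels of the rendered view onto the cylindrical environment map, with rotation as an azimuth offset and zoom as a change of focal length. The sketch below shows that lookup under an assumed panorama parameterization and assumed names; the cubic mesh interpolation used for moving between maps is not shown.

```python
import math

def screen_to_cylinder(px, py, view_w, view_h, pan_deg, zoom_focal_px,
                       pano_w, pano_h, pano_focal_px):
    """Map one screen pixel of a pinhole view onto a cylindrical panorama.

    Rotation is a pan offset in azimuth; zoom changes the view's focal
    length in pixels.  `pano_focal_px` is the cylinder radius in pixels
    used when the panorama was stitched.
    """
    # Viewing ray through the pixel in camera coordinates (z forward, y down).
    x = px - view_w / 2.0
    y = py - view_h / 2.0
    z = float(zoom_focal_px)

    # Azimuth of the ray plus the current pan, and its height on a
    # unit-radius cylinder.
    theta = math.atan2(x, z) + math.radians(pan_deg)
    h = y / math.hypot(x, z)

    # Cylindrical panorama coordinates: azimuth wraps over the full width.
    u = (theta / (2.0 * math.pi)) % 1.0 * pano_w
    v = pano_h / 2.0 + h * pano_focal_px
    return u, v

# Example: the view center looks straight along azimuth `pan_deg`.
print(screen_to_cylinder(320, 240, 640, 480, 90.0, 500.0, 4096, 1024, 652.0))
```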
