• Title/Summary/Keyword: Omni-directional image

Search results: 54

Omni-directional Image Generation Algorithm with Parametric Image Compensation (변수화된 영상 보정을 통한 전방향 영상 생성 방법)

  • Kim, Yu-Na;Sim, Dong-Gyu
    • Journal of Broadcast Engineering, v.11 no.4 s.33, pp.396-406, 2006
  • This paper proposes an omni-directional image generation algorithm with parametric image compensation. The algorithm generates an omni-directional image by transforming each planar image onto a sphere in spherical coordinates. A parametric image compensation method is presented to correct the vignetting and illumination distortions caused by the camera system and the lighting conditions. The proposed algorithm generates realistic, seamless omni-directional video and can synthesize any viewpoint from the stitched omni-directional image on the sphere. Experimental results show that the proposed system with vignetting and illumination compensation is approximately 1~4 dB better than one that does not account for these effects.
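
The core of the pipeline described above is a per-pixel mapping from each planar camera image onto a sphere, followed by a parametric correction of vignetting. The sketch below is a minimal illustration of those two steps, assuming a pinhole camera model, an equirectangular output grid, and a radial polynomial vignetting gain; the function names, the projection model, and the polynomial form are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def planar_to_spherical(img, focal, yaw=0.0, out_w=1024, out_h=512):
    """Resample one planar (pinhole) image onto an equirectangular
    longitude/latitude grid, i.e. the spherical canvas that the stitched
    omni-directional image is composited into."""
    h, w = img.shape[:2]
    lon = (np.arange(out_w) / out_w) * 2 * np.pi - np.pi       # -pi .. pi
    lat = (np.arange(out_h) / out_h) * np.pi - np.pi / 2       # -pi/2 .. pi/2
    lon, lat = np.meshgrid(lon - yaw, lat)
    # Unit ray for each output pixel, then pinhole projection into the source image.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    valid = z > 1e-6                                            # rays in front of the camera
    u = focal * x / np.maximum(z, 1e-6) + w / 2
    v = focal * y / np.maximum(z, 1e-6) + h / 2
    out = np.zeros((out_h, out_w) + img.shape[2:], dtype=img.dtype)
    inside = valid & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    out[inside] = img[v[inside].astype(int), u[inside].astype(int)]
    return out

def compensate_vignetting(img, k1, k2):
    """Divide out an illustrative radial gain g(r) = 1 + k1*r^2 + k2*r^4,
    where r is the distance from the image centre normalized to [0, 1];
    a simple parametric model of vignetting fall-off (assumes an HxWx3 uint8 image)."""
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = ((xx - w / 2) / (w / 2)) ** 2 + ((yy - h / 2) / (h / 2)) ** 2
    gain = 1.0 + k1 * r2 + k2 * r2 ** 2
    return np.clip(img / gain[..., None], 0, 255).astype(img.dtype)
```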

Localization of Mobile Robot Using Active Omni-directional Ranging System (능동 전방향 거리 측정 시스템을 이용한 이동로봇의 위치 추정)

  • Ryu, Ji-Hyung;Kim, Jin-Won;Yi, Soo-Yeong
    • Journal of Institute of Control, Robotics and Systems, v.14 no.5, pp.483-488, 2008
  • An active omni-directional ranging system that combines omni-directional vision with structured light has several advantages over conventional ranging systems: robustness against external illumination noise, thanks to the laser structured light, and computational efficiency, since a single image from the omni-directional vision captures 360° of environment information. The omni-directional range data represent a local distance map at a given position in the workspace. In this paper, we propose a matching algorithm between the local distance map and a given global map database, thereby localizing a mobile robot in the global workspace. Since the global map database generally consists of line segments representing the edges of objects in the environment, the matching algorithm is based on the relative position and orientation of line segments in the local and global maps. The effectiveness of the proposed omni-directional ranging system and matching algorithm is verified through experiments.
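
The matching step compares line segments of the local distance map with those of the global map by their relative position and orientation. The following is a minimal sketch of that idea, assuming 2-D segments given as endpoint pairs and a brute-force search over candidate poses; the scoring rule, tolerances, and penalty value are illustrative assumptions rather than the paper's algorithm.

```python
import numpy as np

def transform_segment(seg, pose):
    """Transform a local-frame segment ((x1, y1), (x2, y2)) into the global
    frame given a candidate robot pose (x, y, theta)."""
    x, y, th = pose
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    p1, p2 = np.asarray(seg[0]), np.asarray(seg[1])
    return R @ p1 + [x, y], R @ p2 + [x, y]

def segment_angle(p1, p2):
    d = np.asarray(p2) - np.asarray(p1)
    return np.arctan2(d[1], d[0]) % np.pi          # undirected line orientation

def match_score(local_segs, global_segs, pose, ang_tol=np.deg2rad(10)):
    """Score a candidate pose: for every local segment (transformed into the
    global frame) find the closest global segment with a similar orientation
    and accumulate midpoint distances; a lower score is a better match."""
    score = 0.0
    for seg in local_segs:
        g1, g2 = transform_segment(seg, pose)
        mid, ang = (g1 + g2) / 2, segment_angle(g1, g2)
        best = np.inf
        for q1, q2 in global_segs:
            dang = abs(segment_angle(q1, q2) - ang)
            if min(dang, np.pi - dang) < ang_tol:
                gm = (np.asarray(q1) + np.asarray(q2)) / 2
                best = min(best, np.linalg.norm(mid - gm))
        score += best if np.isfinite(best) else 10.0   # penalty for an unmatched segment
    return score

def localize(local_segs, global_segs, candidate_poses):
    """Brute-force search over candidate poses; this just illustrates the
    relative position/orientation idea, not the paper's refined matching."""
    return min(candidate_poses, key=lambda p: match_score(local_segs, global_segs, p))
```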

Coordinate Calibration and Object Tracking of the ODVS (Omni-directional Image에서의 이동객체 좌표 보정 및 추적)

  • Park, Yong-Min;Nam, Hyun-Jung;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, v.9 no.2, pp.408-413, 2005
  • This paper presents a technique that extracts a moving object from omni-directional images and estimates the real-world coordinates of the moving object using a 3D parabolic coordinate transformation. For real-time processing, the moving object is extracted with the proposed hue histogram matching algorithm. We demonstrate, with theoretical and experimental arguments, that the proposed technique extracts a moving object robustly despite lighting changes and estimates good approximations of its real coordinates.
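
The real-time extraction above relies on matching hue histograms, since hue is relatively insensitive to lighting changes. Below is a minimal sketch of that idea, assuming the omni-directional image has already been converted to an HSV hue channel; the window size, bin count, threshold, and sliding-window search are illustrative assumptions, not the paper's algorithm (and the 3D parabolic coordinate transformation is not shown here).

```python
import numpy as np

def hue_histogram(hue_region, bins=32):
    """Normalized histogram of hue values (assumed in [0, 180) as in OpenCV's
    HSV convention) for one image region."""
    hist, _ = np.histogram(hue_region, bins=bins, range=(0, 180))
    return hist / max(hist.sum(), 1)

def histogram_match(hist_a, hist_b):
    """Histogram intersection in [0, 1]; 1 means identical hue distributions.
    Because hue is largely independent of brightness, the score changes little
    under lighting variations."""
    return float(np.minimum(hist_a, hist_b).sum())

def find_object(hue_image, ref_hist, win=40, step=20, threshold=0.6):
    """Slide a window over the hue channel and return (row, col, score) of the
    best match against the reference object's hue histogram, or None."""
    best = None
    h, w = hue_image.shape
    for r in range(0, h - win + 1, step):
        for c in range(0, w - win + 1, step):
            s = histogram_match(hue_histogram(hue_image[r:r + win, c:c + win]), ref_hist)
            if best is None or s > best[2]:
                best = (r, c, s)
    return best if best and best[2] >= threshold else None
```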

Localization of 3D Spatial Information from Single Omni-Directional Image (단일 전방향 영상을 이용한 공간 정보의 측정)

  • Kang Hyun-Deok;Jo Kang-Hyun
    • Journal of Institute of Control, Robotics and Systems, v.12 no.7, pp.686-692, 2006
  • This paper shows how to calculate 3D geometric information such as height, direction, and distance under the constraints of a catadioptric camera system. The catadioptric camera satisfies the single-viewpoint constraint by adopting a hyperboloidal mirror. To recover 3D information from a single omni-directional image, points are assumed to lie on lines perpendicular to the ground. The plane at infinity is also detected as a circle determined by the structure of the mirror and camera. Experiments with real images taken in indoor environments such as rooms and corridors verify the correctness of the theory and demonstrate that 3D geometric information can be computed from a single omni-directional image.
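
Under the single-viewpoint and perpendicular-to-the-ground assumptions, distances and heights reduce to simple trigonometry once a ray's elevation or depression angle is known. The snippet below is a generic back-of-the-envelope illustration of that reduction, not the paper's hyperboloidal-mirror derivation; the camera height and angles in the example are made up.

```python
import math

def ground_distance(cam_height, depression_angle):
    """Horizontal distance to a point on the floor, seen from a single
    effective viewpoint at height `cam_height`, when the ray to that point
    makes `depression_angle` (radians) below the horizon."""
    return cam_height / math.tan(depression_angle)

def object_height(cam_height, depression_to_base, elevation_to_top):
    """Height of a vertical object (perpendicular to the ground) from the
    depression angle to its base and the elevation angle to its top."""
    d = ground_distance(cam_height, depression_to_base)
    return cam_height + d * math.tan(elevation_to_top)

# Example: viewpoint 1.2 m above the floor, object base seen 30 deg below
# the horizon and its top 20 deg above it.
d = ground_distance(1.2, math.radians(30))                    # ~2.08 m away
h = object_height(1.2, math.radians(30), math.radians(20))    # ~1.96 m tall
```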

Omni Camera Vision-Based Localization for Mobile Robots Navigation Using Omni-Directional Images (옴니 카메라의 전방향 영상을 이용한 이동 로봇의 위치 인식 시스템)

  • Kim, Jong-Rok;Lim, Mee-Seub;Lim, Joon-Hong
    • Journal of Institute of Control, Robotics and Systems, v.17 no.3, pp.206-210, 2011
  • Vision-based robot localization is challenging because of the vast amount of visual information involved, which requires extensive storage and processing time. To deal with these challenges, we propose using features extracted from omni-directional panoramic images and present a localization method for a mobile robot equipped with an omni-directional camera. The core of the proposed scheme may be summarized as follows. First, we utilize an omni-directional camera that can capture instantaneous 360° panoramic images around the robot. Second, nodes around the robot are extracted using the correlation coefficients of the Circular Horizontal Line (CHL) between the landmark image and the currently captured image. Third, the robot position is determined from these locations by the proposed correlation-based landmark image matching. To accelerate computation, node candidates are assigned using color information and the correlation values are calculated with Fast Fourier Transforms. Experiments show that the proposed method is effective for global localization of mobile robots and robust to lighting variations.
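
The node matching hinges on circular correlation of one-dimensional CHL signatures, computed with FFTs so that all rotations are scored at once. A minimal sketch of that computation follows; the normalization and the node-selection helper are illustrative, and the color-based candidate filtering mentioned in the abstract is omitted.

```python
import numpy as np

def circular_correlation(chl_a, chl_b):
    """Normalized circular cross-correlation of two Circular Horizontal Lines
    (1-D intensity profiles of equal length sampled around 360 degrees),
    computed via FFT. Returns (best_correlation, shift_in_samples)."""
    a = (chl_a - chl_a.mean()) / (chl_a.std() + 1e-9)
    b = (chl_b - chl_b.mean()) / (chl_b.std() + 1e-9)
    # Correlation theorem: circular correlation = IFFT(FFT(a) * conj(FFT(b))).
    corr = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real / len(a)
    shift = int(np.argmax(corr))
    return corr[shift], shift

def best_node(current_chl, node_chls):
    """Pick the stored landmark node whose CHL correlates best with the CHL
    extracted from the currently captured omni-directional image."""
    scores = [circular_correlation(current_chl, n)[0] for n in node_chls]
    return int(np.argmax(scores)), max(scores)
```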

A Study of Selecting Sequential Viewpoint and Examining the Effectiveness of Omni-directional Angle Image Information in Grasping the Characteristics of Landscape (경관 특성 파악에 있어서의 시퀀스적 시점장 선정과 전방위 화상정보의 유효성 검증에 관한 연구)

  • Kim, Heung Man;Lee, In Hee
    • KIEAE Journal, v.9 no.2, pp.81-90, 2009
  • To grasp sequential landscape characteristics while accounting for the visual-perception behavior of the observer, this study examined the main walking route for visitors to the Three Treasure Buddhist temples. In particular, as a method of collecting data on sequentially perceived landscapes, the researchers combined momentary sequential viewpoint selection at arbitrarily spaced marker intervals with fisheye-lens photography, from which omni-directional visual information was obtained. The results verified that factors such as the form of the approach road, changes in the circulation axis, changes in ground level, and the appearance of objects affect viewpoint selection, with the approach road form and changes in the circulation axis exerting the greatest influence. In addition, when subjects qualitatively evaluated landscape components using the VR images produced from the omni-directional visual information, scores above the threshold values were obtained for panoramic vision, scene reproduction, and three-dimensional perspective. This supports the future use of omni-directional image information for the qualitative evaluation of landscapes and for landscape research.

Moving Target Tracking using Vision System for an Omni-directional Wheel Robot (전방향 구동 로봇에서의 비젼을 이용한 이동 물체의 추적)

  • Kim, San;Kim, Dong-Hwan
    • Journal of Institute of Control, Robotics and Systems, v.14 no.10, pp.1053-1061, 2008
  • In this paper, moving-target tracking using binocular vision on an omni-directional mobile robot is addressed. With the binocular vision system, three-dimensional information about the target is extracted through vision processes including calibration, image correspondence, and 3D reconstruction. The robot controller uses SPI (Serial Peripheral Interface) for efficient communication between the robot's master controller and the wheel controllers.
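
The 3D reconstruction step amounts to triangulating matched features from the calibrated stereo pair. Below is a minimal sketch for an already rectified pair, where depth follows from disparity as Z = f·B/d; the function and parameter names are illustrative, and the example numbers are made up rather than taken from the paper.

```python
import numpy as np

def reconstruct_point(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Triangulate one 3-D point from a rectified stereo pair: `u_left` and
    `u_right` are the column coordinates of the same feature in the left and
    right images, `v` its row, `focal_px` the focal length in pixels,
    `baseline_m` the camera separation, and (cx, cy) the principal point."""
    disparity = u_left - u_right            # larger disparity = closer target
    if disparity <= 0:
        raise ValueError("non-positive disparity: bad correspondence")
    z = focal_px * baseline_m / disparity   # depth along the optical axis
    x = (u_left - cx) * z / focal_px        # lateral offset
    y = (v - cy) * z / focal_px             # vertical offset
    return np.array([x, y, z])

# Example: a feature at column 420 in the left image and 380 in the right,
# row 250, with f = 700 px, a 12 cm baseline, and principal point (320, 240).
p = reconstruct_point(420, 380, 250, 700.0, 0.12, 320, 240)   # depth = 2.1 m
```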

Global Localization of Mobile Robots Using Omni-directional Images (전방위 영상을 이용한 이동 로봇의 전역 위치 인식)

  • Han, Woo-Sup;Min, Seung-Ki;Roh, Kyung-Shik;Yoon, Suk-June
    • Transactions of the Korean Society of Mechanical Engineers A, v.31 no.4, pp.517-524, 2007
  • This paper presents a global localization method using circular correlation of omni-directional images. Localization of a mobile robot, especially indoors, is a key component in the development of useful service robots. Although stereo vision is widely used for localization, its performance is limited by computational complexity and a narrow view angle. To compensate for these shortcomings, we utilize a single omni-directional camera that can capture instantaneous 360° panoramic images around the robot. Nodes around the robot are extracted using the correlation coefficients of the CHL (Circular Horizontal Line) between the landmark image and the currently captured image. After finding possible nearby nodes, the robot moves to the nearest node based on the correlation values and the positions of these nodes. To accelerate computation, the correlation values are calculated using Fast Fourier Transforms. Experimental results and performance in a real home environment demonstrate the feasibility of the method.
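
The CHL used here, as in the earlier CHL-based entry, is a one-dimensional ring of intensities sampled around the robot. The sketch below shows one plausible way to extract such a signature from a raw mirror-type omni-directional image; the center, radius, and per-degree sampling are illustrative assumptions, and the resulting signal would then be compared against stored node signatures with the FFT-based circular correlation sketched earlier.

```python
import numpy as np

def extract_chl(omni_img, center, radius, n_samples=360):
    """Sample a Circular Horizontal Line from a raw (mirror-type)
    omni-directional image: intensities along a circle of `radius` pixels
    around `center`, one sample per degree, forming the 1-D signature used
    for correlation-based node matching."""
    cx, cy = center
    angles = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, omni_img.shape[1] - 1)
    ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, omni_img.shape[0] - 1)
    ring = omni_img[ys, xs].astype(np.float64)
    if ring.ndim == 2:                  # color image: reduce to luminance
        ring = ring.mean(axis=1)
    return ring
```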

Depth estimation by using a double conic projection (이중원뿔 투영을 이용한 거리의 추정)

  • 김완수;조형석;김성권
    • Proceedings of the Institute of Control, Robotics and Systems Conference, 1997.10a, pp.1411-1414, 1997
  • Distance information is essential for fully executing assembly tasks such as grasping and insertion. In this paper, we propose a method for estimating the distance from the sensor to an object using the omni-directional image sensing system for assembly (OISSA), and we show its features and feasibility through computer simulation. The method, based on a forward-motion stereo technique, makes the search for corresponding points simple and can immediately obtain three-dimensional 2π shape information.
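
Forward-motion stereo recovers range from how an object's bearing changes as the sensor translates straight ahead. The sketch below is a minimal bearings-only triangulation under that setup; it does not reproduce the paper's double conic projection geometry, and the baseline and angles in the example are made up.

```python
import math

def range_from_forward_motion(baseline, bearing_before, bearing_after):
    """Triangulate the distance from the second sensor position to an object
    using forward-motion stereo: the sensor translates `baseline` metres
    straight ahead, and the object's bearing (angle off the motion direction,
    in radians) is measured before and after the move. The bearing must
    change, i.e. the object must lie off the motion axis."""
    parallax = bearing_after - bearing_before
    if parallax <= 0:
        raise ValueError("no parallax: object on (or behind) the motion axis")
    # Law of sines in the triangle (old position, new position, object).
    return baseline * math.sin(bearing_before) / math.sin(parallax)

# Example: the bearing to a feature grows from 30 deg to 45 deg after the
# sensor advances 0.5 m, so the feature is ~0.97 m from the new position.
r = range_from_forward_motion(0.5, math.radians(30), math.radians(45))
```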

Two-Dimensional Depth Data Measurement using an Active Omni-Directional Range Sensor (전방향 능동 거리 센서를 이용한 2차원 거리 측정)

  • Joung, In-Soo;Cho, Hyung-Suck
    • Journal of Institute of Control, Robotics and Systems, v.5 no.4, pp.437-445, 1999
  • Most autonomous mobile robots view only what is in front of them and, as a result, may collide with objects moving in from the side or from behind. To overcome this problem, an active omni-directional range sensor system has been built that can obtain an omni-directional depth map through the use of a laser conic plane and a conic mirror. During navigation of the mobile robot, the proposed sensor system produces a laser conic plane by rotating the laser point source at high speed; this creates a two-dimensional depth map, in real time, once an image is captured. The experimental results show that the proposed sensor system is very efficient and can be used for navigation of a mobile robot in an unknown environment.
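
The depth map follows from triangulating the laser conic plane against the viewing rays reflected by the conic mirror. The sketch below assumes the laser ring has already been detected in the image and that a calibrated mapping from radial pixel distance to ray depression angle is available; the calibration function and the numbers are hypothetical, and this is not the paper's exact mirror geometry.

```python
import math
import numpy as np

def depth_map_from_ring(pixel_radii, angle_of, plane_offset):
    """Convert the laser ring detected in one omni-directional image into a
    2-D depth map (one range per viewing direction). `pixel_radii[i]` is the
    radial pixel distance of the laser spot at the i-th azimuth, `angle_of`
    maps pixel radius to the ray's depression angle below the horizon
    (radians), and `plane_offset` is the vertical distance between the
    sensor's effective viewpoint and the laser plane."""
    ranges = np.empty(len(pixel_radii))
    for i, rho in enumerate(pixel_radii):
        alpha = angle_of(rho)                       # depression angle of the ray
        ranges[i] = plane_offset / math.tan(alpha)  # triangulated horizontal range
    return ranges

# Illustrative use: a (hypothetical) linear radius-to-angle calibration and a
# laser plane 0.3 m below the viewpoint give one range per degree of azimuth.
calib = lambda rho: math.radians(10 + 0.2 * rho)    # hypothetical calibration
radii = np.full(360, 120.0)                         # laser ring at 120 px everywhere
ranges = depth_map_from_ring(radii, calib, 0.3)     # ~0.44 m in every direction
```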
