• Title/Abstract/Keyword: Multi-View Range Image

Bayesian Sensor Fusion of Monocular Vision and Laser Structured Light Sensor for Robust Localization of a Mobile Robot (이동 로봇의 강인 위치 추정을 위한 단안 비젼 센서와 레이저 구조광 센서의 베이시안 센서융합)

  • Kim, Min-Young; Ahn, Sang-Tae; Cho, Hyung-Suck
    • Journal of Institute of Control, Robotics and Systems / v.16 no.4 / pp.381-390 / 2010
  • This paper describes a procedure for map-based localization of mobile robots using a sensor fusion technique in structured environments. Combining sensors with different characteristics and limited sensing capabilities offers complementary and cooperative advantages for obtaining better information about the environment. In this paper, for robust self-localization of a mobile robot equipped with a monocular camera and a laser structured light sensor, environment information acquired from the two sensors is combined and fused by a Bayesian sensor fusion technique based on the probabilistic reliability function of each sensor, predefined through experiments. For self-localization using the monocular vision, the robot utilizes image features consisting of vertical edge lines extracted from input camera images, which serve as natural landmark points in the self-localization process. With the laser structured light sensor, the robot instead utilizes geometrical features composed of corners and planes, extracted from range data at a constant height above the navigation floor, as natural landmark shapes. Although either feature group alone is sometimes sufficient to localize the robot, all features from the two sensors are used and fused simultaneously for reliable localization under various environmental conditions. To verify the advantage of multi-sensor fusion, a series of experiments is performed, and the experimental results are discussed in detail.
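
A minimal sketch of the fusion step described above, assuming the vision and laser estimates can be treated as independent Gaussian measurements of the robot position; the fixed variances below merely stand in for the experimentally predefined reliability functions, and all names and numbers are illustrative, not from the paper:

```python
import numpy as np

def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Bayesian fusion of two independent Gaussian estimates of the same quantity."""
    w_a = var_b / (var_a + var_b)   # a sensor gets more weight as the other grows noisier
    w_b = var_a / (var_a + var_b)
    mu = w_a * mu_a + w_b * mu_b
    var = (var_a * var_b) / (var_a + var_b)   # fused variance is smaller than either input
    return mu, var

# Hypothetical 2-D robot position estimates (meters).
mu_vision = np.array([1.20, 0.45])   # from vertical-edge landmarks in the camera image
mu_laser  = np.array([1.05, 0.50])   # from corner/plane features in the range data
var_vision, var_laser = 0.04, 0.01   # stand-ins for the per-sensor reliability functions

mu_fused, var_fused = fuse_gaussian(mu_vision, var_vision, mu_laser, var_laser)
print(mu_fused, var_fused)           # fused estimate lies closer to the more reliable laser
```

In the paper the weighting comes from reliability functions calibrated per sensor through experiments; this sketch simply fixes the variances to show how the two feature groups are combined into a single estimate.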

The Ground Checkout Test of OSMI on KOMPSAT-1

  • Yong, Sang-Soon; Shim, Hyung-Sik; Heo, Haeng-Pal; Cho, Young-Min; Oh, Kyoung-Hwan; Woo, Sun-Hee; Paik, Hong-Yul
    • Korean Journal of Remote Sensing / v.15 no.4 / pp.297-305 / 1999
  • The Ocean Scanning Multispectral Imager (OSMI) is a payload on the KOMPSAT satellite that performs global ocean color monitoring for the study of biological oceanography. The instrument images the ocean surface using a whisk-broom motion with a swath width of 800 km and a ground sample distance (GSD) of less than 1 km over the entire field of view (FOV). The instrument is designed for an on-orbit operation duty cycle of 20% over the mission lifetime of 3 years, with programmable gain/offset and on-board image data compression/storage. The instrument also performs sun and dark calibration for on-board instrument calibration. OSMI is a multi-spectral imager covering the spectral range from 400 nm to 900 nm using a CCD Focal Plane Array (FPA). Ocean color is monitored using 6 spectral channels that can be selected via ground commands. The KOMPSAT satellite with OSMI was integrated, and the satellite-level environmental tests, including instrument aliveness/functional tests, for the launch environment, on-orbit environment (thermal/vacuum), and EMI/EMC were performed at KARI. The test results met the requirements, and OSMI data were collected and analyzed during each test phase. The instrument was launched on the KOMPSAT satellite on December 21, 1999, and is scheduled to start collecting ocean color data in early 2000 upon completion of the on-orbit instrument checkout.
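
A quick back-of-the-envelope check of the figures quoted above (illustrative only, not from the paper; the 98-minute orbit period is borrowed from the EOC abstract below, and applying the duty cycle per orbit is an assumption):

```python
# Rough consistency check of the OSMI figures quoted in the abstract.
swath_km = 800.0      # stated swath width
gsd_km = 1.0          # worst-case GSD; the abstract states < 1 km
duty_cycle = 0.20     # stated on-orbit operation duty cycle
orbit_min = 98.0      # KOMPSAT orbit period, taken from the EOC abstract below

samples_per_line = swath_km / gsd_km             # on the order of 800 cross-track samples
imaging_min_per_orbit = duty_cycle * orbit_min   # ~19.6 min of imaging if the duty cycle applies per orbit
print(samples_per_line, imaging_min_per_orbit)
```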

Characteristics of the Electro-Optical Camera(EOC) (다목적실용위성탑재 전자광학카메라(EOC)의 성능 특성)

  • Seunghoon Lee; Hyung-Sik Shim; Hong-Yul Paik
    • Korean Journal of Remote Sensing / v.14 no.3 / pp.213-222 / 1998
  • The Electro-Optical Camera (EOC) is the main payload of the KOrea Multi-Purpose SATellite (KOMPSAT), with the mission of cartography to build up a digital map of Korean territory, including a Digital Terrain Elevation Map (DTEM). This instrument, which comprises the EOC Sensor Assembly and the EOC Electronics Assembly, produces panchromatic images of 6.6 m GSD with a swath wider than 17 km by push-broom scanning and spacecraft body pointing in the visible range of wavelengths, 510~730 nm. The high-resolution panchromatic image is collected for 2 minutes during the 98-minute orbit cycle, covering about 800 km along the ground track, over the mission lifetime of 3 years, with programmable gain/offset and on-board image data storage. The 8-bit digitized image, which is collected by a fully reflective F8.3 triplet without obscuration, is transmitted to the ground station at a rate of less than 25 Mbps. EOC was developed to meet or surpass its design-phase performance requirements. The spectral response, the modulation transfer function, and the uniformity of all 2592 pixels of the EOC CCD are presented as measured, for the convenience of end-users. The spectral response was measured for each gain setting of EOC, which is expected to enable users of EOC data to generate more accurate panchromatic images. The modulation transfer function of EOC was measured as greater than 16% at the Nyquist frequency over the entire field of view, exceeding its requirement of greater than 10%. The uniformity, which shows the relative response of each pixel, was measured at every pixel of the EOC Focal Plane Array and is presented for use in data processing.
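
The figures in this abstract can be cross-checked with a short back-of-the-envelope calculation (illustrative only; the ground-track speed is inferred from the stated 800 km covered in 2 minutes, and data compression and overhead are ignored):

```python
# Consistency check of the EOC figures quoted in the abstract (illustrative, not from the paper).
pixels_per_line = 2592        # CCD pixels across the swath
gsd_m = 6.6                   # panchromatic ground sample distance
bits_per_pixel = 8            # 8-bit digitization

swath_km = pixels_per_line * gsd_m / 1000.0        # ~17.1 km, matching "wider than 17 km"

ground_track_m = 800e3        # ~800 km along the ground track ...
imaging_s = 120.0             # ... imaged during the 2-minute collection window
ground_speed = ground_track_m / imaging_s          # ~6.7 km/s
line_rate = ground_speed / gsd_m                   # ~1010 lines per second at 6.6 m GSD
raw_rate_mbps = line_rate * pixels_per_line * bits_per_pixel / 1e6   # ~21 Mbps, below the 25 Mbps downlink limit

print(round(swath_km, 1), round(raw_rate_mbps, 1))
```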