• Title/Abstract/Keyword: Laser structured light

Search results: 74

Development of Color 3D Scanner Using Laser Structured-light Imaging Method

  • Ko, Youngjun; Yi, Sooyeong
    • Current Optics and Photonics, Vol. 2, No. 6, pp. 554-562, 2018
  • This study presents a color 3D scanner based on the laser structured-light imaging method that can simultaneously acquire the 3D shape data and color of a target object using a single camera. The 3D data acquisition of the scanner is based on the structured-light imaging method, and the color data is obtained from a natural color image. Because both the laser image and the color image are acquired by the same camera, the 3D data and the color data of a pixel are obtained efficiently, avoiding a complicated correspondence algorithm. In addition to the 3D data, the color data helps enhance the realism of an object model. The proposed scanner consists of two line lasers, a color camera, and a rotation table. The line lasers are deployed on either side of the camera to eliminate shadow areas of a target object. This study addresses the calibration methods for the parameters of the camera, the plane equations of the line lasers, and the center of the rotation table. Experimental results demonstrate the performance of the scanner in terms of accurate color and 3D data acquisition. A minimal code sketch of the underlying laser-plane triangulation follows this entry.
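
The core geometric step in such a scanner is intersecting the camera ray through a detected laser pixel with a calibrated laser plane. Below is a minimal sketch of that triangulation, assuming the intrinsic matrix `K` and the laser-plane parameters have already been calibrated; the numeric values are illustrative placeholders, not values from the paper.

```python
import numpy as np

def triangulate_laser_pixel(u, v, K, plane_n, plane_d):
    """Intersect the camera ray through pixel (u, v) with a calibrated laser plane.

    The plane is given in camera coordinates as plane_n . X = plane_d.
    Returns the 3D point in camera coordinates.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-projected ray direction
    t = plane_d / (plane_n @ ray)                   # scale at which the ray meets the plane
    return t * ray

# Illustrative usage: the color of the surface point is simply the image color at
# the same pixel, so no cross-camera correspondence search is required.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
plane_n = np.array([0.0, -0.26, 0.97])  # example calibrated plane normal
plane_d = 0.35                          # example plane offset (meters)
point_3d = triangulate_laser_pixel(400, 250, K, plane_n, plane_d)
```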

3D Map Building of The Mobile Robot Using Structured Light

  • Lee, Oon-Kyu; Kim, Min-Young; Cho, Hyung-Suck; Kim, Jae-Hoon
    • Institute of Control, Robotics and Systems Conference Proceedings, ICCAS 2001, pp. 123.1-123, 2001
  • For autonomous navigation of mobile robots, the robots' capability to recognize the 3D environment is necessary. In this paper, an on-line 3D map building method for autonomous mobile robots is proposed. To get range data on the environment, we use a sensor system composed of a structured light source and a CCD camera based on optimal triangulation. The structured laser is projected as a horizontal strip on the scene. The sensor system can rotate $\pm 30^{\circ}$ with a goniometer. By scanning with the system, we acquire the laser strip image of the environment and update the planes composing the environment through several image processing steps. From the laser strip in the captured image, we find a center point in each column and make line segments by blobbing these center points. Then, the planes of the environment are updated. These steps are done on-line during the scanning phase. With the proposed method, we can efficiently obtain a 3D map of the structured environment. A minimal sketch of the per-column center extraction follows this entry.

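
The per-column strip-center extraction described above can be sketched as follows. This is a minimal illustration assuming a grayscale image in which the laser strip is brighter than the background; the `threshold` value is a placeholder, not a parameter from the paper. Splitting the resulting centers at gaps of missing columns would yield the line segments ("blobs") mentioned in the abstract.

```python
import numpy as np

def strip_centers_per_column(gray, threshold=80):
    """For each image column, return the intensity-weighted row of the laser strip,
    or NaN where no pixel exceeds the threshold."""
    h, w = gray.shape
    rows = np.arange(h, dtype=float)
    centers = np.full(w, np.nan)
    for c in range(w):
        col = gray[:, c].astype(float)
        mask = col > threshold
        if mask.any():
            weights = col * mask            # keep only pixels on the strip
            centers[c] = (rows @ weights) / weights.sum()
    return centers
```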

3D Map Building of the Mobile Robot Using Structured Light

  • Lee, Oon-Kyu; Kim, Min-Young; Cho, Hyung-Suck; Kim, Jae-Hoon
    • Institute of Control, Robotics and Systems Conference Proceedings, ICCAS 2001, pp. 123.5-123, 2001
  • For autonomous navigation of mobile robots, the robots' capability to recognize the 3D environment is necessary. In this paper, an on-line 3D map building method for autonomous mobile robots is proposed. To get range data on the environment, we use a sensor system composed of a structured light source and a CCD camera based on optimal triangulation. The structured laser is projected as a horizontal strip on the scene. The sensor system can rotate $\pm 30^{\circ}$ with a goniometer. By scanning with the system, we acquire the laser strip image of the environment and update the planes composing the environment through several image processing steps. From the laser strip in the captured image, we find a center point in each column and make line segments by blobbing these center points. Then, the planes of the environment are updated. These steps are done on-line during the scanning phase. With the proposed method, we can efficiently obtain a 3D map of the structured environment.


A Study on the Relative Localization Algorithm for Mobile Robots using a Structured Light Technique

  • 노동기; 김곤우; 이범희
    • Journal of Institute of Control, Robotics and Systems, Vol. 11, No. 8, pp. 678-687, 2005
  • This paper describes a relative localization algorithm using odometry data and consecutive local maps. The purpose of this paper is odometry error correction using area matching of two consecutive local maps. The local map is built using a sensor module with dual laser beams and a USB camera. The range data from the sensor module are measured using the structured lighting technique (active stereo method). The advantage of using the sensor module is that a local map can be obtained at once within the camera's view angle. With this advantage, we propose the AVS (Aligned View Sector) matching algorithm for correction of the pose error (translational and rotational error). In order to evaluate the proposed algorithm, experiments are performed in a real environment. A minimal sketch of how a matched relative motion can correct an odometry estimate follows this entry.
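
The correction idea can be illustrated with a standard SE(2) pose composition, where the relative motion estimated by matching two consecutive local maps replaces the drifting odometry increment. The sketch below is a generic illustration; the interface (`odom_delta`, `matching_delta`) is hypothetical and is not the paper's AVS implementation.

```python
import numpy as np

def compose(pose, delta):
    """Compose an SE(2) pose (x, y, theta) with a relative motion expressed in its frame."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * np.cos(th) - dy * np.sin(th),
            y + dx * np.sin(th) + dy * np.cos(th),
            th + dth)

def corrected_pose(prev_pose, odom_delta, matching_delta):
    """Use the map-matching estimate of the relative motion when it is available;
    fall back to raw odometry otherwise (hypothetical interface)."""
    delta = matching_delta if matching_delta is not None else odom_delta
    return compose(prev_pose, delta)
```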

Illumination Invariant Ranging Sensor Based on Structured Light Image

  • 신진; 이수영
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers, Vol. 24, No. 12, pp. 122-130, 2010
  • This paper presents an active ranging system based on laser structured-light imaging. The structured-light image processing is computationally efficient in comparison with conventional stereo image processing, since the burdensome correspondence problem is avoided. In order to achieve robustness against environmental illumination noise, an efficient image processing algorithm, i.e., integration of difference images with structured-light modulation, is proposed. The distance equation derived from the measured structured-light pixel position and the calibration of the system parameters are also addressed. Experiments and analysis are carried out to verify the performance of the proposed ranging system. A minimal sketch of the difference-image integration follows this entry.
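
The difference-image integration can be sketched as below, assuming the laser is switched on and off on alternating frames so that ambient illumination, present in both, cancels while the modulated laser line accumulates. The averaging and clipping choices are illustrative and not necessarily the paper's exact modulation scheme.

```python
import numpy as np

def integrate_difference_images(frames_laser_on, frames_laser_off):
    """Accumulate (laser-on minus laser-off) difference images to suppress ambient light."""
    acc = np.zeros_like(frames_laser_on[0], dtype=float)
    for on, off in zip(frames_laser_on, frames_laser_off):
        acc += on.astype(float) - off.astype(float)
    acc = np.clip(acc, 0.0, None)            # negative residue is noise, not laser signal
    return acc / max(len(frames_laser_on), 1)
```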

Development of the Computer Vision based Continuous 3-D Feature Extraction System via Laser Structured Lighting

  • 임동혁; 황헌
    • Journal of Biosystems Engineering, Vol. 24, No. 2, pp. 159-166, 1999
  • A system has been developed to continuously extract real 3-D geometric feature information from 2-D images of objects fed randomly on a conveyor. Two sets of structured laser lights were utilized, and the laser structured-light projection image was acquired by the camera, triggered by the signal of the photo-sensor mounted on the conveyor. A camera calibration matrix was obtained, which transforms 2-D image coordinates into 3-D world coordinates using 6 known points. The maximum error after calibration was 1.5 mm within the height range of 103 mm. The correlation equation between the shift amount of the laser light and the height was generated. Height information estimated via this correlation showed a maximum error of 0.4 mm within the height range of 103 mm. Interactive 3-D geometric feature extraction software was developed using Microsoft Visual C++ 4.0 under the Windows environment. The extracted 3-D geometric feature information was reconstructed into a 3-D surface using MATLAB. A minimal sketch of this kind of calibration from known points follows this entry.

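
A calibration matrix relating 2-D image coordinates and 3-D world coordinates is commonly estimated from six or more known point pairs with the classic direct linear transform (DLT); recovering 3-D coordinates from a single image then relies on an extra constraint such as the laser plane. The sketch below is a generic DLT illustration, not the authors' exact procedure.

```python
import numpy as np

def dlt_projection_matrix(world_pts, image_pts):
    """Estimate a 3x4 projection matrix from >= 6 known 3-D/2-D point pairs (classic DLT)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 4)   # right singular vector of the smallest singular value
```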

Development of Omnidirectional Ranging System Based on Structured Light Image

  • 신진; 이수영
    • Journal of Institute of Control, Robotics and Systems, Vol. 18, No. 5, pp. 479-486, 2012
  • In this paper, a ranging system is proposed that can measure 360-degree omnidirectional distances to objects in the environment. The ranging system is based on a structured-light imaging system with a catadioptric omnidirectional mirror. In order to make the ranging system robust against environmental illumination, efficient structured-light image processing algorithms are developed: sequential integration of difference images with modulated structured light, and a radial search based on the Bresenham line-drawing algorithm. A dedicated FPGA image processor is developed to speed up the overall image processing. The distance equation is also derived for the omnidirectional imaging system with a hyperbolic mirror. The omnidirectional ranging system is expected to be useful for mapping and localization of mobile robots. Experiments are carried out to verify the performance of the proposed ranging system. A minimal sketch of the radial Bresenham search follows this entry.
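
The radial search can be illustrated with a standard Bresenham line traversal from the mirror center outward over a difference image, stopping at the first pixel bright enough to be the structured-light return. This is a generic sketch; `threshold` and the ray length are placeholder values, and the paper's FPGA implementation is not reproduced here.

```python
import numpy as np

def bresenham(x0, y0, x1, y1):
    """Integer pixel coordinates on the line from (x0, y0) to (x1, y1)."""
    points = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return points

def radial_laser_pixel(diff_image, center, angle, threshold=60):
    """Walk outward from the mirror center along one radial line and return the
    first pixel whose difference-image intensity exceeds the threshold."""
    h, w = diff_image.shape
    cx, cy = center
    ex = int(round(cx + 2 * max(h, w) * np.cos(angle)))
    ey = int(round(cy + 2 * max(h, w) * np.sin(angle)))
    for x, y in bresenham(int(cx), int(cy), ex, ey):
        if not (0 <= x < w and 0 <= y < h):
            return None                      # left the image without a hit
        if diff_image[y, x] > threshold:
            return (x, y)
    return None
```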

Development of 3D scanner using structured light module based on variable focus lens

  • Kim, Kyu-Ha; Lee, Sang-Hyun
    • International Journal of Advanced Culture Technology, Vol. 8, No. 3, pp. 260-268, 2020
  • Most current 3D scanners are based on a laser method. However, the laser method has the disadvantages of slow scanning speed and limited precision. Optical scanners are used to compensate for these shortcomings, but their precision depends strongly on the distance to the object, and they have the disadvantage of being expensive. In this paper, a 3D scanner using a structured light module based on a variable focus lens, with improved measurement precision, was designed to be high performance, low cost, and usable in industrial fields. To this end, a telecentric optical system based on a variable focus lens was designed, and a step motor was coupled to the lens mechanism to adjust the focus of the variable lens. A connection structure with optimized scalability of the hardware circuits was designed, configuring the stepper motor into a system with a built-in processor. In addition, by applying an algorithm that simultaneously acquires high-resolution texture images and depth information, together with image synthesis technology and GPU-based high-speed structured-light processing, the system remains stable under changes in external light. The scanner was designed and implemented to further improve measurement precision. A minimal sketch of a stepper-driven focus adjustment follows this entry.
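
One common way to drive a stepper-coupled variable-focus lens is a contrast-based focus sweep, shown below as a hypothetical sketch: `capture_frame` and `move_motor_to` are placeholder callbacks standing in for the camera and motor drivers, and this is not the paper's control algorithm.

```python
import numpy as np

def sharpness(gray):
    """Simple gradient-based focus measure: higher means sharper."""
    gy, gx = np.gradient(gray.astype(float))
    return float(np.var(gx) + np.var(gy))

def focus_sweep(capture_frame, move_motor_to, steps):
    """Sweep the stepper-driven lens over candidate positions and settle on the
    step that maximizes image sharpness."""
    best_step, best_score = None, -1.0
    for s in steps:
        move_motor_to(s)                 # hypothetical motor-driver callback
        score = sharpness(capture_frame())
        if score > best_score:
            best_step, best_score = s, score
    move_motor_to(best_step)
    return best_step
```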

Laser pose calibration of ViSP for precise 6-DOF structural displacement monitoring

  • Shin, Jae-Uk; Jeon, Haemin; Choi, Suyoung; Kim, Youngjae; Myung, Hyun
    • Smart Structures and Systems, Vol. 18, No. 4, pp. 801-818, 2016
  • To estimate structural displacement, a visually servoed paired structured light system (ViSP) was proposed in previous studies. The ViSP is composed of two sides facing each other, each with one or two laser pointers, a 2-DOF manipulator, a camera, and a screen. By calculating the positions of the laser beams projected onto the screens and the rotation angles of the manipulators, the relative 6-DOF displacement between the two sides can be estimated. Although the performance of the system has been verified through various simulations and experimental tests, it has the limitation that the accuracy of the displacement measurement depends on the alignment of the laser pointers. In deriving the kinematic equation of the ViSP, the laser pointers were assumed to be installed perfectly normal to the same-side screen. In reality, however, this is very difficult to achieve due to installation errors. In other words, the pose of the laser pointers should be calibrated carefully before measuring the displacement. To calibrate the initial pose of the laser pointers, a specially designed jig device is made and employed. Experimental tests have been performed to validate the proposed calibration method, and the results show that the initial pose calibration increases the accuracy of the 6-DOF displacement estimation. A minimal sketch of why the laser pose matters follows this entry.
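
Why the laser pose matters can be seen from the ray-plane intersection that determines the projected spot on a screen. In the hypothetical example below, a 1-degree misalignment at a 1 m screen distance already shifts the spot by roughly 17 mm, which would bias the displacement estimate unless the initial pose is calibrated; the geometry is generic and the numbers are not taken from the paper.

```python
import numpy as np

def laser_spot_on_screen(origin, direction, screen_point, screen_normal):
    """Intersect a laser ray (origin + t * direction) with a planar screen."""
    direction = direction / np.linalg.norm(direction)
    t = ((screen_point - origin) @ screen_normal) / (direction @ screen_normal)
    return origin + t * direction

# A pointer assumed to be normal to the screen versus one tilted by 1 degree.
origin = np.array([0.0, 0.0, 0.0])
ideal = np.array([0.0, 0.0, 1.0])
tilted = np.array([np.sin(np.radians(1.0)), 0.0, np.cos(np.radians(1.0))])
screen_point = np.array([0.0, 0.0, 1.0])      # screen 1 m away
screen_normal = np.array([0.0, 0.0, 1.0])
spot_ideal = laser_spot_on_screen(origin, ideal, screen_point, screen_normal)
spot_tilted = laser_spot_on_screen(origin, tilted, screen_point, screen_normal)
offset = np.linalg.norm(spot_tilted - spot_ideal)   # about 0.017 m (~17 mm) of bias
```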