• Title/Summary/Keyword: 3D range finder


Profile Management System of Material Piles by Dynamic Range Finding (동적 Range 검출에 의한 원료 Pile 형상 관리 시스템)

  • 안현식
    • Proceedings of the Korea Institute of Convergence Signal Processing
    • /
    • 2000.08a
    • /
    • pp.333-336
    • /
    • 2000
  • In this paper, a profile management system consisting of global and local range finders is presented for the automation of material pile handling. A global range finder detects range data of the front part of the material piles, and a profile map is obtained from a 3D profile detection algorithm. A local range finder attached to the side of the reclaimer arm detects range data dynamically during the handling operation, and a local profile patch is acquired from these range data. A yard profile map manager constructs a map using the 3D profile from the global range finder and revises the map by replacing the corresponding region with the local profile patch obtained from the local range finder. The developed vision system was applied to a simulator, and the test results show that it is suitable for automating material handling.
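The map-revision step the abstract describes, a global yard map patched with local scans from the reclaimer arm, can be sketched on a 2D height grid. This is a minimal sketch under assumptions: the grid representation, function name, and indices are illustrative, since the paper does not publish its data structures.

```python
import numpy as np

def revise_profile_map(global_map, local_patch, row0, col0):
    """Replace a region of the yard profile map with a locally scanned
    height patch (representation is an illustrative assumption)."""
    revised = global_map.copy()
    rows, cols = local_patch.shape
    revised[row0:row0 + rows, col0:col0 + cols] = local_patch
    return revised

# Coarse map from the global range finder, revised where the local
# range finder on the reclaimer arm rescanned during handling.
global_map = np.zeros((6, 6))
patch = np.full((2, 3), 1.5)        # measured pile heights in metres
updated = revise_profile_map(global_map, patch, 2, 1)
```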


A Study on 3D Reconstruction of Urban Area

  • Park Y. M.;Kwon K. R.;Lee K. W.
    • Proceedings of the KSRS Conference
    • /
    • 2005.10a
    • /
    • pp.470-473
    • /
    • 2005
  • This paper proposes a method for reconstructing the shape and color information of 3-dimensional buildings. The proposed method scans the shape with a laser range finder and maps color information from image coordinates onto the laser coordinates using a CCD camera fixed on the laser range finder. In addition, a 'Far-View' is constructed from a high-resolution satellite image: building contours are extracted and building heights are obtained using a DEM. When the user selects a region of the 'Far-View', a detailed 3D reconstruction of the buildings in that region is displayed. The results can be applied to city planning, 3D environment games, movie backgrounds, etc.
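The color-mapping step, projecting each 3D laser point into the fixed CCD camera's image and reading off its color, can be sketched with a standard pinhole model. All matrices and the function name below are illustrative assumptions, not calibration values from the paper.

```python
import numpy as np

def color_for_point(p_laser, R, t, K, image):
    """Look up the image color for a 3D point in the range finder frame
    by projecting it through a fixed pinhole camera."""
    p_cam = R @ p_laser + t              # laser frame -> camera frame
    u, v, w = K @ p_cam                  # perspective projection
    col, row = int(round(u / w)), int(round(v / w))
    return image[row, col]

# Identity extrinsics and a toy intrinsic matrix for illustration.
R, t = np.eye(3), np.zeros(3)
K = np.array([[100.0, 0.0, 2.0], [0.0, 100.0, 2.0], [0.0, 0.0, 1.0]])
image = np.arange(25).reshape(5, 5)      # stand-in for a color image
color = color_for_point(np.array([0.0, 0.0, 1.0]), R, t, K, image)
```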


A study on the theoretical minimum resolution of the laser range finder (레이저 거리계의 이론적 최소 분해능에 관한 연구)

  • 차영엽;권대갑
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1996.10b
    • /
    • pp.644-647
    • /
    • 1996
  • In this study, a theoretical minimum resolution analysis of an active vision system using a laser range finder is performed for surrounding recognition and 3D data acquisition in unknown environments. The laser range finder consists of a slit laser beam generator, a scanning mechanism, a CCD camera, and a signal processing unit. A laser beam from the laser source is slit by a set of cylindrical lenses, and the slit beam is swept up and down and rotated by the scanning mechanism. The image of the laser beam reflected from the surface of an object is formed on the CCD array. As a result, the resolution of the range data depends on the distance between the lens center of the CCD camera and the light emitter, the view and beam angles, and the parameters of the CCD camera.
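The dependencies the abstract identifies (baseline, angles, and CCD parameters) match the textbook triangulation relation: with disparity d = b·f/z, the depth change per pixel grows quadratically with range. A hedged numeric sketch follows; this is the standard triangulation result, not necessarily the paper's exact expression.

```python
def range_resolution(b, f, z, pixel_pitch):
    """Depth change corresponding to a one-pixel disparity step in a
    triangulation range finder with baseline b and focal length f:
    d = b*f/z, so |dz| ~= z**2 / (b*f) * pixel_pitch."""
    return z * z / (b * f) * pixel_pitch

# Resolution degrades quadratically with range (values are illustrative).
near = range_resolution(b=0.1, f=0.01, z=1.0, pixel_pitch=10e-6)  # 0.01 m
far = range_resolution(b=0.1, f=0.01, z=2.0, pixel_pitch=10e-6)   # 0.04 m
```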


A study on the resolution of the laser range finder (레이저 거리계의 분해능에 관한 연구)

  • Cha, Yeong-Yeop;Yu, Chang-Mok
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.4 no.1
    • /
    • pp.82-87
    • /
    • 1998
  • In this study, a theoretical resolution analysis of an active vision system using a laser range finder is performed for surrounding recognition and 3D data acquisition in unknown environments. As a result, the resolution of the range data depends on the distance between the lens center of the CCD camera and the light emitter, the view angle, the beam angle, and the parameters of the CCD camera. The theoretical resolutions of various types of laser range finders are calculated from the parameters affecting resolution, and experimental results are obtained on a real system.


Refinements of Multi-sensor based 3D Reconstruction using a Multi-sensor Fusion Disparity Map (다중센서 융합 상이 지도를 통한 다중센서 기반 3차원 복원 결과 개선)

  • Kim, Si-Jong;An, Kwang-Ho;Sung, Chang-Hun;Chung, Myung-Jin
    • The Journal of Korea Robotics Society
    • /
    • v.4 no.4
    • /
    • pp.298-304
    • /
    • 2009
  • This paper describes an algorithm that improves 3D reconstruction results using a multi-sensor fusion disparity map. LRF (Laser Range Finder) 3D points can be projected onto image pixel coordinates using the extrinsic calibration matrices of a camera-LRF pair (${\Phi}$, ${\Delta}$) and a camera calibration matrix (K), and the LRF disparity map can then be generated by interpolating the projected LRF points. In stereo reconstruction, invalid points caused by repeated patterns and textureless regions can be compensated using the LRF disparity map; the disparity map resulting from this compensation process is the multi-sensor fusion disparity map, which is used to refine the multi-sensor 3D reconstruction based on stereo vision and the LRF. The refinement algorithm is specified in four subsections dealing with virtual LRF stereo image generation, LRF disparity map generation, multi-sensor fusion disparity map generation, and the 3D reconstruction process. It has been tested with synchronized stereo image pairs and LRF 3D scan data.
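The compensation step, substituting LRF-derived disparities where stereo matching failed on repeated patterns or textureless regions, can be sketched as a masked merge. The array layout and the invalid-pixel marker are assumptions for illustration, not the paper's format.

```python
import numpy as np

def fuse_disparity(stereo_disp, lrf_disp, invalid=-1.0):
    """Build a multi-sensor fusion disparity map: keep valid stereo
    disparities and fill invalid pixels from the interpolated LRF
    disparity map."""
    fused = stereo_disp.copy()
    mask = stereo_disp == invalid        # pixels stereo could not match
    fused[mask] = lrf_disp[mask]
    return fused

stereo = np.array([[5.0, -1.0], [-1.0, 7.0]])   # -1.0 marks failures
lrf = np.array([[5.2, 6.1], [6.0, 7.1]])        # from projected LRF points
fused = fuse_disparity(stereo, lrf)
```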


3D Simultaneous Localization and Map Building (SLAM) using a 2D Laser Range Finder based on Vertical/Horizontal Planar Polygons (2차원 레이저 거리계를 이용한 수직/수평 다각평면 기반의 위치인식 및 3차원 지도제작)

  • Lee, Seungeun;Kim, Byung-Kook
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.11
    • /
    • pp.1153-1163
    • /
    • 2014
  • An efficient 3D SLAM (Simultaneous Localization and Map Building) method is developed for urban building environments using a tilted 2D LRF (Laser Range Finder), in which the 3D map is composed of vertical/horizontal planar polygons. While the mobile robot is moving, line segments on the scan plane are successively extracted from the LRF scan distance data in each scan period. We propose an "expected line segment" concept for matching: each scan line segment is added to the most suitable line segment group for each vertical/horizontal planar polygon in the 3D map. After performing 2D localization to determine the pose of the mobile robot, we construct updated vertical/horizontal infinite planes and then determine their boundaries to obtain the vertical/horizontal planar polygons that constitute our 3D map. Finally, the proposed SLAM algorithm is validated via extensive simulations and experiments.
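The per-scan step, extracting line segments from the 2D LRF distance data before matching them to planar polygons, can be sketched with a total-least-squares line fit. This is a simplified stand-in: segmentation thresholds and the paper's "expected line segment" matching are omitted.

```python
import numpy as np

def fit_scan_line(points):
    """Fit a line to 2D LRF scan points: the line passes through the
    centroid along the principal direction of the centered points."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)   # principal direction first
    return centroid, vt[0]

# Four collinear scan points yield the diagonal direction.
centroid, direction = fit_scan_line([[0, 0], [1, 1], [2, 2], [3, 3]])
```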

Estimation of People Tracking by Kalman Filter based Observations from Laser Range Sensor (레이저스케너 센서기반의 칼만필터 관측을 이용한 사람이동예측)

  • Jin, Taeseok
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.22 no.3
    • /
    • pp.265-272
    • /
    • 2019
  • For tracking a varying number of people with a laser range finder, it is important to handle the appearance and disappearance of people due to various causes, including occlusions. We propose a method for tracking people with automatic initialization by integrating observations from the laser range finder. In our method, the problem of estimating the 2D positions and walking directions of multiple people is formulated based on a mixture Kalman filter. Proposal distributions of the Kalman filter are constructed using a mixture model that incorporates information from the laser range scanner. Our experimental results demonstrate the effectiveness and robustness of the method.
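The core recursion behind such a tracker can be sketched as one predict/update cycle of a constant-velocity Kalman filter on a person's 2D position. This is a minimal sketch, not the paper's mixture formulation; the state layout and noise levels q, r are illustrative assumptions.

```python
import numpy as np

def kalman_step(x, P, z, dt=0.1, q=0.01, r=0.05):
    """One predict/update cycle of a constant-velocity Kalman filter
    tracking a person's 2D position from laser range observations.
    State x = [px, py, vx, vy]."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt                       # position += velocity*dt
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1  # LRF observes position only
    x, P = F @ x, F @ P @ F.T + q * np.eye(4)    # predict
    S = H @ P @ H.T + r * np.eye(2)              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (z - H @ x)                      # correct with measurement
    return x, (np.eye(4) - K @ H) @ P

# One step from rest toward a laser observation at (1.0, 0.5).
x, P = kalman_step(np.zeros(4), np.eye(4), z=np.array([1.0, 0.5]))
```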

Registration of Multiple Range Views (복수의 거리영상 간의 변환계수의 추출)

  • 정도현;윤일동;이상욱
    • Journal of the Korean Institute of Telematics and Electronics S
    • /
    • v.34S no.2
    • /
    • pp.52-62
    • /
    • 1997
  • To reconstruct the complete 3-D shape of an object, several range images from different viewpoints should be merged into a single model. The process of extracting the transformation parameters between multiple range views is called registration. In this paper, we propose a new algorithm to find the transformation parameters between multiple range views. The proposed algorithm consists of two steps: an initial estimation and an iterative update of the transformation. To estimate the initial transformation, we modify the principal axes by considering the projection effect due to the difference of viewpoints. Then, to extract the exact transformation parameters between the range views, the following process is iterated: for every point of the common region, find the nearest point among the neighborhood of the current corresponding point, whose correspondence is defined by the reverse calibration of the range finder; then update the transformation to satisfy the new correspondences. To evaluate the performance of the proposed registration algorithm, experiments are performed on real range data acquired by a space-encoding range finder. The experimental results show that the proposed initial estimation accelerates the subsequent iterative registration step.
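One iteration of the update loop the abstract describes, matching points to their nearest counterparts and then re-solving the rigid transform, can be sketched as an ICP-style step. Brute-force nearest-neighbour search stands in for the paper's reverse-calibration correspondence, and all names are illustrative.

```python
import numpy as np

def icp_step(source, target):
    """Match each source point to its nearest target point, then solve
    the rigid transform (R, t) aligning the pairs by SVD (Procrustes)."""
    src, tgt = np.asarray(source, float), np.asarray(target, float)
    # Nearest target point for every source point.
    d = np.linalg.norm(src[:, None, :] - tgt[None, :, :], axis=2)
    matched = tgt[d.argmin(axis=1)]
    # Best-fit rotation and translation between the matched point sets.
    cs, cm = src.mean(axis=0), matched.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (matched - cm))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cm - R @ cs
    return R, t

# A slightly translated copy of the target is recovered in one step.
target = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
R, t = icp_step(target - np.array([0.1, 0.0]), target)
```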
