• Title/Summary/Keyword: Structured stripe system


Distance Measurement Based on Structured Light Image for Mobile Robots (이동로봇을 위한 구조광 영상기반 거리측정)

  • Yi, Soo-Yeong;Hong, Young-Jin;Suh, Jin-Ho
    • Journal of Institute of Control, Robotics and Systems / v.16 no.1 / pp.18-24 / 2010
  • In this paper, we address an active ranging system based on laser structured-light images for mobile robot applications. Because the burdensome correspondence problem is avoided, structured-light image processing is computationally more efficient than conventional stereo image processing. By placing a cylindrical lens in front of the laser, the proposed system converts a point laser into a stripe laser without a motorized scan. To achieve robustness against environmental illumination noise, we propose an efficient integro-differential image processing algorithm. The proposed system has an embedded image processing module and transmits distance data, reducing the computational burden on the main control system.
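
As a rough illustration of the triangulation step behind such a stripe-laser ranger (not the paper's integro-differential algorithm), the sketch below converts detected stripe pixel columns into depths. It assumes the stripe laser is mounted parallel to the camera's optical axis at a known baseline; the function name and calibration values are illustrative only.

```python
import numpy as np

def stripe_depth(u_pixels, f_px, baseline_m, u0):
    """Depth from laser-stripe pixel columns by triangulation.

    Assumes the stripe laser is mounted parallel to the camera's optical
    axis at a horizontal baseline (illustrative geometry, not necessarily
    the paper's exact setup).
    """
    disparity = np.asarray(u_pixels, dtype=float) - u0
    disparity[np.abs(disparity) < 1e-6] = np.nan   # avoid division by zero
    return f_px * baseline_m / disparity

# Example: stripe detected at columns 420..424 in a 640-pixel-wide image
depths = stripe_depth([420, 421, 422, 423, 424],
                      f_px=600.0, baseline_m=0.10, u0=320.0)
print(depths)  # distances in metres for each stripe pixel
```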

Development of 3D Scanner Based on Laser Structured-light Image (레이저 구조광 영상기반 3차원 스캐너 개발)

  • Ko, Young-Jun;Yi, Soo-Yeong;Lee, Jun-O
    • Journal of Institute of Control, Robotics and Systems / v.22 no.3 / pp.186-191 / 2016
  • This paper addresses the development of a 3D data acquisition system (3D scanner) based on laser structured-light images. The 3D scanner consists of a stripe laser generator, a conventional camera, and a rotation table. The laser stripe projected onto an object is distorted according to the object's 3D shape. By analyzing this distortion in the camera image, the scanner obtains a set of 3D point data for the object. A simple semiconductor stripe laser diode is adopted instead of an expensive LCD projector for complex structured-light patterns. The camera is fitted with an optical filter to remove illumination noise and improve the distance measurement performance. Experimental results show that the scanner acquires 3D data with less than 0.2 mm measurement error in 2 minutes. It is possible to reconstruct the 3D shape of an object and to reproduce the object with a commercially available 3D printer.
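
A minimal sketch of how per-stripe 3D points might be merged into one point cloud using the turntable angle: it assumes each stripe has already been triangulated into camera-frame points and that the rotation axis is parallel to the camera's Y axis, a simplification of the real scanner geometry. All names and values are illustrative.

```python
import numpy as np

def rotate_to_object_frame(points_cam, table_angle_rad, axis_offset):
    """Map camera-frame stripe points into the turntable (object) frame.

    points_cam: (N, 3) triangulated points of one stripe
    table_angle_rad: turntable angle when this stripe was captured
    axis_offset: (3,) position of the rotation axis in the camera frame
    Assumes the rotation axis is the camera Y axis (illustrative).
    """
    c, s = np.cos(-table_angle_rad), np.sin(-table_angle_rad)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])
    return (points_cam - axis_offset) @ R.T

# Example with synthetic data: two stripes of three points each
axis_offset = np.array([0.0, 0.0, 0.30])          # axis 30 cm in front of camera
all_stripes = [np.random.rand(3, 3) * 0.05 + axis_offset for _ in range(2)]
cloud = []
for step, stripe_points in enumerate(all_stripes):
    angle = step * np.deg2rad(1.0)                # e.g. 1 degree per table step
    cloud.append(rotate_to_object_frame(stripe_points, angle, axis_offset))
point_cloud = np.vstack(cloud)
print(point_cloud.shape)                          # (6, 3)
```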

Automatic Registration of Two Parts using Robot with Multiple 3D Sensor Systems

  • Ha, Jong-Eun
    • Journal of Electrical Engineering and Technology / v.10 no.4 / pp.1830-1835 / 2015
  • In this paper, we propose an algorithm for the automatic registration of two rigid parts using multiple 3D sensor systems on a robot. Four structured laser stripe systems, each consisting of a camera and a visible laser stripe, are used for the acquisition of 3D information. Detailed procedures are presented, including extrinsic calibration among the four 3D sensor systems and hand/eye calibration of the 3D sensing system on the robot arm. We find the best pose using a search-based pose estimation algorithm whose cost function reflects geometric constraints between the sensor systems and the target objects. A pose with minimum gap and height difference is found by greedy search. Experimental results using a demo system show the robustness and feasibility of the proposed algorithm.
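
The greedy search itself can be sketched as a simple coordinate descent over a 6-DoF pose. The cost function below is a toy stand-in for the paper's gap-and-height-difference cost, and the step sizes, shrink factor, and iteration limits are illustrative assumptions.

```python
import numpy as np

def greedy_pose_search(cost_fn, pose0,
                       step=np.array([1.0, 1.0, 1.0, 0.5, 0.5, 0.5]),
                       shrink=0.5, min_step=1e-3, max_iter=200):
    """Greedy coordinate search over a 6-DoF pose (x, y, z, roll, pitch, yaw).

    At each iteration, try +/- step along every pose axis and keep the best
    improving move; when no move improves the cost, shrink the step.
    cost_fn(pose) stands in for the gap + height-difference cost.
    """
    pose, best = np.asarray(pose0, dtype=float), cost_fn(pose0)
    for _ in range(max_iter):
        improved = False
        for axis in range(len(pose)):
            for sign in (+1.0, -1.0):
                cand = pose.copy()
                cand[axis] += sign * step[axis]
                c = cost_fn(cand)
                if c < best:
                    pose, best, improved = cand, c, True
        if not improved:
            step = step * shrink            # refine the search resolution
            if np.all(step < min_step):
                break
    return pose, best

# Toy example: "cost" is the squared distance to a known target pose
target = np.array([10.0, -5.0, 2.0, 0.0, 3.0, 0.0])
pose, cost = greedy_pose_search(lambda p: np.sum((np.asarray(p) - target) ** 2),
                                pose0=np.zeros(6))
print(pose.round(2), cost)
```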

Calibration of Structured Light Vision System using Multiple Vertical Planes

  • Ha, Jong Eun
    • Journal of Electrical Engineering and Technology / v.13 no.1 / pp.438-444 / 2018
  • The structured light vision system has been widely used in 3D surface profiling. Usually, it is composed of a camera and a laser that projects a line onto the target. Calibration is necessary to acquire 3D information with a structured light stripe vision system. Conventional calibration algorithms find the pose of the camera and the equation of the laser stripe plane in the camera coordinate system; therefore, 3D reconstruction is only possible in the camera frame. In most cases this is sufficient for the given tasks, but such algorithms require multiple images acquired under different poses. In this paper, we propose a calibration algorithm that works with just one shot and enables 3D reconstruction in both the camera and laser frames. This is achieved with a newly designed calibration structure that has multiple vertical planes on a ground plane. The ability to reconstruct in both the camera and laser frames gives more flexibility in applications, and the proposed algorithm also improves the accuracy of 3D reconstruction.
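
One building block of such a calibration is estimating the equation of the laser stripe plane from 3D points known to lie on it. The sketch below is a generic least-squares plane fit, not the paper's single-shot algorithm with vertical planes; the synthetic data and function name are illustrative.

```python
import numpy as np

def fit_laser_plane(points):
    """Least-squares plane fit for laser-plane calibration.

    points: (N, 3) 3D points known to lie on the laser stripe plane
    (e.g. stripe points observed on calibration planes of known geometry).
    Returns a unit normal n and offset d such that n . p + d = 0.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]                     # direction of least variance = plane normal
    d = -n.dot(centroid)
    return n, d

# Example: noisy points on the plane z = 0.5 x + 0.1
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
z = 0.5 * xy[:, 0] + 0.1 + rng.normal(0, 1e-3, 200)
n, d = fit_laser_plane(np.column_stack([xy, z]))
print(n, d)
```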

Implementation of vision system for a mobile robot using pulse phase difference & structured light (펄스 위상차와 스트럭춰드 라이트를 이용한 이동 로봇 시각 장치 구현)

  • 방석원;정명진;서일홍;오상록
    • 제어로봇시스템학회: 학술대회논문집 (ICROS Conference Proceedings) / 1991.10a / pp.652-657 / 1991
  • To date, the application areas of mobile robots have expanded, and many types of LRF (Laser Range Finder) systems have been developed to acquire three-dimensional information about unknown environments. In the real world, however, various noise sources (sunlight, fluorescent light) make it difficult to separate the reflected laser light from this noise. To overcome this restriction, we have developed a new type of vision system that enables a mobile robot to measure the distance to an object located 1-5 m ahead with an error of less than 2%. The separation and detection algorithm used in this system combines a pulse phase difference method with multi-stripe structured light. The effectiveness and feasibility of the proposed vision system are demonstrated by 3-D maps of detected objects and a computation time analysis.
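
The abstract does not detail the pulse phase difference method, but the underlying idea of separating a pulsed laser from ambient light can be illustrated by differencing a laser-on frame and a laser-off frame. This sketch is a simplified stand-in for the paper's approach; the threshold and synthetic frames are arbitrary assumptions.

```python
import numpy as np

def extract_pulsed_stripes(frame_laser_on, frame_laser_off, threshold=30):
    """Separate a pulsed laser pattern from ambient light by differencing a
    laser-on frame and a laser-off frame of the same scene.

    Simplified illustration of synchronized-pulse ambient-light rejection,
    not the pulse-phase-difference method of the paper.
    """
    diff = frame_laser_on.astype(np.int16) - frame_laser_off.astype(np.int16)
    mask = diff > threshold          # ambient light cancels, laser stripes remain
    return mask

# Example with synthetic 8-bit frames
ambient = np.full((120, 160), 90, dtype=np.uint8)
on = ambient.copy()
on[40:43, :] = 230                   # one horizontal laser stripe
mask = extract_pulsed_stripes(on, ambient)
print(mask.sum())                    # number of stripe pixels detected
```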


T-joint Laser Welding of Circular and Square Pipes Using the Vision Tracking System (용접선 추적 비전장치를 이용한 원형-사각 파이프의 T형 조인트 레이저용접)

  • Son, Yeong-Il;Park, Gi-Yeong;Lee, Gyeong-Don
    • Laser Solutions / v.12 no.1 / pp.19-24 / 2009
  • Because of its fast and precise welding performance, laser welding is becoming an excellent new welding method. However, precise focusing and robust seam tracking are required to apply laser welding in practical settings. In order to laser weld a T-joint such as a circular pipe on a square pipe, which can be found in three-dimensional structures such as an aluminum space frame, a visual sensor system was developed to automate focusing and seam tracking. The developed sensor system consists of a digital CCD camera, a structured laser, and a vision processor. It is moved and positioned by a 2-axis motorized stage attached, together with a laser welding head, to a 6-axis robot manipulator. After the stripe-type structured laser illuminates the target surface, images are captured through the digital CCD camera. From each image, the seam error and defocusing error are calculated using image processing algorithms that include efficient techniques for handling continuously changing image patterns. These errors are corrected by the stage off-line during welding or teaching. Laser welding of a circular pipe on a square pipe was performed successfully with the vision tracking system, reducing the path positioning and defocusing errors caused by robot teaching or geometrical variation of the specimens and jig holding.
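
A hedged sketch of how seam and defocus errors might be derived from a single stripe image: the stripe row is found for each column, the seam is taken at the largest discontinuity in that profile, and defocus is read from the shift of the mean stripe row. The scale factors and reference values are placeholders, not the sensor's real calibration, and the discontinuity heuristic is only one possible feature extractor.

```python
import numpy as np

def stripe_errors(image, ref_col, ref_row, mm_per_px, mm_per_row):
    """Seam and defocus errors from one stripe image (illustrative sketch).

    Per column, the brightest row is taken as the stripe position.  The seam
    is located at the largest jump in that row profile (the kink where the
    two pipe surfaces meet); the defocus error is the shift of the mean
    stripe row from a calibrated reference row.  All scales are placeholders.
    """
    img = np.asarray(image, dtype=float)
    stripe_rows = img.argmax(axis=0)                  # brightest row per column
    jump = np.abs(np.diff(stripe_rows))
    seam_col = int(np.argmax(jump))                   # largest discontinuity
    seam_error_mm = (seam_col - ref_col) * mm_per_px
    defocus_error_mm = (stripe_rows.mean() - ref_row) * mm_per_row
    return seam_error_mm, defocus_error_mm

# Synthetic stripe: flat at row 50, stepping to row 60 at column 80
img = np.zeros((120, 160))
img[50, :80] = 255
img[60, 80:] = 255
print(stripe_errors(img, ref_col=80, ref_row=55, mm_per_px=0.1, mm_per_row=0.2))
```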


A Bead Shape Classification Method using Neural Network in High Frequency Electric Resistance Welding (신경회로망을 이용한 고주파 전기 저항 용접 파이프의 비드 형상 분류)

  • Ko, K.W.;Kim, J.H.;Kong, W.I.
    • Journal of the Korean Society for Precision Engineering / v.12 no.9 / pp.86-94 / 1995
  • The bead shape in high-frequency electric resistance welding (HERW) of pipes gives useful information for judging the current welding condition. In most welding processes, heat input is controlled by skilled operators observing the color and shape of the bead. In this paper, a visual monitoring system is designed to observe the bead shape in the HERW pipe welding process using a structured light beam and a CID (charge injection device) camera. To avoid the difficulties that arise in extracting stable features of the stripe pattern and classifying the extracted features, a Kohonen neural network is used to classify the bead shapes. The experimental results show the accurate classification performance of the proposed method.
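
A minimal, generic Kohonen self-organizing map for clustering bead-shape feature vectors is sketched below. The feature choice (bead width and height), the network size, and the training schedule are illustrative, not the paper's configuration.

```python
import numpy as np

class KohonenSOM:
    """Minimal 1-D Kohonen self-organizing map for clustering bead-shape
    feature vectors (e.g. bead width, height, asymmetry extracted from the
    stripe image).  A generic SOM sketch, not the paper's exact network."""

    def __init__(self, n_nodes, n_features, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(n_nodes, n_features))

    def train(self, X, epochs=50, lr0=0.5, sigma0=2.0):
        X = np.asarray(X, dtype=float)
        for epoch in range(epochs):
            lr = lr0 * (1.0 - epoch / epochs)
            sigma = max(sigma0 * (1.0 - epoch / epochs), 0.5)
            for x in X:
                bmu = np.argmin(np.linalg.norm(self.w - x, axis=1))  # best matching unit
                dist = np.abs(np.arange(len(self.w)) - bmu)
                h = np.exp(-(dist ** 2) / (2.0 * sigma ** 2))        # neighbourhood
                self.w += lr * h[:, None] * (x - self.w)
        return self

    def classify(self, x):
        return int(np.argmin(np.linalg.norm(self.w - np.asarray(x, float), axis=1)))

# Toy features: [bead width, bead height] in arbitrary units
X = np.array([[2.0, 0.5], [2.1, 0.6], [4.0, 1.5], [4.2, 1.4], [6.0, 0.2], [6.1, 0.3]])
som = KohonenSOM(n_nodes=3, n_features=2).train(X)
print([som.classify(x) for x in X])   # similar bead shapes map to the same node
```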


Depth Measurement System Using Structured Light, Rotational Plane Mirror and Mono-Camera (선형 레이저와 회전 평면경 및 단일 카메라를 이용한 거리측정 시스템)

  • Yoon Chang-Bae;Kim Hyong-Suk;Lin Chun-Shin;Son Hong-Rak;Lee Hye-Jeong
    • Journal of Institute of Control, Robotics and Systems / v.11 no.5 / pp.406-410 / 2005
  • A depth measurement system that consists of a single camera, a laser light source, and a rotating mirror is investigated. The camera and the light source are fixed, facing the rotating mirror. The laser light is reflected by the mirror and projected onto the scene objects whose locations are to be determined. The camera detects the laser light location on the object surfaces through the same mirror. The scan over the area to be measured is performed by mirror rotation. The advantages are that 1) the image of the light stripe remains sharp while that of the background becomes blurred because of the mirror rotation, and 2) the only rotating part of the system is the mirror, yet the mirror angle is not involved in the depth computation. This minimizes the imprecision caused by a possibly inaccurate angle measurement. The detailed arrangement and experimental results are reported.
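
Reusing the triangulation relation sketched earlier, each frame of such a scanner can be reduced to one depth profile and the profiles stacked over the mirror scan; as the abstract notes, the mirror angle never enters the computation. The parallel-ray geometry and the calibration constants below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def frame_to_depth_profile(frame, f_px, baseline_m, u0):
    """Reduce one camera frame to a depth profile along the laser stripe.

    For each image row, the brightest column is taken as the stripe position
    and converted to depth with the triangulation relation Z = f * b / (u - u0).
    Calibration constants and geometry are illustrative placeholders.
    """
    img = np.asarray(frame, dtype=float)
    u = img.argmax(axis=1).astype(float)          # stripe column in each row
    disparity = u - u0
    disparity[np.abs(disparity) < 1e-6] = np.nan
    return f_px * baseline_m / disparity

# Stack profiles over the mirror scan into a depth map (rows x scan steps)
frames = []
for k in range(10):                               # synthetic scan: stripe drifts right
    f = np.zeros((100, 200))
    f[:, 150 + k] = 1.0
    frames.append(f)
depth_map = np.column_stack([frame_to_depth_profile(f, 600.0, 0.08, 100.0)
                             for f in frames])
print(depth_map.shape)                            # (100, 10)
```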

A Study on the Image Processing of Visual Sensor for Weld Seam Tracking in GMA Welding

  • Kim, J.-W.;Chung, K.-C.
    • International Journal of Korean Welding Society / v.1 no.2 / pp.23-29 / 2001
  • In this study, a preview-sensing visual sensor system is constructed for weld seam tracking in GMA welding. The visual sensor system consists of a CCD camera, a diode laser with a cylindrical lens, and a band-pass filter to overcome image degradation due to spatter and/or arc light. Among image processing methods, the Hough transform is compared with the central difference method in terms of the ability to extract accurate feature positions. The results show that the Hough transform extracts the feature positions more accurately and can be applied to real-time weld seam tracking. Image processing based on the Hough transform is carried out to extract the straight lines that describe the laser stripe; after extracting the lines, the weld joint position and edge points are determined by intersecting them. Even when the image includes a spatter trace, it is still possible to recognize the position of the weld joint. Weld seam tracking was precisely implemented using the Hough transform, and the weld seam can be tracked for offset angles within ±15°.
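
A simplified sketch of the Hough-transform step: detect the two dominant stripe lines and intersect them to locate the weld joint. It uses OpenCV's cv2.HoughLines; the vote threshold, the 20-degree orientation gate, and the assumption of exactly two stripe segments are illustrative choices, not the paper's parameters.

```python
import cv2
import numpy as np

def weld_joint_from_stripe(binary_stripe_img):
    """Locate the weld joint as the intersection of the two straight segments
    of the laser stripe found by the Hough transform (simplified sketch)."""
    found = cv2.HoughLines(binary_stripe_img, 1, np.pi / 180, threshold=60)
    if found is None or len(found) < 2:
        return None
    lines = found[:, 0, :]                             # rows of (rho, theta)
    r1, t1 = lines[0]                                  # strongest line
    for r2, t2 in lines[1:]:                           # next line with a clearly
        if abs(t2 - t1) > np.deg2rad(20):              # different orientation
            break
    else:
        return None
    A = np.array([[np.cos(t1), np.sin(t1)],
                  [np.cos(t2), np.sin(t2)]])
    x, y = np.linalg.solve(A, np.array([r1, r2]))      # x cos(t) + y sin(t) = rho
    return float(x), float(y)

# Synthetic V-shaped stripe: two segments meeting near (100, 100)
img = np.zeros((200, 200), dtype=np.uint8)
cv2.line(img, (0, 50), (100, 100), 255, 1)
cv2.line(img, (100, 100), (199, 50), 255, 1)
print(weld_joint_from_stripe(img))
```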


A Study on the Image Processing of Visual Sensor for Weld Seam Tracking in GMA Welding (GMA 용접에서 용접선 추적용 시각센서의 화상처리에 관한 연구)

  • 정규철;김재웅
    • Journal of Welding and Joining / v.18 no.3 / pp.60-67 / 2000
  • In this study, we constructed a preview-sensing visual sensor system for weld seam tracking in GMA welding. The visual sensor consists of a CCD camera, a diode laser with a cylindrical lens, and a band-pass filter to overcome image degradation due to spatter and/or arc light. To obtain the weld joint position and edge points accurately from the captured image, we compared the Hough transform with the central difference method. The results show that the Hough transform extracts these points more accurately and can be applied to real-time weld seam tracking. Image processing is carried out to extract the straight lines that describe the laser stripe; after extracting the lines, the weld joint position and edge points are determined from the intersection points of the lines. Even when a spatter trace appears in the image, it is possible to recognize the position of the weld joint. Weld seam tracking was precisely implemented using the Hough transform, and the weld seam can be tracked for offset angles within ±15°.
