• Title/Abstract/Keywords: laser vision sensor

Search results: 170 items

Experimental Study of Spacecraft Pose Estimation Algorithm Using Vision-based Sensor

  • Hyun, Jeonghoon;Eun, Youngho;Park, Sang-Young
    • Journal of Astronomy and Space Sciences / Vol. 35, No. 4 / pp. 263-277 / 2018
  • This paper presents a vision-based relative pose estimation algorithm and its validation through both numerical and hardware experiments. The algorithm and the hardware system were designed simultaneously, taking actual experimental conditions into account. Two estimation techniques were utilized to estimate the relative pose: a nonlinear least-squares method for the initial estimation, and an extended Kalman filter for the subsequent on-line estimation. A measurement model of the vision sensor and equations of motion including nonlinear perturbations were utilized in the estimation process. Numerical simulations were performed and analyzed for both autonomous docking and formation flying scenarios. A configuration of LED-based beacons was designed to avoid measurement singularity, and its structural information was implemented in the estimation algorithm. The proposed algorithm was then verified in the experimental environment using the Autonomous Spacecraft Test Environment for Rendezvous In proXimity (ASTERIX) facility. Additionally, a laser distance meter was added to the estimation algorithm to improve the relative position estimation accuracy. By confirming how the estimation accuracy changes with the level of measurement error, the study establishes the measurement performance required for autonomous docking. In addition, the hardware experiments confirmed the effectiveness of the suggested algorithm and its applicability to actual tasks in the real world.
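
As a rough illustration of the initial estimation step described in this abstract, the sketch below fits a relative pose to image measurements of LED beacons by nonlinear least squares. The beacon layout, camera focal length, and Euler-angle parametrization are illustrative assumptions, not values from the paper.

```python
# Hypothetical nonlinear least-squares initial pose fit from beacon centroids.
import numpy as np
from scipy.optimize import least_squares

FOCAL_PX = 1200.0                      # assumed focal length in pixels
BEACONS = np.array([[0.1, 0.0, 0.0],   # assumed LED beacon positions in the
                    [-0.1, 0.0, 0.0],  # target body frame [m], chosen non-coplanar
                    [0.0, 0.1, 0.0],
                    [0.0, 0.0, 0.15]])

def rotation(euler):
    """ZYX Euler angles -> rotation matrix."""
    a, b, c = euler
    cz, sz, cy, sy, cx, sx = np.cos(a), np.sin(a), np.cos(b), np.sin(b), np.cos(c), np.sin(c)
    rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return rz @ ry @ rx

def project(pose):
    """Pinhole projection of all beacons for pose = [x, y, z, yaw, pitch, roll]."""
    t, euler = pose[:3], pose[3:]
    cam = BEACONS @ rotation(euler).T + t          # beacons in the camera frame
    return FOCAL_PX * cam[:, :2] / cam[:, 2:3]     # (u, v) per beacon, origin at center

def initial_pose(measured_uv, guess):
    """Nonlinear least-squares initial estimate from measured beacon centroids."""
    residual = lambda p: (project(p) - measured_uv).ravel()
    return least_squares(residual, guess).x

# Example: recover a known pose from its own (noisy) projections.
true_pose = np.array([0.2, -0.1, 3.0, 0.05, -0.02, 0.1])
uv = project(true_pose) + np.random.normal(0, 0.3, (4, 2))
print(initial_pose(uv, guess=np.array([0.0, 0.0, 2.5, 0.0, 0.0, 0.0])))
```

In the paper's pipeline, an estimate like this would seed the extended Kalman filter for the subsequent on-line estimation.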

비전센서를 이용한 자동추적장치에 관한 연구 (A Study on Automatic Seam Tracking using Vision Sensor)

  • 전진환;조택동;양상민
    • Proceedings of the Korean Society for Precision Engineering Conference / 1995 KSPE Autumn Conference / pp. 1105-1109 / 1995
  • A CCD camera integrated into the vision system was used to realize an automatic seam-tracking system, and the 3-D information needed to generate the torch path was obtained using a laser slit beam. To extract the laser stripe and obtain the welding-specific point, an adaptive Hough transformation was used. Although the basic Hough transformation takes too much time to process images on-line, it tends to be robust to noise such as spatter. For that reason, it was complemented with the adaptive Hough transformation to provide an on-line processing ability for locating the welding-specific point. The dead zone, where sensing of the weld line is impossible, is eliminated by rotating the camera about an axis centered at the welding torch. The camera angle is controlled so as to acquire the minimum image data needed for sensing the weld line, hence the image processing time is reduced. A fuzzy controller is adopted to control the camera angle.
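
The sketch below shows the basic (non-adaptive) Hough transform underlying the stripe extraction described above: a binarized laser-stripe image votes into a (rho, theta) accumulator and the strongest line is returned. The image source and resolutions are placeholders; the adaptive variant in the paper would additionally restrict the search window around the previous frame's peak.

```python
# Minimal Hough-transform line detector for a binarized laser stripe (assumed input).
import numpy as np

def hough_dominant_line(binary_img, n_theta=180, n_rho=400):
    """Return (rho, theta) of the strongest line in a binary image."""
    ys, xs = np.nonzero(binary_img)
    diag = np.hypot(*binary_img.shape)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-diag, diag, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=np.int32)
    for theta_idx, theta in enumerate(thetas):
        # rho = x*cos(theta) + y*sin(theta) for every stripe pixel
        rho_vals = xs * np.cos(theta) + ys * np.sin(theta)
        rho_idx = np.clip(np.digitize(rho_vals, rhos), 0, n_rho - 1)
        np.add.at(acc, (rho_idx, theta_idx), 1)
    best = np.unravel_index(np.argmax(acc), acc.shape)
    return rhos[best[0]], thetas[best[1]]

# Synthetic test: a bright diagonal stripe on a dark background.
img = np.zeros((200, 200), dtype=np.uint8)
for x in range(200):
    img[int(0.5 * x + 20), x] = 1
print(hough_dominant_line(img))
```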


자동 아크 용접공정의 용접개선변수 측정을 위한 시각 시스템 (Development of a vision sensor for measuring the weld groove parameters in arc welding process)

  • 김호학;부광석;조형석
    • Journal of Welding and Joining / Vol. 8, No. 2 / pp. 58-69 / 1990
  • In conventional arc welding, position error of the weld torch with respect to the weld seam and variation of the groove dimensions are induced by inaccurate fit-up and fixturing. In this study, a vision system has been developed to recognize and compensate for the position error and dimensional inaccuracy. The system uses a structured laser light illuminated on the weld groove and perceived by a CCD camera. A new algorithm to detect the edge of the reflected laser light is introduced for real-time processing. The developed system was applied to arbitrary weld paths with various types of joints in the arc welding process. The experimental results show that the proposed system can detect the weld groove parameters with good accuracy and yields good tracking performance.
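
In the spirit of the real-time laser-edge detection mentioned above, the sketch below extracts a sub-pixel stripe profile by taking, per image column, the intensity-weighted centroid of pixels above a threshold, and then flags abrupt jumps as candidate groove edges. The threshold, jump size, and synthetic image are illustrative assumptions, not the paper's algorithm.

```python
# Hypothetical one-pass laser stripe profiler and groove-edge finder.
import numpy as np

def stripe_profile(gray, threshold=50):
    """Return, per column, the sub-pixel row of the laser stripe (NaN if absent)."""
    h, w = gray.shape
    rows = np.arange(h, dtype=float)[:, None]
    weights = np.where(gray >= threshold, gray.astype(float), 0.0)
    col_sum = weights.sum(axis=0)
    profile = np.full(w, np.nan)
    valid = col_sum > 0
    profile[valid] = (weights * rows).sum(axis=0)[valid] / col_sum[valid]
    return profile

def edge_columns(profile, jump=3.0):
    """Columns where the stripe height jumps abruptly (candidate groove edges)."""
    return np.where(np.abs(np.diff(profile)) > jump)[0]

# Synthetic V-groove stripe image for a quick check.
img = np.zeros((120, 160), dtype=np.uint8)
for x in range(160):
    row = 60 + int(20 - abs(x - 80) * 0.25) if 40 <= x <= 120 else 60
    img[row, x] = 200
print(edge_columns(stripe_profile(img)))
```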


다중센서 융합 상이 지도를 통한 다중센서 기반 3차원 복원 결과 개선 (Refinements of Multi-sensor based 3D Reconstruction using a Multi-sensor Fusion Disparity Map)

  • 김시종;안광호;성창훈;정명진
    • The Journal of Korea Robotics Society / Vol. 4, No. 4 / pp. 298-304 / 2009
  • This paper describes an algorithm that improves a 3D reconstruction result using a multi-sensor fusion disparity map. LRF (Laser Range Finder) 3D points can be projected onto image pixel coordinates using the extrinsic calibration matrices of the camera-LRF pair (Φ, Δ) and the camera calibration matrix (K). The LRF disparity map can be generated by interpolating the projected LRF points. In the stereo reconstruction, invalid points caused by repeated patterns and textureless regions can be compensated using the LRF disparity map. The disparity map resulting from this compensation process is the multi-sensor fusion disparity map, with which the multi-sensor 3D reconstruction based on stereo vision and the LRF can be refined. The refinement algorithm for the multi-sensor based 3D reconstruction is described in four subsections dealing with virtual LRF stereo image generation, LRF disparity map generation, multi-sensor fusion disparity map generation, and the 3D reconstruction process. It has been tested with synchronized stereo image pairs and LRF 3D scan data.
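
The projection step described above can be sketched as follows: LRF points are transformed into the camera frame with an extrinsic rotation and translation (the paper's Φ and Δ), projected with K, and converted to disparity via d = f·B/Z for a stereo baseline B. All numeric values below are placeholders, not the paper's calibration.

```python
# Hedged sketch: scatter LRF points into a sparse, image-sized disparity map.
import numpy as np

K = np.array([[700.0, 0.0, 320.0],      # assumed intrinsics
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
PHI = np.eye(3)                          # assumed camera-LRF rotation
DELTA = np.array([0.0, -0.10, 0.05])     # assumed camera-LRF translation [m]
BASELINE = 0.12                          # assumed stereo baseline [m]

def lrf_disparity_points(lrf_points):
    """Project LRF points; return integer pixel coords and disparity values."""
    cam = lrf_points @ PHI.T + DELTA                 # points in the camera frame
    cam = cam[cam[:, 2] > 0.1]                       # keep points in front of the camera
    uvw = cam @ K.T                                  # homogeneous pixel coordinates
    uv = np.rint(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    disparity = K[0, 0] * BASELINE / cam[:, 2]       # d = f*B/Z
    return uv, disparity

def sparse_disparity_map(lrf_points, shape=(480, 640)):
    """Scatter projected LRF disparities into an image-sized map (NaN = no data)."""
    uv, disp = lrf_disparity_points(lrf_points)
    dmap = np.full(shape, np.nan)
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < shape[1]) & (uv[:, 1] >= 0) & (uv[:, 1] < shape[0])
    dmap[uv[ok, 1], uv[ok, 0]] = disp[ok]
    return dmap   # the paper interpolates this sparse map before fusing with stereo

pts = np.random.uniform([-1, -1, 1], [1, 1, 5], size=(1000, 3))  # stand-in LRF scan
print(np.nanmean(sparse_disparity_map(pts)))
```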


구조화 레이저를 이용한 PC 기반 인-라인 검사 시스템 개발에 관한 연구 (A Study on Development of PC Based In-Line Inspection System with Structure Light Laser)

  • 신찬배;김진대;임학규;이재원
    • Journal of the Korean Society for Precision Engineering / Vol. 22, No. 11 / pp. 82-90 / 2005
  • Recently, in-line vision inspection has become a growing research area in visual control systems and intelligent robotics, where an exact three-dimensional pose is required. The objective of this article is to study PC-based in-line visual inspection with a hand-eye structure. This paper suggests a three-dimensional structured-light measuring principle and a design method for the laser sensor head. The hand-eye laser sensor has been studied for a long time; however, it is very difficult to perform a kinematic analysis between the laser sensor and the robot because complicated mathematical procedures are needed in real environments. To address this problem, this paper proposes an auto-calibration concept and describes the detailed procedure of this methodology. A new thinning algorithm and a constrained Hough transform method are also explained. Consequently, the developed in-line inspection module demonstrates successful operation on hole, gap, width, and V-edge features.
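
As a small sketch of the structured-light measuring principle mentioned above: each stripe pixel defines a camera ray through K⁻¹, and its 3D point is the intersection of that ray with the calibrated laser plane n·X = d. The intrinsics and plane parameters below are illustrative assumptions, not calibrated values from the paper.

```python
# Hypothetical structured-light triangulation: camera ray / laser plane intersection.
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
PLANE_N = np.array([0.0, -0.9397, -0.3420])   # assumed laser-plane unit normal
PLANE_D = -0.25                               # assumed plane offset: n.X = d

def triangulate_pixels(pixels_uv):
    """Back-project stripe pixels and intersect the rays with the laser plane."""
    uv1 = np.hstack([pixels_uv, np.ones((len(pixels_uv), 1))])
    rays = uv1 @ np.linalg.inv(K).T              # ray directions in the camera frame
    # Ray: X = s * ray; plane: n.X = d  =>  s = d / (n.ray)
    s = PLANE_D / (rays @ PLANE_N)
    return rays * s[:, None]                     # 3D points in the camera frame [m]

print(triangulate_pixels(np.array([[320.0, 300.0], [400.0, 310.0]])))
```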

Inspection System of Welding Bead and Chamfer by means of Laser Vision

  • Lee, Jun-Ssok;Im, Pil-Ju;Park, Young-Jun;Kim, Jae-Hoon
    • Proceedings of the Institute of Control, Robotics and Systems Conference / ICCAS 2001 / pp. 118.4-118 / 2001
  • An inspection system, composed of a sensor head and a controller, is presented; it is a 3-D laser vision system that uses the principle of optical triangulation to inspect weld quality and chamfer quality. The sensor head is composed of a laser diode, a micro CCD camera, a filter, and several optical components. This system can be used for welding bead and undercut inspection as well as chamfer quality inspection. It is much more convenient to use, and the inspection time is greatly shortened compared with conventional inspection methods. Furthermore, the data saved in the controller can be used for statistics afterwards. This system has been used in the Koje shipyard of Samsung Heavy Industries, and demand for it is increasing.
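
To illustrate how bead and undercut quantities could be read from a single triangulated height profile of the kind such a sensor produces, the sketch below fits the base plate from the profile ends and reports the extreme deviations. The window size and the synthetic profile are assumptions; the shipyard system's actual criteria are not described in the abstract.

```python
# Illustrative bead-height and undercut measurement from a height profile.
import numpy as np

def bead_and_undercut(x, z, edge_len=20):
    """Fit the base plate from both profile ends, then measure deviations."""
    idx = np.r_[0:edge_len, len(x) - edge_len:len(x)]
    slope, offset = np.polyfit(x[idx], z[idx], 1)     # base-plate line z = s*x + o
    deviation = z - (slope * x + offset)
    return deviation.max(), -deviation.min()          # bead height, undercut depth

# Synthetic profile: tilted plate, a Gaussian bead, and a small dip (undercut).
x = np.linspace(0.0, 40.0, 400)                       # mm
z = 0.02 * x + 2.5 * np.exp(-((x - 20) / 3.0) ** 2) - 0.4 * np.exp(-((x - 14) / 0.8) ** 2)
height, undercut = bead_and_undercut(x, z)
print(f"bead height ~ {height:.2f} mm, undercut ~ {undercut:.2f} mm")
```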


센서데이터 융합을 이용한 원주형 물체인식 (Cylindrical Object Recognition using Sensor Data Fusion)

  • 김동기;윤광익;윤지섭;강이석
    • Journal of Institute of Control, Robotics and Systems / Vol. 7, No. 8 / pp. 656-663 / 2001
  • This paper presents a sensor fusion method to recognize a cylindrical object using a CCD camera, a laser slit beam, and ultrasonic sensors on a pan/tilt device. For object recognition with the vision sensor, an active light source projects a stripe pattern of light on the object surface. The 2D image data are transformed into 3D data using the geometry between the camera and the laser slit beam. The ultrasonic sensor uses an ultrasonic transducer array mounted horizontally on the pan/tilt device. The time of flight is estimated by finding the maximum correlation between the received ultrasonic pulse and a set of stored templates, also called a matched filter. The distance is calculated by simply multiplying the time of flight by the speed of sound, and the maximum amplitude of the filtered signal is used to determine the face angle to the object. To determine the position and the radius of cylindrical objects, statistical sensor fusion is used. Experimental results show that the fused data increase the reliability of the object recognition.
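
The matched-filter time-of-flight step above can be sketched as follows: the received ultrasonic signal is correlated with a stored pulse template, the lag of the correlation peak gives the round-trip time, and distance = time · c / 2; the peak amplitude is also returned, as the abstract uses it for the face angle. The sampling rate, pulse shape, and noise level are illustrative assumptions.

```python
# Hedged matched-filter time-of-flight sketch with a synthetic echo.
import numpy as np

FS = 1.0e6          # assumed sampling rate [Hz]
C_SOUND = 343.0     # speed of sound [m/s]

def pulse_template(freq=40e3, cycles=8):
    """Windowed sinusoidal burst used as the stored template."""
    t = np.arange(int(FS * cycles / freq)) / FS
    return np.sin(2 * np.pi * freq * t) * np.hanning(t.size)

def tof_distance(received, template):
    """Matched filter: lag of max correlation -> time of flight -> distance."""
    corr = np.correlate(received, template, mode="valid")
    lag = int(np.argmax(np.abs(corr)))
    return lag / FS * C_SOUND / 2.0, float(np.max(np.abs(corr)))  # distance, peak amp.

# Simulate an echo from ~1.5 m away buried in noise.
template = pulse_template()
echo_delay = int(2 * 1.5 / C_SOUND * FS)
rx = np.random.normal(0, 0.05, echo_delay + 5000)
rx[echo_delay:echo_delay + template.size] += 0.8 * template
distance, amplitude = tof_distance(rx, template)
print(f"estimated distance: {distance:.3f} m, peak amplitude: {amplitude:.2f}")
```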


A STUDY ON WELD POOL MONITORING IN PULSED LASER EDGE WELDING

  • Lee, Seung-Key;Na, Suck-Joo
    • Proceedings of the Korean Welding and Joining Society Conference / 2002 Proceedings of the International Welding/Joining Conference-Korea / pp. 595-599 / 2002
  • Edge welding of thin sheets is very difficult because of the fit-up problem and the small weld area. In laser welding, joint fit-up and penetration are critical for sound weld quality, and they can be monitored by appropriate methods. Among the various monitoring systems, the visual monitoring method is attractive because various kinds of weld pool information can be extracted directly. In this study, a vision sensor was adopted for weld pool monitoring in pulsed Nd:YAG laser edge welding to monitor whether the penetration is sufficient and the joint fit-up is within the requirement. A pulsed Nd:YAG laser provides a series of periodic laser pulses, while the shape and brightness of the weld pool change temporally even within one pulse duration. A shutter-triggered, non-interlaced CCD camera was used to acquire a temporally changing weld pool image at the moment that best represents the weld status. The information for quality monitoring can be extracted from the monitored weld pool image by an image processing algorithm. The weld pool image contains information not only about the joint fit-up but also about the penetration: the joint fit-up can be extracted from the weld pool shape, and the penetration from the brightness. Weld pool parameters that represent the characteristics of the weld pool were selected based on the geometrical appearance and brightness profile. In order to achieve accurate prediction of the weld penetration, which requires a nonlinear model, a neural network with the selected weld pool parameters was applied.
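
A hedged sketch of the monitoring pipeline above: a few geometric and brightness features are extracted from a weld pool image, and a small neural network maps them to a penetration estimate. The particular features, network size, and the synthetic training pairs below are illustrative assumptions, not the paper's parameters; in practice the training pairs would come from sectioned weld samples.

```python
# Hypothetical weld-pool feature extraction plus a tiny regression network.
import numpy as np
from sklearn.neural_network import MLPRegressor

def pool_features(gray, threshold=120):
    """Simple weld-pool parameters: area, width/length ratio, mean brightness."""
    mask = gray >= threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return np.zeros(3)
    area = xs.size
    aspect = (ys.max() - ys.min() + 1) / (xs.max() - xs.min() + 1)
    return np.array([area, aspect, gray[mask].mean()])

# Train on synthetic (features -> penetration depth) pairs just to show the interface.
rng = np.random.default_rng(0)
X = rng.uniform([200, 0.3, 130], [2000, 1.2, 255], size=(200, 3))
y = 0.002 * X[:, 0] + 1.5 * X[:, 2] / 255.0 + rng.normal(0, 0.05, 200)
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)

frame = rng.uniform(0, 255, (120, 160)).astype(np.uint8)   # stand-in camera frame
print("predicted penetration:", model.predict(pool_features(frame).reshape(1, -1))[0])
```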


실린더 라이너 오일그루브 가공 로봇 시스템 개발 (Development of a grinding robot system for the oil groove of the engine cylinder liner)

  • 노태양;이윤식;정창욱;이지형;오용출
    • Proceedings of the KSME Conference / KSME 2008 Autumn Conference (A) / pp. 1075-1080 / 2008
  • An engine for marine propulsion and power generation consists of several cylinder liner-piston sets, and an oil groove is machined on the inside wall of the cylinder liner for lubrication between the piston and the cylinder. The machining of the oil groove has so far been carried out manually because of the diversity of groove shapes. Recently, we developed an automatic grinding robot system for oil groove machining of engine cylinder liners. It can cover various types of oil grooves and adjust its position by itself. The grinding robot system consists of a robot, a machining tool head, sensors, and a control system. The robot automatically recognizes the cylinder liner's inside configuration by using a laser displacement sensor and a vision sensor after the cylinder liner is placed on a set-up fixture.
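
One way the liner's inside configuration could be recovered from laser displacement readings is sketched below: ranges taken while sweeping the sensor are converted to 2D points, and a circle (liner center and radius) is fitted by linear least squares (Kasa fit). The sweep geometry and noise level are assumptions; the abstract does not specify the system's actual fitting method.

```python
# Illustrative least-squares circle fit to a swept laser-displacement scan.
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) circle fit: returns (cx, cy, radius)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)

# Simulated sweep: the sensor near the liner axis measures the inner wall every 2 deg.
true_center, true_radius = np.array([1.5, -0.8]), 250.0          # mm
angles = np.deg2rad(np.arange(0, 360, 2))
wall = true_center + true_radius * np.column_stack([np.cos(angles), np.sin(angles)])
wall += np.random.normal(0, 0.05, wall.shape)                    # sensor noise
print(fit_circle(wall))   # should recover roughly (1.5, -0.8, 250.0)
```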


3 차원 곡면 데이터 획득을 위한 멀티 레이져 비젼 시스템 개발 (Development of Multi-Laser Vision System For 3D Surface Scanning)

  • 이정환;권기연;이현철;도영칠;최두진;박진형;김대경;박영준
    • Proceedings of the KSME Conference / KSME 2008 Autumn Conference (A) / pp. 768-772 / 2008
  • Various scanning systems have been studied in many industrial areas to acquire range data or to reconstruct an explicit 3D model. Currently, optical technology is widely used by virtue of its non-contact nature and high accuracy. In this paper, we describe a 3D laser scanning system developed to reconstruct the 3D surface of a large-scale object such as a curved plate of a ship hull. Our scanning system comprises four parallel laser vision modules using a triangulation technique. For the multi-laser vision, a calibration method based on a least-squares technique is applied. For global scanning, an effective method is presented that avoids the difficulty of the matching problem among the scanning results of each camera. A minimal image-processing algorithm and a robot-based calibration technique are also applied. A prototype has been implemented for testing.
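
A hedged sketch of the least-squares calibration step mentioned above: each laser vision module's light plane is fitted to 3D calibration points (for example, triangulated features of a known target) by a total least-squares plane fit via SVD, one fit per laser channel. The sample points below are synthetic placeholders, not the paper's calibration data.

```python
# Hypothetical least-squares laser-plane calibration for one channel.
import numpy as np

def fit_laser_plane(points):
    """Fit n.X = d to 3D points; returns (unit normal n, offset d)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                      # direction of smallest variance
    return normal, float(normal @ centroid)

# Synthetic calibration points lying near the plane z = 0.3*x + 0.1*y + 0.5.
rng = np.random.default_rng(1)
xy = rng.uniform(-0.5, 0.5, (100, 2))
z = 0.3 * xy[:, 0] + 0.1 * xy[:, 1] + 0.5 + rng.normal(0, 1e-3, 100)
n, d = fit_laser_plane(np.column_stack([xy, z]))
print(n, d)   # four such fits would calibrate the 4-channel sensor head
```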
