• Title/Summary/Keyword: laser camera calibration (레이저 카메라 교정)

Object Width Measurement System Using Light Sectioning Method (광절단법을 이용한 물체 크기 측정 시스템)

  • Lee, Byeong-Ju;Kang, Hyun-Soo
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.18 no.3
    • /
    • pp.697-705
    • /
    • 2014
  • This paper presents a vision-based object width measurement method and its application, where the light sectioning method is employed. The target object for measurement is a tread, the outermost component of an automobile tire. The entire system applying the measurement method consists of two processes: a calibration process and a detection process. The calibration process identifies the relationship between the camera plane and the laser plane, and estimates the camera lens distortion parameters; it requires an elaborately manufactured test pattern, namely a jig. In the detection process, first of all, the region illuminated by the laser light is extracted by applying an adaptive thresholding technique, where the distribution of pixel brightness is considered to decide the optimal threshold. Then, a thinning algorithm is applied to the region so that the ends and the shoulders of the tread are detected. Finally, the tread width and the shoulder width are computed using the homography and the distortion coefficients obtained in the calibration process.
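The brightness-based adaptive thresholding step described in this abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the mean-plus-k-sigma rule, and the parameter `k` are assumptions; the idea is only that the threshold is derived from the pixel brightness distribution so that the laser-lit region is segmented regardless of overall illumination.

```python
import statistics

def adaptive_threshold(pixels, k=1.0):
    """Mark pixels brighter than mean + k * stdev of the brightness
    distribution as laser-illuminated (hypothetical sketch)."""
    mean = statistics.fmean(pixels)
    stdev = statistics.pstdev(pixels)
    t = mean + k * stdev
    return [p >= t for p in pixels]

# One image row: dark background with a bright laser stripe in the middle.
row = [10, 12, 11, 13, 250, 255, 248, 12, 10]
mask = adaptive_threshold(row)
```

On this toy row the threshold lands between the background level and the stripe, so only the three bright pixels are marked; a thinning pass over such a mask would then reduce the stripe to a one-pixel-wide centerline.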

A Study on Three-Dimensional Model Reconstruction Based on Laser-Vision Technology (레이저 비전 기술을 이용한 물체의 3D 모델 재구성 방법에 관한 연구)

  • Nguyen, Huu Cuong;Lee, Byung Ryong
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.32 no.7
    • /
    • pp.633-641
    • /
    • 2015
  • In this study, we proposed a three-dimensional (3D) scanning system based on a laser-vision technique and a rotary mechanism for automatic 3D model reconstruction. The proposed scanning system consists of a laser projector, a camera, and a turntable. A new and simple method was proposed for laser-camera calibration. 3D point cloud data of the surface of the scanned object were collected by integrating the laser profiles extracted from laser stripe images, each corresponding to a rotary angle of the rotary mechanism. The problem of obscured laser profiles was also solved by adding an additional camera at another viewpoint. From the collected 3D point cloud data, the 3D model of the scanned object was reconstructed based on a facet representation. The reconstructed 3D models demonstrated the effectiveness and applicability of the proposed 3D scanning system to 3D model-based applications.
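The integration of laser profiles over turntable angles can be sketched in a few lines. This is an assumed, simplified geometry (profiles given as radius/height pairs in the laser plane, rotated about a vertical turntable axis), not the paper's calibration-based formulation:

```python
import math

def profile_to_points(profile, angle_deg):
    """Rotate one laser profile, given as (radius, height) pairs in the
    laser plane, about the vertical turntable axis by angle_deg."""
    a = math.radians(angle_deg)
    return [(r * math.cos(a), r * math.sin(a), z) for r, z in profile]

# Accumulate a point cloud from one profile captured every 90 degrees.
cloud = []
for angle in range(0, 360, 90):
    cloud.extend(profile_to_points([(1.0, 0.0), (1.0, 0.5)], angle))
```

Each captured stripe contributes one ring of points; a dense scan simply uses a much finer angular step before the facet-based surface reconstruction.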

System for Measuring the Welding Profile Using Vision and Structured Light (비전센서와 구조화빔을 이용한 용접 형상 측정 시스템)

  • Kim, Chang-Hyeon;Choe, Tae-Yong;Lee, Ju-Jang;Seo, Jeong;Park, Gyeong-Taek;Gang, Hui-Sin
    • Proceedings of the Korean Society of Laser Processing Conference
    • /
    • 2005.11a
    • /
    • pp.50-56
    • /
    • 2005
  • Robot systems are widely used in many industrial fields, including welding manufacturing. The essential tasks in operating a welding robot are acquiring the position and/or shape of the parent metal. For seam tracking or robot tracking, many kinds of contact and non-contact sensors are used; recently, vision sensors have become the most popular. In this paper, the development of a system that measures the shape of the welding part is described. The system uses a line-type structured laser diode and a vision sensor. It includes correction of the radial distortion that is often found in images taken by a camera with a short focal length. The Direct Linear Transformation (DLT) method is used for camera calibration. The three-dimensional shape of the parent metal is obtained after a simple linear transformation. Some demonstrations are shown to illustrate the performance of the developed system.


Improvement of size measurement of polystyrene spheres of diameters 3 μm and 10 μm by optical microscope with CCD camera (CCD 카메라가 장착된 광학현미경을 사용한 폴리스티렌구 (3 μm와 10 μm)의 평균지름측정)

  • 정기영;박병천;깅주식;송원영;오범환
    • Korean Journal of Optics and Photonics
    • /
    • v.9 no.6
    • /
    • pp.362-367
    • /
    • 1998
  • Center Distance Finding (CDF) is a technique for finding the sphere diameter by measuring the distance between the centers of two contacting spheres. The focal spots of the sphere clusters are formed in the back-focal plane by a transmission-mode optical microscope with a pseudothermal illumination source. Digital images taken by the CCD camera were processed by the software called Global Lab Image. The centers of the focal spots are found, and the spot positions are expressed in terms of the CCD pixel elements, whose coordinates are calibrated by a heterodyne interferometer. The new CDF measurement system, which is more advantageous in time and convenience than the existing system, has been developed, while the measurement uncertainty remains sufficient for its use as a magnification standard for optical microscopy. Two kinds of polystyrene spheres with nominal diameters of 3 and 10 μm (NIST SRM 1962 and 1960) were measured with an uncertainty of less than 1% at the 99% confidence level, and the results were compared with those of the National Institute of Standards and Technology.
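The CDF principle reduces to simple arithmetic: for two contacting spheres of equal size, the center-to-center distance equals one sphere diameter, so the pixel distance between the two spot centers times the calibrated pixel pitch gives the diameter. A minimal sketch, with the function name and pixel-pitch parameter assumed for illustration:

```python
import math

def diameter_from_centers(c1, c2, pixel_pitch_um):
    """CDF: the center-to-center distance of two contacting equal spheres
    equals one diameter; convert pixels to micrometers via the calibrated
    pixel pitch (hypothetical parameterization)."""
    d_px = math.dist(c1, c2)
    return d_px * pixel_pitch_um

# Spot centers 50 px apart on a sensor calibrated at 0.2 um per pixel.
d = diameter_from_centers((0.0, 0.0), (30.0, 40.0), 0.2)
```

In the actual system, the pixel-to-length calibration comes from the heterodyne interferometer rather than a single fixed pitch value.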


Development of Vision Sensor Module for the Measurement of Welding Profile (용접 형상 측정용 시각 센서 모듈 개발)

  • Kim C.H.;Choi T.Y.;Lee J.J.;Suh J.;Park K.T.;Kang H.S.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2006.05a
    • /
    • pp.285-286
    • /
    • 2006
  • The essential tasks in operating a welding robot are acquiring the position and/or shape of the parent metal. For seam tracking or robot automation, many kinds of contact and non-contact sensors are used; recently, the vision sensor has become the most popular. In this paper, the development of a system that measures the profile of the welding part is described. The total system will be assembled into a compact module that can be attached to the head of a welding robot system. The system uses a line-type structured laser diode and a vision sensor. It implements the Direct Linear Transformation (DLT) for camera calibration as well as radial distortion correction. The three-dimensional shape of the parent metal is obtained after a simple linear transformation, and therefore the system operates in real time. Some experiments were carried out to evaluate the performance of the developed system.
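The "simple linear transformation" applied after calibration in systems like this is typically a planar homography: a 3x3 matrix mapping image points to points in the laser plane, applied per pixel in real time. A minimal sketch, with the matrix and function names assumed for illustration (the paper's actual DLT-derived parameters are not given):

```python
def apply_homography(H, x, y):
    """Map image point (x, y) through the 3x3 homography H and
    dehomogenize by the third coordinate."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w

# Identity homography: a trivial stand-in for a calibrated mapping.
H_identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

Because each stripe pixel needs only this fixed matrix-vector product, the profile measurement can keep up with the robot's motion.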


Evaluation of the usefulness of IGRT(Image Guided Radiation Therapy) for markerless patients using SGPS(Surface-Guided Patient Setup) (표면유도환자셋업(Surface-Guided Patient Setup, SGPS)을 활용한 Markerless환자의 영상유도방사선치료(Image Guided Radiation Therapy, IGRT)시 유용성 평가)

  • Lee, Kyeong-jae;Lee, Eung-man;Lee, Jeong-su;Kim, Da-yeon;Ko, Hyeon-jun;Choi, Shin-cheol
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.33
    • /
    • pp.109-116
    • /
    • 2021
  • Purpose: The purpose of this study is to evaluate the usefulness of Surface-Guided Patient Setup (SGPS) by comparing patient positioning accuracy when image-guided radiation therapy (IGRT) was used for markerless patients (no marks on the skin) set up with SGPS and for marker patients (marks on the skin) set up with Laser-Based Patient Setup (LBPS). Materials and Methods: The position error during IGRT was compared between markerless patients initially set up with SGPS, using an optical surface scanning system with three cameras, and marker patients initially set up with LBPS, which aligns the lasers with the markers drawn on the patient's skin. Both SGPS and LBPS were performed on 20 prostate cancer patients and 10 stereotactic radiosurgery (SRS) patients, respectively, and SGPS was additionally performed on 60 breast cancer patients. IGRT was performed on all patients using CBCT or OBI. The position error in 6 degrees of freedom was obtained using the Auto-Matching System, and comparison and analysis were performed using Offline-Review in the treatment planning system. Results: The difference between the root mean square (RMS) values of SGPS and LBPS was Vrt -0.02 cm, Log -0.02 cm, Lat 0.01 cm, Pit -0.01°, Rol -0.01°, Rtn -0.01° for prostate cancer patients, and Vrt 0.02 cm, Log -0.05 cm, Lat 0.00 cm, Pit -0.30°, Rol -0.15°, Rtn -0.33° for SRS patients; there was no significant difference between the two regions. Relative to the IGRT results for breast cancer patients, the RMS was Vrt 0.26, Log 0.21, Lat 0.15, Pit 0.81, Rol 0.49, Rtn 0.59. Conclusion: The position error of SGPS compared to LBPS showed no significant difference for prostate cancer and SRS patients. For the additionally examined SGPS breast cancer patients, the position error relative to IGRT was not large. Therefore, it is considered that it will be useful to replace LBPS with SGPS, which has the great advantage of not requiring patient skin marking.
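The RMS figures reported per axis above summarize per-fraction setup errors in the usual way: the square root of the mean of the squared errors. A minimal sketch of that computation (the function name and sample values are illustrative, not the study's data):

```python
import math

def rms(errors):
    """Root mean square of a list of per-fraction setup errors (cm or deg)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Illustrative vertical-axis errors over four fractions, in cm.
example = rms([0.1, -0.2, 0.0, 0.3])
```

Because errors are squared, RMS penalizes occasional large deviations more than a plain mean of absolute values would, which is why it is the standard summary for setup accuracy.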