• Title/Summary/Keyword: Laser Vision


Implementation of Automatic Teaching System for Subassembly Process in Shipbuilding (선박 소조립 공정용 로봇 자동교시 시스템의 구현)

  • 김정호;유중돈;김진오;신정식;김성권
    • Journal of Welding and Joining
    • /
    • v.14 no.2
    • /
    • pp.96-105
    • /
    • 1996
  • Robot systems are widely utilized in the shipbuilding industry to enhance productivity by automating the welding process. In order to increase productivity, it is necessary to reduce the time spent on robot teaching. In this work, an automatic teaching system is developed for the subassembly process in the shipbuilding industry. A laser/vision sensor is designed to detect the weld seam, and the image of the fillet joint is processed using the arm method. Positions of weld seams defined in the CAD database are transformed into the robot coordinate frame, and the dynamic programming technique is applied to find a sub-optimum weld path. Experiments are carried out to verify the system performance. The results show that the proposed automatic teaching system performs successfully and can be applied to the robot system in the subassembly process.
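
The dynamic-programming search for a sub-optimum weld path is only named in the abstract above; as a rough, hedged illustration of that idea (not the authors' implementation), a Held-Karp style ordering of a few hypothetical seam positions with a Euclidean travel cost could look like this:

```python
# Minimal sketch of a dynamic-programming (Held-Karp style) search over the
# order in which seams are welded. The seam coordinates, the Euclidean travel
# cost, and the fixed starting seam are illustrative assumptions; the paper's
# cost model and CAD-to-robot coordinate transformation are not reproduced.
from itertools import combinations
import math

seams = [(0.0, 0.0), (1.2, 0.5), (0.3, 2.0), (2.5, 1.8)]  # seam positions in the robot frame (m)
n = len(seams)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# best[(mask, last)] = minimal travel cost to visit the seams in 'mask',
# starting at seam 0 and ending at seam 'last'.
best = {(1, 0): 0.0}
for size in range(2, n + 1):
    for subset in combinations(range(n), size):
        if 0 not in subset:
            continue
        mask = sum(1 << i for i in subset)
        for last in subset:
            if last == 0:
                continue
            prev = mask ^ (1 << last)
            costs = [best[(prev, k)] + dist(seams[k], seams[last])
                     for k in subset if k != last and (prev, k) in best]
            if costs:
                best[(mask, last)] = min(costs)

full = (1 << n) - 1
print("shortest open weld path length: %.2f m" % min(best[(full, k)] for k in range(1, n)))
```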


Recognition of Patterns and Marks on Monitor Glass Panel

  • Ahn, In-Mo;Kang, Dong-Joong;Lee, Kee-Sang
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2002.10a
    • /
    • pp.99.2-99
    • /
    • 2002
  • In this paper, a machine vision system for recognizing and classifying the patterns and marks engraved by die molding or laser marking on the glass panel of a computer monitor is suggested and evaluated experimentally. The vision system is equipped with a neural-network-based pattern classifier and a searching process based on normalized grayscale correlation (NGC) and adaptive binarization, which is applicable to cases in which segmenting the pattern area from the background using the ordinary blob coloring technique is quite difficult. The inspection process is accomplished by way of NGC hypothesis and ANN verification. The proposed pattern recognition system is composed of three...
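
The normalized grayscale correlation (NGC) used as the hypothesis step above can be sketched as follows; the brute-force search and the synthetic image and template are illustrative assumptions, not the paper's optimized pipeline or its ANN verification stage.

```python
# Hedged sketch of normalized grayscale correlation (NGC) template matching.
import numpy as np

def ngc(patch, template):
    """Zero-mean normalized correlation, in [-1, 1]."""
    p = patch.astype(float) - patch.mean()
    t = template.astype(float) - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return 0.0 if denom == 0 else float((p * t).sum() / denom)

def best_match(image, template):
    """Exhaustive search for the window with the highest NGC score."""
    th, tw = template.shape
    scores = np.full((image.shape[0] - th + 1, image.shape[1] - tw + 1), -1.0)
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            scores[y, x] = ngc(image[y:y + th, x:x + tw], template)
    y, x = np.unravel_index(np.argmax(scores), scores.shape)
    return (y, x), scores[y, x]

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
tpl = img[20:32, 30:42].copy()      # embed a known mark as the template
print(best_match(img, tpl))         # expect location (20, 30) with score ~1.0
```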


Radar, Vision, Lidar Fusion-based Environment Sensor Fault Detection Algorithm for Automated Vehicles (레이더, 비전, 라이더 융합 기반 자율주행 환경 인지 센서 고장 진단)

  • Choi, Seungrhi;Jeong, Yonghwan;Lee, Myungsu;Yi, Kyongsu
    • Journal of Auto-vehicle Safety Association
    • /
    • v.9 no.4
    • /
    • pp.32-37
    • /
    • 2017
  • For automated vehicles, the integrity and fault tolerance of environment perception sensors have been an important issue. This paper presents a radar, vision, and lidar (laser radar) fusion-based fault detection algorithm for autonomous vehicles. The characteristics of each sensor are described, and the error in the states of moving targets estimated by each sensor is analyzed to derive a method for detecting environment sensor faults from the characteristics of this error. Each moving-target state is estimated by an EKF/IMM method. To guarantee the reliability of the environment sensor fault detection algorithm, driving data collected on several types of road are analyzed.
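
As a simplified, hypothetical illustration of the cross-sensor consistency idea (the paper's EKF/IMM target tracking is not reproduced here), a persistent residual check of each sensor's range estimate against the median of all three sensors might look like this:

```python
# Toy fault-detection sketch: flag a sensor whose estimate keeps deviating
# from the consensus (median) of radar, vision, and lidar. The threshold,
# persistence count, and range data are assumptions for illustration.
import statistics

THRESHOLD_M = 1.5   # residual gate (m), assumed
PERSISTENCE = 3     # consecutive violations before declaring a fault

def detect_faults(history):
    """history: list of {sensor_name: estimated_range_m} dicts, one per time step."""
    counts, faulty = {}, set()
    for frame in history:
        consensus = statistics.median(frame.values())
        for name, rng in frame.items():
            if abs(rng - consensus) > THRESHOLD_M:
                counts[name] = counts.get(name, 0) + 1
                if counts[name] >= PERSISTENCE:
                    faulty.add(name)
            else:
                counts[name] = 0
    return faulty

frames = [{"radar": 30.1, "vision": 30.4, "lidar": 30.2},
          {"radar": 29.8, "vision": 35.9, "lidar": 29.9},
          {"radar": 29.5, "vision": 36.2, "lidar": 29.6},
          {"radar": 29.2, "vision": 36.5, "lidar": 29.3}]
print(detect_faults(frames))   # {'vision'}
```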

Objective Assessment of Visual Quality and Ocular Scattering Based on Double-pass Retinal Images in Refractive-surgery Patients and Emmetropes

  • Kim, Jeong-mee
    • Current Optics and Photonics
    • /
    • v.3 no.6
    • /
    • pp.597-604
    • /
    • 2019
  • This study was performed to evaluate objective visual quality and ocular scattering in myopic refractive-surgery patients, compared to emmetropes. Optical vision-quality parameters (modulation transfer function (MTF) cutoff and Strehl ratio) and the objective scattering index (OSI) were measured using an optical quality analysis system (OQAS II) based on the double-pass technique. In all subjects, the higher the MTF cutoff and Strehl ratio, the lower the OSI and ocular higher-order aberrations (HOAs). The MTF cutoff and Strehl ratio for the laser-assisted subepithelial keratectomy (LASEK) group were lower than those for the emmetropia group, while the OSI, ocular HOAs, and spherical aberration (SA) for the LASEK group were higher than those for the emmetropia group. Ocular scattering appears to be one of the important factors with regard to visual quality. Therefore, the degraded retinal image quality in the LASEK patients was shown to reduce the quality of vision more than in the emmetropes.
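
For readers unfamiliar with the objective scattering index, the sketch below computes an OSI-style ratio (light in a peripheral annulus divided by light near the central peak) on a synthetic double-pass point-spread function; the plate scale and the PSF itself are placeholders, not values from the OQAS II instrument used in the study.

```python
# Hedged OSI-style computation on a double-pass PSF image.
import numpy as np

ARCMIN_PER_PIXEL = 0.25        # assumed plate scale

def osi_like(psf):
    cy, cx = np.unravel_index(np.argmax(psf), psf.shape)
    yy, xx = np.indices(psf.shape)
    r = np.hypot(yy - cy, xx - cx) * ARCMIN_PER_PIXEL
    central = psf[r <= 1.0].sum()                    # within 1 arcmin of the peak
    annulus = psf[(r >= 12.0) & (r <= 20.0)].sum()   # 12-20 arcmin annulus
    return float(annulus / central)

# Synthetic PSF: a sharp core plus a broad scatter halo.
yy, xx = np.indices((201, 201))
r = np.hypot(yy - 100, xx - 100) * ARCMIN_PER_PIXEL
psf = np.exp(-(r / 0.8) ** 2) + 0.01 * np.exp(-(r / 15.0) ** 2)
print("OSI-like ratio: %.3f" % osi_like(psf))
```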

Object Recognition-based Global Localization for Mobile Robots (이동로봇의 물체인식 기반 전역적 자기위치 추정)

  • Park, Soon-Yong;Park, Mignon;Park, Sung-Kee
    • The Journal of Korea Robotics Society
    • /
    • v.3 no.1
    • /
    • pp.33-41
    • /
    • 2008
  • Based on object recognition technology, we present a new global localization method for robot navigation. To do this, we model an indoor environment using the following visual cues from a stereo camera: view-based image features for object recognition, and their 3D positions for object pose estimation. We also use the depth information at the horizontal centerline of the image, through which the optical axis passes, which is similar to the data from a 2D laser range finder. Therefore, we can build a hybrid local node for a topological map that is composed of an indoor-environment metric map and an object location map. Based on such modeling, we suggest a coarse-to-fine strategy for estimating the global pose of a mobile robot. The coarse pose is obtained by means of object recognition and SVD-based least-squares fitting, and its refined pose is then estimated with a particle filtering algorithm. With real experiments, we show that the proposed method can be an effective vision-based global localization algorithm.
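
The coarse pose step relies on SVD-based least-squares fitting; a minimal Kabsch/Procrustes-style sketch of such a rigid alignment between model points and their observed 3D positions (synthetic data, not the authors' code or their particle-filter refinement) is shown below.

```python
# Sketch: closed-form rigid transform (R, t) between two 3xN point sets via SVD.
import numpy as np

def rigid_fit(src, dst):
    """Return R, t minimizing ||R @ src + t - dst|| for 3xN point sets."""
    src_c = src - src.mean(axis=1, keepdims=True)
    dst_c = dst - dst.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd(src_c @ dst_c.T)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=1) - R @ src.mean(axis=1)
    return R, t

src = np.random.rand(3, 6)                      # object points in the map frame
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = R_true @ src + np.array([[0.5], [1.0], [0.0]])  # observed points
R, t = rigid_fit(src, dst)
print(np.allclose(R, R_true), t)                # expect True and ~[0.5, 1.0, 0.0]
```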


A Study on a Visual Sensor System for Weld Seam Tracking in Robotic GMA Welding (GMA 용접로봇용 용접선 시각 추적 시스템에 관한 연구)

  • 김동호;김재웅
    • Journal of Welding and Joining
    • /
    • v.19 no.2
    • /
    • pp.208-214
    • /
    • 2001
  • In this study, we constructed a visual sensor system for real-time weld seam tracking in GMA welding. The sensor part consists of a CCD camera, a band-pass filter, a diode laser system with a cylindrical lens, and a vision board for inter-frame processing. We used a commercial robot system that includes a GMA welding machine. To extract the weld seam, we used an inter-frame process on the vision board, which allowed us to remove the noise due to spatters and fume in the image. Since the image obtained with the inter-frame process was very clean, we could use the simplest methods to extract the weld seam from the image, such as the first-derivative and central-difference methods. We also applied a moving average to the successive position data of the weld seam to reduce data fluctuation. In experiments, the developed robot system with the visual sensor was able to track the most common weld seams, such as a fillet joint, a V-groove, and a lap joint, whose weld seams include planar and height-directional variation.
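
The moving-average smoothing of successive seam positions mentioned above is straightforward; a small sketch with a hypothetical window size and sample data:

```python
# Sketch: causal moving average over successive seam position samples.
def moving_average(samples, window=5):
    smoothed = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

raw_y = [10.0, 10.4, 9.8, 10.9, 10.1, 15.0, 10.2, 10.3]  # lateral seam offsets (mm); one spatter-induced spike
print(moving_average(raw_y, window=3))
```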


Information Fusion of Cameras and Laser Radars for Perception Systems of Autonomous Vehicles (영상 및 레이저레이더 정보융합을 통한 자율주행자동차의 주행환경인식 및 추적방법)

  • Lee, Minchae;Han, Jaehyun;Jang, Chulhoon;Sunwoo, Myoungho
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.23 no.1
    • /
    • pp.35-45
    • /
    • 2013
  • An autonomous vehicle requires more robust perception systems than the conventional perception systems of intelligent vehicles. In particular, single-sensor-based perception systems have been widely studied using cameras and laser radar sensors, which are the most representative perception sensors, providing object information such as distance and object features. The distance information of the laser radar sensor is used for perceiving road structures, vehicles, and pedestrians. The image information of the camera is used for visual recognition of lanes, crosswalks, and traffic signs. However, single-sensor-based perception systems suffer from false positives and false negatives caused by sensor limitations and road environments. Accordingly, information fusion systems are essential to ensure the robustness and stability of perception systems in harsh environments. This paper describes a perception system for autonomous vehicles which performs information fusion to recognize road environments. In particular, vision and laser radar sensors are fused to detect lanes, crosswalks, and obstacles. The proposed perception system was validated on various roads and under various environmental conditions with an autonomous vehicle.
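
A common first step in this kind of camera/laser-radar fusion is projecting the 3D range points into the image so that range measurements and visual detections can be associated; the sketch below uses placeholder intrinsic and extrinsic parameters, not the vehicle's actual calibration.

```python
# Sketch: pinhole projection of laser-radar points into the camera image.
import numpy as np

K = np.array([[700.0, 0.0, 320.0],   # assumed camera intrinsics (pixels)
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                        # assumed laser-radar-to-camera rotation
t = np.array([0.0, 0.2, 0.0])        # assumed translation (m)

def project(points_lidar):
    """points_lidar: Nx3 array in the laser-radar frame -> Nx2 pixel coordinates."""
    cam = (R @ points_lidar.T).T + t      # transform into the camera frame
    cam = cam[cam[:, 2] > 0.1]            # keep points in front of the camera
    uvw = (K @ cam.T).T
    return uvw[:, :2] / uvw[:, 2:3]

pts = np.array([[2.0, 0.0, 10.0], [-1.0, 0.5, 8.0]])
print(project(pts))
```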

Vision Based Position Detection System of Used Oil Filter using Line Laser (라인형 레이저를 이용한 비전기반 차량용 폐오일필터 검출 시스템)

  • Xing, Xiong;Song, Un-Ji;Choi, Byung-Jae
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.20 no.3
    • /
    • pp.332-336
    • /
    • 2010
  • There are many successful applications of image processing systems in industry. In this study we propose a position detection system for used oil filters using a line laser. Our work builds on the development of line lasers as interaction devices. A camera captures images of the surface of a used oil filter, and the laser beam location is then extracted from the captured image; this image is processed and used as a cursor position. We also discuss an algorithm that can distinguish the front part from the rear part. In particular, we present a robust and efficient line detection algorithm that allows the system to be used under a variety of lighting conditions and reduces the amount of image parsing required to find the laser position by an order of magnitude.
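
Extracting the laser stripe from a frame can be sketched as a per-column brightest-pixel search; the intensity threshold and the synthetic frame below are assumptions, not the paper's parameters.

```python
# Sketch: per-column peak detection of a line-laser stripe in a grayscale frame.
import numpy as np

def laser_stripe(channel, min_intensity=120):
    """channel: HxW array -> list of (row, col) stripe points above the threshold."""
    rows = np.argmax(channel, axis=0)
    peaks = channel[rows, np.arange(channel.shape[1])]
    return [(int(r), int(c)) for c, (r, v) in enumerate(zip(rows, peaks)) if v >= min_intensity]

frame = np.zeros((120, 160), dtype=np.uint8)
frame[60, :] = 200                  # synthetic horizontal laser line
frame[60, 80:] = 0                  # right half occluded by the filter body
print(len(laser_stripe(frame)))     # stripe points found only on the left half (80)
```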

Fast Laser Triangular Measurement System using ARM and FPGA (ARM 및 FPGA를 이용한 고속 레이저 삼각측량 시스템)

  • Lee, Sang-Moon
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.8 no.1
    • /
    • pp.25-29
    • /
    • 2013
  • Recently, the processing power of ARM processors has increased rapidly as they have been applied to consumer electronics products. Because of their computing power and low power consumption, they are used in various embedded systems, including vision processing systems. Embedded Linux, which provides a well-made platform and GUI, is also a powerful tool for ARM-based embedded systems, so a short development period is one of the major advantages of an ARM-based embedded system. However, for real-time data processing applications such as an image processing system, the ARM needs additional hardware such as an FPGA, which is suited to parallel processing. In this paper, we developed an embedded system using an ARM processor and an FPGA. The FPGA handles time-consuming image preprocessing, while the numerical algorithms that need floating-point arithmetic and the user interface are implemented on the ARM processor. The overall processing speed of the system is 60 frames/sec on VGA images.
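
The underlying laser triangulation relation (range inversely proportional to the imaged spot offset) can be illustrated with assumed baseline and focal-length values; this is a geometric sketch, not the FPGA implementation described above.

```python
# Sketch: simple laser triangulation, z = f * b / x for a spot offset x (pixels).
F_PIXELS = 800.0     # focal length in pixels (assumed)
BASELINE_M = 0.05    # laser-to-camera baseline (assumed)

def triangulate_range(pixel_offset):
    if pixel_offset <= 0:
        raise ValueError("spot must be offset from the optical axis")
    return F_PIXELS * BASELINE_M / pixel_offset

for px in (10, 20, 40):
    print("offset %3d px -> range %.2f m" % (px, triangulate_range(px)))
```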

A New Robotic 3D Inspection System of Automotive Screw Hole

  • Baeg, Moon-Hong;Baeg, Seung-Ho;Moon, Chan-Woo;Jeong, Gu-Min;Ahn, Hyun-Sik;Kim, Do-Hyun
    • International Journal of Control, Automation, and Systems
    • /
    • v.6 no.5
    • /
    • pp.740-745
    • /
    • 2008
  • This paper presents a new non-contact 3D robotic inspection system to measure the precise positions of screw and punch holes on a car body frame. The newly developed sensor consists of a CCD camera, two laser line generators, and an LED light. This lightweight sensor can be mounted on an industrial robot hand. An inspection algorithm and system that work with this sensor are presented. In performance evaluation tests, the measurement accuracy of this inspection system was about 200 μm, which is a sufficient accuracy for the automotive industry.
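
One plausible way to recover a hole center from edge points extracted out of the laser-line images is a least-squares (Kasa-style) circle fit; the sketch below uses synthetic edge points and is not the authors' algorithm.

```python
# Sketch: Kasa circle fit, solving x^2 + y^2 + D x + E y + F = 0 in least squares.
import numpy as np

def fit_circle(xs, ys):
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = -(xs ** 2 + ys ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = -D / 2.0, -E / 2.0
    radius = np.sqrt(cx ** 2 + cy ** 2 - F)
    return cx, cy, radius

theta = np.linspace(0.0, 2.0 * np.pi, 50)
xs = 12.0 + 3.0 * np.cos(theta) + np.random.normal(0.0, 0.02, theta.size)
ys = -4.0 + 3.0 * np.sin(theta) + np.random.normal(0.0, 0.02, theta.size)
print(fit_circle(xs, ys))   # expect roughly (12.0, -4.0, 3.0)
```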