• Title/Summary/Keyword: Laser Vision


The Design of Controller for Unlimited Track Mobile Robot

  • Park, Han-Soo;Jeong, Heon;Park, Sei-Seung
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2001.10a
    • /
    • pp.41.6-41
    • /
    • 2001
  • As autonomous mobile robots become more widely used in industry, the importance of navigation systems is rising. The primary method of locomotion is with wheels, which causes many problems in controlling tracked mobile robots. In this paper, we discuss the navigation control of tracked mobile robots with multiple sensors. The multiple sensors are composed of ultrasonic-wave sensors and vision sensors. The vision sensors gauge distance using a laser and create visual images to estimate the robot's position. The 80196 is used at close range and the vision board at long range. Data is managed in the main PC, and management is distributed to every sensor. The controller employs fuzzy logic (a minimal sketch of such a rule follows below).

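A minimal sketch of the kind of fuzzy-logic steering rule such a controller could use, assuming triangular membership functions and two ultrasonic distance inputs; all names and thresholds here are illustrative, not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(d_left, d_right):
    """Map left/right ultrasonic distances (cm) to a steering command in [-1, 1]."""
    near_l = tri(d_left, 0, 20, 60)   # fuzzify: degree to which the left is "near"
    near_r = tri(d_right, 0, 20, 60)
    # Rules: obstacle near on the left -> steer right (+1), and vice versa.
    # Defuzzify with a weighted average of the rule outputs.
    w = near_l + near_r
    return 0.0 if w == 0 else (near_l - near_r) / w

print(fuzzy_steer(15.0, 80.0))  # obstacle close on the left -> ~ +1.0 (steer right)
```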

Tool Wear Monitoring with Vision System by Block Processing (화상의 블럭처리기법을 이용한 공구마멸 측정기술)

  • Lee, Sang-Jo;Cho, Chang-Yeon;Lee, Jong-Hang
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.10 no.3
    • /
    • pp.81-86
    • /
    • 1993
  • It is well known that interest in the on-line sensing of tool wear is growing steadily, with the aim of controlling machine-tool productivity from the point of view of quality. This paper describes the sensing of the amount of flank wear with a vision system. To obtain a proper image, a He-Ne laser generator is used as the lighting source, and the obtained image is processed with a block-processing algorithm and a morphological image-processing method. By means of this system it is possible to evaluate the parameters of tool wear. Experimental tests performed with this system on an NC lathe have shown good performance, which is described and discussed here (a hedged sketch of the segmentation step follows below).

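Not the paper's algorithm, but a minimal sketch of the measurement idea under stated assumptions: segment the bright, laser-lit wear band with a threshold plus morphological clean-up, then convert its pixel extent to a wear width. The synthetic image and the mm-per-pixel calibration factor are hypothetical.

```python
import numpy as np
import cv2

# Synthetic stand-in for a camera frame: dark tool flank with a bright wear band.
img = np.zeros((200, 300), dtype=np.uint8)
img[90:115, 40:260] = 180                                   # wear band ~25 px high
img += np.random.randint(0, 30, img.shape, dtype=np.uint8)  # sensor noise

_, mask = cv2.threshold(img, 128, 255, cv2.THRESH_BINARY)
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop isolated noise specks
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small gaps in the band

ys, xs = np.nonzero(mask)
wear_px = ys.max() - ys.min() + 1   # band height in pixels
MM_PER_PX = 0.01                    # hypothetical camera calibration factor
print(f"flank wear VB ~ {wear_px * MM_PER_PX:.3f} mm")
```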

Autonomous Calibration of a 2D Laser Displacement Sensor by Matching a Single Point on a Flat Structure (평면 구조물의 단일점 일치를 이용한 2차원 레이저 거리감지센서의 자동 캘리브레이션)

  • Joung, Ji Hoon;Kang, Tae-Sun;Shin, Hyeon-Ho;Kim, SooJong
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.2
    • /
    • pp.218-222
    • /
    • 2014
  • In this paper, we introduce an autonomous calibration method for a 2D laser displacement sensor (e.g. a laser vision sensor or laser range finder) by matching a single point on a flat structure. Many arc-welding robots carry a 2D laser displacement sensor to expand their applications by recognizing their environment (e.g. base metal and seam). In such systems, sensing data must be transformed into the robot's coordinates, and the geometric relation (i.e. rotation and translation) between the robot's coordinates and the sensor's coordinates must be known for this transformation. Calibration is the process of inferring this geometric relation between the sensor and the robot. Generally, matching more than 3 points is required to infer it. We introduce a novel method that calibrates with only a single point match, using a specific flat structure (i.e. a circular hole) that makes the geometric relation recoverable from one matched point. By moving the robot to a specific pose, we fix the rotation component of the calibration result as a constant, which is what allows a single point to suffice. The flat structure can be installed easily at a manufacturing site, because it has almost no volume (i.e. it is nearly a 2D structure). The calibration process is fully autonomous and requires no manual operation. The robot carrying the sensor moves to the specific pose by sensing features of the circular hole, such as the length of a chord and the center position of the chord. We show the precision of the proposed method through repeated experiments in various situations. Furthermore, we applied the result of the proposed method to sensor-based seam tracking with a robot and report the difference in the robot's TCP (Tool Center Point) trajectory. This experiment shows that the proposed method ensures precision.
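
The single-point step reduces to a few lines: once the rotation R between the sensor frame and the robot frame is held constant (the paper achieves this by moving the robot to a specific pose), one matched point determines the translation. A minimal sketch, with an illustrative 2D rotation and made-up coordinates:

```python
import numpy as np

R = np.array([[0.0, -1.0],          # assumed-known sensor-to-robot rotation (2D)
              [1.0,  0.0]])
p_sensor = np.array([0.10, 0.02])   # matched point in sensor coordinates (m)
p_robot  = np.array([0.50, 0.30])   # the same point in robot coordinates (m)

# p_robot = R @ p_sensor + t  =>  t = p_robot - R @ p_sensor
t = p_robot - R @ p_sensor
print("translation:", t)            # -> [0.52 0.2]

# With (R, t) known, any further sensor reading maps into the robot frame:
q_sensor = np.array([0.05, 0.00])
print("in robot frame:", R @ q_sensor + t)
```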

Development of Fiber-end-cap Fabrication Equipment (대구경 광섬유 엔드캡 제작장비 개발)

  • Lee, Sung Hun;Hwang, Soon Hwi;Kim, Tae Kyun;Yang, Whan Seok;Yoon, Yeong Gap;Kim, Seon Ju
    • Korean Journal of Optics and Photonics
    • /
    • v.32 no.2
    • /
    • pp.49-54
    • /
    • 2021
  • In this paper, we design and construct equipment to manufacture large-diameter optical-fiber end caps, which are core parts of high-power fiber lasers, and we fabricate large-diameter optical-fiber end caps using this home-made equipment. The equipment consists of a CO2 laser as the fusion-splice heat source, a precision stage assembly for positioning a large-diameter optical fiber and an end cap, and a vision system used for alignment, with the fusion splice interlocked with the stage assembly. The output of the laser source is interlocked with the stage assembly to control the output, and the equipment is built to align the polarization axis of the large-diameter polarization-maintaining optical fiber using the vision system. Optical-fiber end caps were manufactured by laser fusion splicing of a large-diameter polarization-maintaining optical fiber with a cladding diameter of 400 μm and an end cap of 10×5×2 mm³ (W×D×H) using the home-made equipment. The signal-light insertion loss, polarization extinction ratio, and beam quality M² of the fabricated large-diameter optical-fiber end caps were measured to be 0.6%, 16.7 dB, and 1.21, respectively.
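
As a small worked check of the reported figures, assuming the usual decibel definitions (the measurement setup itself is not reproduced here):

```python
import math

loss_fraction = 0.006                        # 0.6% signal-light insertion loss
loss_db = -10 * math.log10(1 - loss_fraction)
print(f"insertion loss ~ {loss_db:.3f} dB")  # ~0.026 dB

per_db = 16.7                                # polarization extinction ratio
ratio = 10 ** (per_db / 10)                  # power on wanted vs. unwanted axis
print(f"PER 16.7 dB ~ {ratio:.0f}:1 power ratio")  # ~47:1
```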

Cylindrical Object Recognition using Sensor Data Fusion (센서데이터 융합을 이용한 원주형 물체인식)

  • Kim, Dong-Gi;Yun, Gwang-Ik;Yun, Ji-Seop;Gang, Lee-Seok
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.7 no.8
    • /
    • pp.656-663
    • /
    • 2001
  • This paper presents a sensor fusion method to recognize a cylindrical object using a CCD camera, a laser slit beam, and ultrasonic sensors on a pan/tilt device. For object recognition with the vision sensor, an active light source projects a stripe pattern of light onto the object surface. The 2D image data are transformed into 3D data using the geometry between the camera and the laser slit beam. The ultrasonic sensor uses an ultrasonic transducer array mounted horizontally on the pan/tilt device. The time of flight is estimated by finding the maximum correlation between the received ultrasonic pulse and a set of stored templates, also called a matched filter (sketched below). The distance of flight is calculated by simply multiplying the time of flight by the speed of sound, and the maximum amplitude of the filtered signal is used to determine the face angle to the object. To determine the position and radius of cylindrical objects, we use statistical sensor fusion. Experimental results show that the fused data increase the reliability of the object recognition.

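A minimal sketch of the matched-filter time-of-flight estimate described above: cross-correlate the received trace with a stored pulse template and take the lag of maximum correlation. The sample rate, pulse shape, noise level, and the round-trip halving are assumptions, not values from the paper.

```python
import numpy as np

FS = 200_000                 # sample rate (Hz), assumed
C = 343.0                    # speed of sound in air (m/s)

t = np.arange(200) / FS
template = np.sin(2 * np.pi * 40_000 * t) * np.hanning(200)  # 40 kHz windowed burst

rx = np.random.normal(0.0, 0.05, 4000)       # noisy received trace
delay = 1200                                 # true echo delay in samples
rx[delay:delay + 200] += 0.5 * template      # echo buried in the noise

corr = np.correlate(rx, template, mode="valid")
tof = np.argmax(np.abs(corr)) / FS           # lag of maximum correlation
dist = tof * C / 2                           # halved here, assuming a pulse-echo round trip
print(f"TOF {tof * 1e3:.2f} ms -> range {dist:.2f} m")
```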