• Title/Summary/Keyword: autonomous welding robot


Environment Modeling for Autonomous Welding Robots

  • Kim, Min-Y.; Cho, Hyung-Suk; Kim, Jae-Hoon
    • Transactions on Control, Automation and Systems Engineering / v.3 no.2 / pp.124-132 / 2001
  • Automation of the welding process in shipyards is ultimately necessary, since the welding site is spatially enclosed by floors and girders, and welding operators are therefore exposed to hostile working conditions. To solve this problem, a welding robot that can navigate autonomously within the enclosure needs to be developed. To achieve the welding task, the robotic welding system needs a sensor system for recognition of the working environment and weld seam tracking, and a specially designed environment recognition strategy. In this paper, a three-dimensional laser vision system based on optical triangulation is developed in order to provide robots with a map of the work environment. At the same time, an environment recognition strategy for the mobile welding robot is proposed in order to recognize the work environment efficiently. The design of the sensor system, the algorithm for sensing the structured environment, and the recognition strategy and tactics for sensing the work environment are described and discussed in detail.

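The first paper's range sensing relies on optical triangulation between a laser projector and a camera. A minimal sketch of that geometric computation follows, assuming a simplified single-spot setup; the parameter names (u_px, f_px, baseline_m, laser_angle_rad) and the example numbers are illustrative, not taken from the paper.

```python
import numpy as np

def triangulate_depth(u_px, f_px, baseline_m, laser_angle_rad):
    """Range Z (m) of a laser spot by optical triangulation.

    u_px            : image column offset of the spot from the principal point (pixels)
    f_px            : camera focal length (pixels)
    baseline_m      : camera-laser baseline (m)
    laser_angle_rad : tilt of the laser beam toward the optical axis (rad)

    Intersecting the camera ray X = Z*u/f with the laser ray
    X = b - Z*tan(theta) gives Z = b*f / (u + f*tan(theta)).
    """
    return baseline_m * f_px / (u_px + f_px * np.tan(laser_angle_rad))

# Example: 600 px focal length, 80 mm baseline, laser parallel to the optical axis.
print(triangulate_depth(u_px=40.0, f_px=600.0, baseline_m=0.08, laser_angle_rad=0.0))  # 1.2 m
```

Repeating this computation along the projected laser stripe while the sensor scans the scene is what builds up the environment map described in the abstract.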

Visual Sensor Design and Environment Modeling for Autonomous Mobile Welding Robots

  • Kim, Min-Yeong; Jo, Hyeong-Seok; Kim, Jae-Hun
    • Journal of Institute of Control, Robotics and Systems / v.8 no.9 / pp.776-787 / 2002
  • Automation of the welding process in shipyards is ultimately necessary, since the welding site is spatially enclosed by floors and girders, and welding operators are therefore exposed to hostile working conditions. To solve this problem, a mobile welding robot that can navigate autonomously within the enclosure has been developed. To achieve the welding task in the closed space, the robotic welding system needs a sensor system for working-environment recognition and weld seam tracking, and a specially designed environment recognition strategy. In this paper, a three-dimensional laser vision system based on optical triangulation is developed in order to provide robots with a 3D map of the work environment. Using this sensor system, a spatial filter based on neural network technology is designed for extracting the center of the laser stripe, and is evaluated in various situations. An environment modeling algorithm structure is proposed and tested, composed of a laser scanning module for 3D voxel modeling and a plane reconstruction module for mobile robot localization. Finally, an environment recognition strategy for the mobile welding robot is developed in order to recognize the work environment efficiently. The design of the sensor system, the algorithm for sensing the partially structured environment with plane segments, and the recognition strategy and tactics for sensing the work environment are described and discussed in detail with a series of experiments.
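
As a rough illustration of the stripe-center-extraction step mentioned in the abstract above, the sketch below uses a classical per-column intensity-weighted centroid rather than the paper's neural-network spatial filter; the assumption of a roughly horizontal stripe (one center per column) and the intensity threshold are illustrative choices, not the authors'.

```python
import numpy as np

def stripe_centers(img, min_intensity=30):
    """Sub-pixel row of the laser stripe in each image column, computed as the
    intensity-weighted centroid of pixels above a threshold.
    Columns without any stripe pixels return NaN."""
    rows = np.arange(img.shape[0], dtype=float)[:, None]        # row index of every pixel
    w = np.where(img >= min_intensity, img.astype(float), 0.0)  # keep only bright pixels
    mass = w.sum(axis=0)
    centers = np.full(img.shape[1], np.nan)
    valid = mass > 0
    centers[valid] = (w * rows).sum(axis=0)[valid] / mass[valid]
    return centers

# Example with a synthetic 3-pixel-wide horizontal stripe at rows 200-202.
img = np.zeros((480, 640), dtype=np.uint8)
img[200:203, :] = 255
print(stripe_centers(img)[0])   # 201.0
```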

Hand-Eye Laser Range Finder based Welding Plane Recognition Method for Autonomous Robotic Welding

  • Park, Jae Byung; Lee, Sung Min
    • Journal of the Institute of Electronics and Information Engineers / v.49 no.9 / pp.307-313 / 2012
  • This paper proposes a hand-eye laser range finder (LRF) based welding plane recognition method for autonomous robotic welding. Robotic welding is the process of joining a metal piece to the welding plane along a welding path predefined by the shape of the metal piece. Thus, for successful robotic welding, the position and direction of the welding plane should be detected exactly; if the detected position and direction of the plane are not accurate, the autonomous robotic welding will fail. For precise recognition of the welding plane, a line on the plane is detected by the LRF. To obtain the line on the plane, the Hough transform is applied to the data acquired from the LRF. Since the Hough transform is based on voting, sensor noise can be reduced. Two lines on the plane are obtained, before and after a rotation of the robot joint, and the direction of the plane is then calculated as the cross product of the direction vectors of the two lines. To verify the feasibility of the proposed method, a simulation is carried out with the robot simulator RoboticsLab, developed by Simlab Co., Ltd.
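
The plane-direction step described in this abstract combines two measured line directions by a cross product. A minimal sketch of that step follows; the function name and the use of a known point on the plane to compute the plane offset are illustrative additions, not the paper's notation.

```python
import numpy as np

def plane_from_two_lines(d1, d2, point_on_plane):
    """Unit plane normal n from two non-parallel line directions lying in the
    plane, plus the offset so that n . x = offset for points x on the plane."""
    n = np.cross(np.asarray(d1, float), np.asarray(d2, float))
    norm = np.linalg.norm(n)
    if norm < 1e-9:
        raise ValueError("line directions are (nearly) parallel")
    n /= norm
    return n, float(np.dot(n, point_on_plane))

# Example: two in-plane directions of the plane z = 1.
n, offset = plane_from_two_lines([1, 0, 0], [0, 1, 0], [0.3, -0.2, 1.0])
print(n, offset)   # [0. 0. 1.] 1.0
```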

Autonomous Calibration of a 2D Laser Displacement Sensor by Matching a Single Point on a Flat Structure

  • Joung, Ji Hoon; Kang, Tae-Sun; Shin, Hyeon-Ho; Kim, SooJong
    • Journal of Institute of Control, Robotics and Systems / v.20 no.2 / pp.218-222 / 2014
  • In this paper, we introduce an autonomous calibration method for a 2D laser displacement sensor (e.g., a laser vision sensor or laser range finder) based on matching a single point on a flat structure. Many arc welding robots are fitted with a 2D laser displacement sensor to expand their application by recognizing the environment (e.g., base metal and seam). In such systems, the sensing data must be transformed into the robot's coordinates, and the geometric relation (i.e., rotation and translation) between the robot coordinates and the sensor coordinates must be known for this transformation. Calibration is the process of inferring the geometric relation between the sensor and the robot. Generally, matching more than three points is required to infer this relation. However, we introduce a novel method that calibrates using only a single point match, together with a specific flat structure (i.e., a circular hole) that makes it possible to find the geometric relation from that single match. We fix the rotation component of the calibration result as a constant, so that only a single point is needed, by moving the robot to a specific pose. The flat structure can be installed easily at a manufacturing site, because it has almost no volume (i.e., it is nearly a 2D structure). The calibration process is fully autonomous and does not need any manual operation. The robot on which the sensor is installed moves to the specific pose by sensing features of the circular hole, such as the chord length and the center position of the chord. We show the precision of the proposed method through repeated experiments in various situations. Furthermore, we apply the result of the proposed method to sensor-based seam tracking with a robot and report the deviation of the robot's TCP (Tool Center Point) trajectory. This experiment shows that the proposed method ensures precision.
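
The calibration described in this abstract ultimately yields the rotation and translation that map sensor readings into robot coordinates. The sketch below shows only that final transformation for a single 2D laser displacement measurement, assuming homogeneous 4x4 transforms; the names T_base_tcp, R_tcp_sensor, and t_tcp_sensor and the example numbers are illustrative, not the paper's notation.

```python
import numpy as np

def sensor_to_base(p_sensor_xz, T_base_tcp, R_tcp_sensor, t_tcp_sensor):
    """Map a 2D laser displacement measurement (x along the stripe, z the
    measured range, y = 0 in the sensor frame) into robot base coordinates,
    given the calibrated rotation/translation from the TCP to the sensor frame."""
    T_tcp_sensor = np.eye(4)
    T_tcp_sensor[:3, :3] = R_tcp_sensor          # calibrated rotation
    T_tcp_sensor[:3, 3] = t_tcp_sensor           # calibrated translation
    p_sensor = np.array([p_sensor_xz[0], 0.0, p_sensor_xz[1], 1.0])
    return (T_base_tcp @ T_tcp_sensor @ p_sensor)[:3]

# Example: TCP 0.5 m above the base along z, sensor mounted 0.1 m below the TCP.
T_base_tcp = np.eye(4)
T_base_tcp[:3, 3] = [0.0, 0.0, 0.5]
p = sensor_to_base(p_sensor_xz=(0.02, 0.30),
                   T_base_tcp=T_base_tcp,
                   R_tcp_sensor=np.eye(3),
                   t_tcp_sensor=[0.0, 0.0, -0.1])
print(p)   # [0.02 0.   0.7 ]
```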