1. Introduction
There is an increasing demand for laser welding in the automotive industry. Compared with spot welding, laser welding enables the car body to have strong welds and light weight. Recently, robot systems equipped with a laser source and an optical fiber have been used in laser welding manufacturing, and real-time seam tracking is demanded in industry.
In order to ensure good quality in the welding process, the position error between the welding seam and the focal point of the laser beam should be maintained within 200 μm (because the beam size at the focal point is less than 500 μm). Although the welding jig fixes the car body tightly, the seam line may be distorted due to misalignment and modeling errors. Therefore, a seam tracking technology is needed that adjusts the position of the laser welding head by detecting the exact seam line.
Many kinds of sensory systems exist, and the majority of available sensors make use of optical means (imaging sensors, structured or other types of light) and triangulation methods. For welding purposes, several commercial products exist [1]. However, their price is still high, and the working distance is normally fixed according to the application. In particular, when the seam line is not given in advance or is distorted, a change of the working distance is needed.
The purpose of this study is to develop an intelligent welding profile sensor for a robotic laser welding system. In particular, we focus on the implementation of the profile measurement and seam tracking functions of the sensor. The developed welding profile sensor is applied to a laser welding robot system, and the sensor-guided control scheme of the whole system is also presented.
This paper is organized as follows. In Section 2, the development of the intelligent vision sensor is presented: both the design of the sensor and the image processing algorithms are explained, and the detection of the gap and seam is also given. In Section 3, the laser welding robot system is described: the three-dimensional robot welding system and its components are introduced, and the overall control architecture is illustrated. In Section 4, experimental results are shown. Finally, Section 5 concludes this paper.
2. Intelligent Vision Sensor
A new laser-stripe sensor has been developed. It consists of an IEEE 1394 camera and a stripe laser diode, and it is designed so that the angle between the camera and the laser diode can be changed to adjust the working distance. The camera parameters such as zoom and focus can also be adjusted accordingly. To reduce interference from other light sources, an appropriate bandpass filter is mounted in front of the lens. The developed system is shown in Fig. 1.
Fig. 1 Developed welding profile sensor
The image processing for the profile and the seam is performed on a PC. It includes several tasks such as thinning, calibration, and gap/seam recognition. Detailed explanations are given in the following subsections.
A. Profile Measurement
The first task in obtaining the profile is to extract the surface points from the laser line image. This process is referred to as thinning or skeletonization: an image processing operation that finds center lines in integer-valued image regions. Numerous algorithms, such as row-based maximum search, threshold methods, center of gravity, and correlation techniques, have been suggested for thinning. As studied in [2], the center of gravity (COG) method gives the best accuracy among these methods.
Let the horizontal and vertical image coordinates be u and v, respectively. Using the COG, the center coordinate at each horizontal position is given as
\(\begin{align} v_{\mathrm{COG}}(u)=\frac{\sum_{v} v\, I(u, v)}{\sum_{v} I(u, v)} \end{align}\) (1)
where \(I(u, v)\) is the intensity at (u, v). In typical industrial applications, where well-scattering parts are mixed with highly reflective or corroded zones, maximum detection is needed to differentiate the main peak from spurious ones. Briefly, maximum detection is followed by the center of gravity around the maximum in order to extract the profile from the images.
In addition, small sensor noise exists over the entire sensing region. This makes the result of the COG converge to the middle of the sensing region regardless of the real position of the peak. To alleviate this problem, thresholding is used to cut off the noise below a given value (\(Th_{\mathrm{high}}\)); this also reduces the total computation of the COG. The resultant coordinate \(v_{\mathrm{COG}}\) is then examined to check whether it actually lies in the laser line region: the coordinate is considered valid if the intensity at that coordinate is higher than another threshold (\(Th_{\mathrm{low}}\)).
The thinning process is illustrated in Fig. 2. The upper-left image shows the laser line image, and the lower-left image shows the obtained center lines. The right panel shows the intensity profile in the vertical direction. The two thresholds are defined as shown, and the coordinate is computed from the values above the high threshold.
Fig. 2 Center of gravity (COG) with thresholding
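To make the procedure concrete, the following Python/NumPy sketch combines the per-column maximum detection, the COG of Eq. (1) restricted to a window around the peak, and the two thresholds. It is a minimal illustration under assumed names and an assumed peak window half-width, not the sensor's actual implementation.

```python
import numpy as np

def thin_profile(image, th_high, th_low, half_win=5):
    """Per-column peak detection followed by COG with double thresholding.

    image: 2D intensity array indexed as image[v, u]. Returns one sub-pixel
    v-coordinate per column u (NaN where no valid laser peak is found).
    The peak window half-width (half_win) is an illustrative assumption.
    """
    n_v, n_u = image.shape
    centers = np.full(n_u, np.nan)
    for u in range(n_u):
        col = image[:, u].astype(float)
        v_max = int(np.argmax(col))                   # main-peak detection
        lo, hi = max(0, v_max - half_win), min(n_v, v_max + half_win + 1)
        win = col[lo:hi].copy()
        win[win < th_high] = 0.0                      # cut off noise below Th_high
        total = win.sum()
        if total == 0.0:
            continue                                  # no laser line in this column
        v_idx = np.arange(lo, hi, dtype=float)
        v_cog = (win * v_idx).sum() / total           # Eq. (1) around the peak
        if col[int(round(v_cog))] >= th_low:          # validity check with Th_low
            centers[u] = v_cog
    return centers
```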
The above process applies to the vertical direction only. Depending on the material and the reflection condition of the surface, the obtained center lines often form irregular surfaces: the surface profile shows differences of several pixels between adjacent coordinates in the horizontal direction. In most cases, a smooth profile is expected. For this purpose, an averaging window is used: the coordinate is computed in an averaging window of size N by considering the closest (N − 1)/2 neighbors on each side of the horizontal position u. Fig. 3 depicts the smoothing process with a window size of 3.
Fig. 3 Averaging window for smoothing
The window size affects the characteristics of the profile measurement. A small window gives a bumpy surface but better localization of abrupt depth changes. On the other hand, a large window gives a smooth surface, but it makes the gap or step hard to localize. Fig. 4 shows the averaging results for window sizes of 3, 11, and 21, respectively. We obtain the best result with N = 3, because gap and seam detection is relatively important in the welding application.
Fig. 4 Averaging results with different window sizes
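A minimal sketch of the averaging window, assuming an odd window size N and the NaN convention of the thinning sketch above; positions whose window contains an invalid coordinate are left unchanged.

```python
import numpy as np

def smooth_profile(centers, n=3):
    """Average each coordinate with its (N-1)/2 nearest neighbors per side."""
    half = (n - 1) // 2
    smoothed = centers.copy()
    for u in range(half, len(centers) - half):
        window = centers[u - half:u + half + 1]
        if not np.any(np.isnan(window)):    # skip windows with invalid points
            smoothed[u] = window.mean()
    return smoothed
```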
Finally, camera calibration is needed to convert the profile points in the image plane to points in real Cartesian coordinates. Because the profile points lie in the triangular light plane projected by the laser, their 3D coordinates can be obtained uniquely. The relation between the two coordinate systems is represented as a direct linear transformation (DLT) [3], and its coefficients can be determined from corresponding points. To provide the corresponding points in each space, a calibration block having stair steps is used [4]. We also used the Harris corner detector, which helps in selecting the corresponding points.
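Since all profile points lie in the light plane, the DLT here reduces to a plane-to-plane projective mapping with eight independent coefficients, which at least four non-degenerate correspondences determine. The following sketch fits the coefficients with a standard null-space solution; the (y, z) parameterization of the light plane and all names are illustrative assumptions.

```python
import numpy as np

def fit_dlt(img_pts, plane_pts):
    """Fit the image-to-light-plane DLT from (u, v) -> (y, z) correspondences.

    Each correspondence yields two homogeneous linear constraints on the
    nine entries of the 3x3 matrix H; the solution is the singular vector
    of the smallest singular value. Needs at least 4 correspondences.
    """
    rows = []
    for (u, v), (y, z) in zip(img_pts, plane_pts):
        rows.append([u, v, 1, 0, 0, 0, -u * y, -v * y, -y])
        rows.append([0, 0, 0, u, v, 1, -u * z, -v * z, -z])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def image_to_plane(H, u, v):
    """Map an image point to light-plane coordinates (perspective division)."""
    y, z, w = H @ np.array([u, v, 1.0])
    return y / w, z / w
```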
The calibration procedure should be performed for each configuration so that the sensor can operate at different working distances. The desired accuracy is 200 μm, and thus the camera should have about a 20 mm field of view in the depth direction. The working distance can vary from 100 mm to 300 mm, and 5 uniformly separated configurations are considered. The relative angle between the camera and the laser diode and the lens parameters are adjusted appropriately: the relative angle is determined so that the nominal profile lies at the middle of the image plane; the zoom factor is selected first in order to view almost the same physical depth range, and the focus is then set to obtain the clearest laser image. After the calibration procedures for each configuration are performed, all parameters are stored in memory. The configuration is selected automatically according to the average depth of the last profile scan. To avoid oscillatory changes between two neighboring configurations, hysteresis is used near the boundaries, as sketched below.
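The configuration switching can be sketched as follows; the boundary depths and the hysteresis band width are illustrative assumptions, not the sensor's actual settings.

```python
def select_configuration(avg_depth, current_cfg, boundaries, hysteresis=5.0):
    """Pick a working-distance configuration from the last scan's average depth.

    boundaries: sorted depths (mm) separating the configurations, e.g.
    [140, 180, 220, 260] splitting 100-300 mm into 5 bands (assumed values).
    Near a boundary, the current configuration is kept (hysteresis).
    """
    cfg = sum(avg_depth > b for b in boundaries)      # nominal band index
    if cfg != current_cfg:
        nearest = boundaries[min(cfg, current_cfg)]   # first boundary crossed
        if abs(avg_depth - nearest) < hysteresis:
            return current_cfg                        # too close: do not switch
    return cfg
```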
B. Gap and Seam Extraction
Gap and seam extraction are critical operations in robotic laser welding, because the welding quality is directly influenced by them. The shape shown in Fig. 4 is an example with a gap, and we want to find the width of the gap. Seam detection varies according to the joint type: butt, lap, V-groove, fillet joints, and so on. However, the principle is the same for all cases. In this research, the butt joint is considered because it is common in automotive laser welding and its detection is relatively difficult. In both gap and seam detection, an elementary task is to recognize differences in the depth profile. This task is similar to edge detection in image processing, with depth taking the role of intensity.
The simplest method for edge detection is to find the depth differences between adjacent positions. Generally, this is done by applying a mask to the depth profile; the maximum (minimum) position of the response is then extracted as a rising (falling) edge. Among numerous edge detectors, the Canny edge detector is the most popular because it gives the best results in many applications [5]. The edge is found so that three criteria, i.e., good detection, good localization, and uniqueness of response, are optimized. Generally, a mask that resembles the shape of the edge itself is the optimal filter. For a step edge in a noise-free environment, the difference of boxes is the optimal mask. For a step edge in a noisy environment, however, it is hard to find the extremum position with the difference of boxes because several extrema occur. In this case, another type of mask is optimal, and the difference of Gaussians (DOG) is used to approximate it.
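The extremum-based edge detection can be sketched as below. A first-derivative-of-Gaussian kernel is used here as a smooth antisymmetric stand-in for the DOG mask described above; σ, the kernel radius, and the function names are illustrative assumptions.

```python
import numpy as np

def edge_kernel(sigma=2.0, radius=8):
    """Antisymmetric smoothing-derivative mask for step edges."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = -x * np.exp(-x**2 / (2.0 * sigma**2))
    return k / np.abs(k).sum()

def measure_gap(depth, dx, sigma=2.0):
    """Estimate a gap width from a 1-D depth profile.

    depth: depth per horizontal position; dx: horizontal resolution (mm).
    The response maximum marks the rising edge, the minimum the falling
    edge; the gap width is their separation in mm.
    """
    response = np.convolve(depth, edge_kernel(sigma), mode="same")
    rising = int(np.argmax(response))
    falling = int(np.argmin(response))
    return abs(rising - falling) * dx, falling, rising
```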
In this research, the DOG operator is used to find the step edges, and the gap is estimated as the distance between the rising and falling edges. For seam detection, the edge detection method can also be utilized: a matched filter is applied to the profile data according to the shape of the joint. Theoretically, it is impossible to localize the seam of an ideal butt joint. In practice, however, the cutting edges of the joints hardly ever match each other exactly, and diffuse reflections occur at the joint. In addition, the jig cannot fix the joints perfectly. In Fig. 5, a sample image of a butt joint is given. Minute differences can be found at the seam location, and the thickness of the laser line is smaller there than in other regions. Therefore, the recognition of the narrow laser line, as well as edge detection, is used for the seam detection of a butt joint; a sketch follows the figure. The detected seam location is also indicated.
Fig. 5 Seam detection of a butt joint
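The narrow-line cue can be sketched as follows: the laser-line thickness is counted per column as the number of pixels above \(Th_{\mathrm{low}}\), and the narrowest valid column is taken as a seam candidate. Combining this cue with the edge response, as described above, is omitted here for brevity; names are illustrative.

```python
import numpy as np

def line_thickness(image, th_low):
    """Laser-line thickness per column: pixels above Th_low (image[v, u])."""
    return (image.astype(float) >= th_low).sum(axis=0)

def butt_seam_candidate(image, th_low):
    """Column where the laser line is narrowest but still present."""
    t = line_thickness(image, th_low)
    valid = np.where(t > 0)[0]           # ignore columns without a laser line
    if valid.size == 0:
        return None
    return int(valid[np.argmin(t[valid])])
```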
3. Laser Welding Robot System
The developed welding profile sensor is applied to the laser welding robot system. The robot, the seam tracking system, and a CW Nd:YAG laser are used for three-dimensional robotic laser welding, as shown in Fig. 6. A 4 kW Nd:YAG laser and a 6-axis industrial robot are used in this study. The precise positioning (< 200 μm) of the laser beam on the joint to be assembled is achieved by the seam tracker. In this system, the developed welding profile sensor is substituted for the previous seam tracker. Butt and lap joints are considered as the welding joints of the car body.
Fig. 6 Three dimensional laser welding
The welding seam is not straight but slightly distorted, owing to the difficulty of cutting the strong, thin steel plate longitudinally. Moreover, thermal deformation can take place during the welding process. Therefore, it is necessary to develop a seam tracking facility and a welding apparatus that solve such problems. Fig. 7 shows the total system configuration of the sensor-guided laser welding robot system.
Fig. 7 Sensor guided laser welding robot system
The seam tracking sensor can be used both for teaching the seam locations prior to laser welding and for real-time tracking during laser welding. In real-time tracking, the sensor has to measure at least some distance ahead of the laser focal point to avoid disturbances caused by the welding process. The positional error of the laser focal position with respect to the weld seam depends significantly on this look-ahead distance. The information about the seam line is sent to the main controller; the corrected trajectories are then generated, and the motion commands are transmitted to the robot controller.
A. Control Architecture
Previously, the robot trajectories were taught by a human operator, and the seam locations were given through the operation panel. In this research, the seam locations are not given in advance, and the laser-stripe sensor guides the robot to track the seam line.
For the experiment, the robot initially moves in the welding direction following a straight line. The control to track the seam line is performed in the inner control loop shown in Fig. 8. The robot position, the profile, and the seam information are obtained; from these, the robot trajectories are generated, and the desired motion commands are sent to the robot controller. The control is synchronized with the operation of the stripe sensor, and the control interval is about 70 ms. After the motion commands are transferred to the robot controller, it takes tens of milliseconds for the robot to reach the commanded position. Therefore, this delay needs to be compensated.
Fig. 8 Control architecture
Sensor-guided control methods have been proposed by several researchers [6], [7]. The operating frequency of the sensor, the look-ahead distance, and the synchronization affect the tracking performance. In this study, a predictive control scheme similar to the trajectory-based control of [6] is used. Fig. 9 depicts this control scheme.
Fig. 9 Sensor guided control scheme
The sensor is mounted ahead of the welding position by the look-ahead distance. As the sensor moves in front of the welding head, the seam locations are captured in advance: they are obtained at each horizontal interval and stored in the seam location buffer. The current velocity of the robot is measured, and the average velocity is updated as:
\(\begin{align} v_{\mathrm{avr}}(k)=\alpha\, v_{\mathrm{current}}(k)+(1-\alpha)\, v_{\mathrm{avr}}(k-1) \end{align}\) (2)
where \(\alpha\) is a constant with \(0 \leq \alpha \leq 1\).
With the average velocity of the robot, the predicted position of the robot in the welding direction is estimated as:
\(\begin{align} x_{\mathrm{pred}}(k)=x_{\mathrm{current}}(k)+k\, v_{\mathrm{avr}}(k) \end{align}\) (3)
where k is the prediction step in units of the control interval. Note that the predicted position in the welding direction should be located before the last seam location in the welding direction, and the increased sampling time ensures this condition. Finally, the seam location (or deviation) at the predicted position is computed; linear interpolation is used to find seam locations that are not in the seam location buffer. With this simple predictive scheme, tracking control of the seam location is achieved.
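A compact sketch of this predictive scheme is given below, combining Eqs. (2) and (3) with the interpolated buffer lookup. The smoothing constant, the prediction step, and the buffer layout are illustrative assumptions; the velocity is assumed to be expressed in distance per control interval so that Eq. (3) holds as written, and the buffer is assumed to be filled in ascending order of the welding direction.

```python
import numpy as np

class SeamTracker:
    """Minimal sketch of the predictive seam tracking of Eqs. (2) and (3)."""

    def __init__(self, alpha=0.3, k_pred=3):
        self.alpha = alpha            # smoothing constant, 0 <= alpha <= 1
        self.k_pred = k_pred          # prediction step k of Eq. (3)
        self.v_avg = 0.0              # running average velocity v_avr
        self.buf_x = []               # seam location buffer: positions ...
        self.buf_dev = []             # ... and lateral deviations

    def add_seam_point(self, x, deviation):
        """Store a seam location scanned ahead of the welding head."""
        self.buf_x.append(x)
        self.buf_dev.append(deviation)

    def correction(self, x_current, v_current):
        """Lateral correction at the predicted welding position."""
        # Eq. (2): exponential averaging of the measured velocity.
        self.v_avg = self.alpha * v_current + (1.0 - self.alpha) * self.v_avg
        # Eq. (3): position k control intervals ahead of the robot.
        x_pred = x_current + self.k_pred * self.v_avg
        # Linear interpolation between buffered seam locations.
        return np.interp(x_pred, self.buf_x, self.buf_dev)
```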
4. Experiments
In this section, the measurement accuracy is evaluated through several experiments: the profile measurement, the gap measurement, and the seam tracking test.
First, the profile of the calibration block was measured. The calibration block is designed with stair steps and is used to calibrate the laser-stripe sensor; since its exact dimensions are known, the measurement accuracy can be evaluated. The robot moves at a velocity of 0.6 m/min, and the profile measurement is performed every 1 mm. The aggregated result is shown in Fig. 10. The measurement accuracy in depth is about 120 μm.
Fig. 10 Profile measurement of the calibration block
Several objects, including a calibration block, a lap joint, and an object with holes, have been scanned using the robot system. The sample objects and their obtained profiles are shown in Fig. 11. The profiles have quite smooth surfaces. However, there are regions for which no profile data can be obtained: because the laser diode is inclined, some parts of the profile fall in the shaded region and cannot be measured. An interpolation technique can be used to fill in the missing profiles in this case.
Fig. 11 Sample specimens and their profiles
Second, the gap measurement experiment was performed. A specimen was made for this test; it has gaps of various widths ranging from 0.2 mm to 1.0 mm. The specimen is shown in Fig. 12.
Fig. 12 Specimen with various gap widths
The results of the gap measurement are shown in Fig. 13.
Fig. 13 Gap measurement
The experiment was performed at different welding speeds (0.6 m/min and 3.0 m/min). The root mean squared errors (RMSE) of the two cases are 48 μm and 35 μm, respectively. At the increased speed, more vibration exists, which degrades the accuracy; the averaging window in depth plays an important role in the robust estimation. The accuracy of the gap measurement is better than that of the profile measurement because the horizontal resolution is higher than the vertical resolution.
Finally, the S-shaped seam tracking was carried out. Fig. 14 shows the S-shaped butt joint and its dimensions. Considering that most joint shapes in automobile parts are straight lines, tracking the S-shaped butt joint is a demanding task. In this experiment, the actual seam line is not given in advance, as mentioned earlier, and the robot starts by following a straight line. The seam line is scanned with the laser-stripe sensor, and the trajectory of the robot is modified accordingly. The experiments were performed at different welding speeds (1.2 m/min and 2.4 m/min). The tracking results are shown in Fig. 15. Again, the increased speed degrades the tracking performance, but the resulting accuracy of the seam locations is well within 200 μm for both cases. Fig. 16 shows still images taken during the tracking of the S-shaped seam. The spot indicates the virtual welding position, and it is found to track the seam locations with good accuracy.
Fig. 14 S-shaped seam line and its dimension
Fig. 15 Seam tracking results
Fig. 16 Still images of S-shaped seam tracking
5. Conclusion
In this paper, the development of an intelligent vision sensor for robotic laser welding has been presented. A laser-stripe sensor was newly designed to operate at different working distances. To measure the 3D profile data correctly, COG-based thinning with thresholding was implemented, and averaging methods were used for robust estimation. The detection of the gap and seam is achieved from the obtained depth profile.
The developed welding profile sensor was applied to the laser welding robot system, where it substitutes for a commercial seam tracker. With the guidance of the sensor, the robot can follow the seam of the welding part; a simple predictive control scheme is used for the tracking control. The measurement and tracking accuracies were evaluated through various experiments.
As future work, a robust technique to fill the occluded regions will be implemented. In addition, making the image processing procedures more efficient and more robust is important for enhancing the tracking performance.
Acknowledgments
This work was supported by the Intelligent Robot R&D Project (New Growth Engine of Korea), Ministry of Commerce, Industry and Energy.
References
[1] J. Forest and J. Salvi, "A review of laser scanning three-dimensional digitisers," Proc. IEEE Conf. on Intelligent Robots and Systems (IROS'02), pp. 73-78, Lausanne, Switzerland, October 2002.
[2] K. Haug and G. Pritschow, "Robust laser-stripe sensor for automated weld-seam-tracking in the shipbuilding industry," Proc. Conf. of the IEEE Industrial Electronics Society (IECON'98), pp. 1236-1241, Aachen, Germany, August 1998.
[3] Y.I. Abdel-Aziz and H.M. Karara, "Direct linear transformation from comparator coordinates into object space coordinates in close-range photogrammetry," Proc. Symposium on Close-Range Photogrammetry, pp. 1-18, 1971.
[4] Z. Zhang, "A flexible new technique for camera calibration," IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, No. 11, pp. 1330-1334, 2000. https://doi.org/10.1109/34.888718
[5] J. Canny, "A computational approach to edge detection," IEEE Trans. Pattern Anal. Mach. Intell., Vol. 8, No. 6, pp. 679-698, 1986. https://doi.org/10.1109/TPAMI.1986.4767851
[6] M.W. Graaf, R.G.K.M. Aarts, J. Meijer, and J.B. Jonker, "Robot-sensor synchronization for real-time seam-tracking in robotic laser welding," Proc. WLT-Conf. on Lasers in Manufacturing, pp. 419-424, Munich, Germany, June 2005.
[7] L. Zhou, T. Lin, and S.B. Chen, "Autonomous acquisition of seam coordinates for arc welding robot based on visual servoing," Journal of Intelligent and Robotic Systems, Vol. 47, No. 3, pp. 239-255, 2006. https://doi.org/10.1007/s10846-006-9078-9