Article

A Novel Seam Tracking Technique with a Four-Step Method and Experimental Investigation of Robotic Welding Oriented to Complex Welding Seam

1
Guangzhou Institute of Advanced Technology, Chinese Academy of Sciences, Guangzhou 511458, China
2
School of Engineering Science, University of Chinese Academy of Sciences, Beijing 100049, China
3
School of Construction Machinery, Chang’an University, Xi’an 710064, China
4
Department of Robot Engineering, ERICA Campus, Hanyang University, Seoul 426-791, Korea
*
Author to whom correspondence should be addressed.
Sensors 2021, 21(9), 3067; https://doi.org/10.3390/s21093067
Submission received: 1 April 2021 / Revised: 24 April 2021 / Accepted: 25 April 2021 / Published: 28 April 2021
(This article belongs to the Special Issue Robotic Non-destructive Testing)

Abstract

The seam tracking operation is essential for extracting welding seam characteristics, which can instruct the motion of a welding robot along the welding seam path. Its chief tasks can be divided into four parts: detection of the starting and ending points; weld edge detection; joint width measurement; and determination of the welding path position with respect to the welding robot coordinate frame. A novel seam tracking technique with a four-step method is introduced. A laser sensor is used to scan grooves to obtain profile data, and the data are processed by a filtering algorithm to smooth the noise. A second derivative algorithm is proposed to initially position the feature points, and linear fitting is then performed to achieve precise positioning. The groove data are transformed into the robot's welding path through sensor pose calibration, which realizes real-time seam tracking. Experiments were carried out to verify the tracking effect on both straight and curved welding seams. Results show that the average deviations in the X direction are about 0.628 mm and 0.736 mm after the initial positioning of feature points. After precise positioning, the average deviations are reduced to 0.387 mm and 0.429 mm, so the tracking errors are decreased by 38.38% and 41.71%, respectively. Moreover, after precise positioning, the average deviations in both the X and Z directions for both straight and curved welding seams are no more than 0.5 mm. The proposed four-step seam tracking method is therefore feasible and effective, and provides a reference for future seam tracking research.

1. Introduction

Welding robots have become crucial to modern welding because manual welding yields low production rates while high-volume production demands profitability [1]. Robotic welding brings several advantages: it improves efficiency, weld quality, adaptability, and workspace utilization, and it reduces labor costs as well as unit cost [2].
However, most welding robots still operate in "teach and playback" mode, and their adaptability is insufficient when the welding object or other conditions change [3]. Welding is an empirical process influenced by numerous factors, such as pre-machining errors, workpiece fit-up, and in-process defects, all of which can cause variation in the welding seam. Welding robots in teach-and-playback mode cannot compensate for such variation and typically produce weldments with many defects and poor penetration [1].
There are generally three stages in robotic welding: (i) preparation—calibration, robot programming, and weld parameter, work-piece setting, (ii) welding—seam tracking, alternation of weld parameters in real time, (iii) analysis—weld quality inspection [4]. The seam tracking operation is essential for extracting weld seam characteristics which can be fed into the controller of welding robot to instruct the motion of the robot along the welding seam path. Seam tracking technology with laser vision sensing has the advantages of no contact, fast speed, and high precision, which are the keys to realizing welding automation and intelligence [5,6].
In order to fulfill the required welding accuracy for robotic welding, a seam tracking algorithm that enables the robot to plan its path along the actual welding line is necessary. Therefore, many studies have been conducted on automatic seam tracking using sensors such as tactile/touch probes and vision sensors [7,8], laser sensors [9,10], arc sensors [11,12], electromagnetic sensors [13,14], and ultrasonic sensors [15,16]. Sensors play a very important role in robotic seam tracking; the chief tasks are detection of the weld starting and ending points, weld edge detection, and joint width measurement.
A basic laser sensor consists of three parts: a laser diode, a CCD camera, and a filter. The laser diode produces a stripe or dot, which is captured by the camera. The CCD camera is fixed at an angle to the laser so as to properly capture the projection of the laser on the workpiece [17]. A welding seam tracking system based on laser vision combines laser measurement and computer vision technology. It offers rich information acquisition, distinct welding seam features, and strong anti-interference ability [18,19], which make it suitable for real-time tracking systems. A mathematical model that transforms the pixel coordinates of laser feature points into the three-dimensional coordinates of welding feature points, based on the mechanical structure of the sensor, was proposed in [20].
Chen et al. [21] proposed a feature point positioning method that needs only two profile scans and can effectively calculate the initial position of the weld. Chang et al. [22] filtered, differentiated, and convolved the weld profile data, and located the feature points by finding the local maxima. Wang et al. [23] established welding seam profile detection and feature point extraction algorithms based on a NURBS-snake and a visual attention model, and verified their effectiveness. Matsui et al. [24] introduced an adaptive welding robot system guided by a laser sensor for welding thin plates with gap variation in a single pass.
In a flexible welding process, Ciszak et al. [25] developed a low-cost system for identifying shapes in order to program industrial robots for welding in two dimensions. The robots were programmed to detect geometric shapes presented by humans and to approximate them, after which the robot could weld the same profiles on a two-dimensional plane. Such teach-and-playback programming is time-consuming, since the robot must be reprogrammed for each new task. Hairol et al. [26] suggested an alternative approach that can automatically recognize and locate the butt-welding position at the starting, middle, auxiliary, and end points under three conditions: (i) straight, (ii) saw-tooth, and (iii) curved joints. This was done without any prior knowledge of the shapes involved. As an automatic welding process may experience various disturbances, Li et al. [27] proposed a robust seam identification method based on cross-modal perception to precisely identify and automatically track the welding seam.
Wojciechowski et al. [28] proposed the method of automatic robotic assembly of two or more parts placed without fixing instrumentation and positioning on the pallet, which could support a robotic assembly process based on data from optical 3D scanners. The sequence of operations from scanning to place the parts in the installation position by an industrial robot was developed. Suszynski et al. [29] presented the concept of using an industrial robot equipped with a triangulation scanner in the assembly process in order to minimize the number of clamps that could hold the units in a particular position in space based on the proposed multistep processing algorithm.
These efforts have brought about many improvements in locating the feature points of the target weldment. However, there are certain limitations in positioning accuracy due to factors such as changes in the welding type (especially for complex welding seams) or surface defects of the weldment.
Due to these circumstances, we here introduce a novel seam tracking technique with a four-step method. First, a laser sensor is used to scan the groove of the weldment to collect profile data; then the data are processed by a filtering algorithm to smooth the noise; next, the second derivative algorithm is proposed to initially locate the feature points based on linear fitting to accurately locate the feature points; finally, according to the results of the sensor pose calibration, the three-dimensional coordinates in the base coordinate system of the welding robot are calculated from the two-dimensional coordinates of the image feature points, and the path planning is completed, with both the line and curve of the Y-shaped groove being targeted as well. The proposed seam tracking technique is tested and verified by way of experimental investigation.
Our proposed seam tracking technique with a four-step method utilizes edge detection and curvature recognition techniques based on laser scan data. The offset of the welding robot’s motion with respect to the welding seam is measured by a laser sensor. By adding a differential point searching method, the feature points of the cross-section of the welding seam are found. Comparing to other seam tracking algorithms, we show the improvement of the required welding accuracy oriented to complex welding seam through theoretical proof, simulation, and experiments.
This paper is organized as follows: Section 2 presents the seam tracking system composition; Section 3 introduces the seam tracking methodology with four steps; Section 4 shows the results of the experimental investigation based on the proposed seam tracking technique; Section 5 gives the conclusion and perspective.

2. Seam Tracking System Composition

The composition of the experimental platform of the six-axis robot arm seam tracking system is detailed in Figure 1. As shown in Figure 1, the platform is mainly composed of a motion execution mechanism with six degrees of freedom, a laser vision sensor, a D/A conversion module, an industrial computer, a robotic controller, and welding equipment (i.e., a welding power supply and a wire feeding device).
The execution mechanism is composed of two welding robots, and each of them has six degrees of freedom. The offset of the welding robot’s motion with respect to the welding seam is measured by a laser vision sensor. Through robotic welding experiments, images of molten pool morphology and welding geometry under different welding parameters can be obtained. The main tasks for seam tracking would be weld starting and ending point detection, weld edge detection, joint width measurement, and weld path position determination with regard to welding robot co-ordinate frame.

3. Seam Tracking Methodology with Four Steps

In this paper, we introduce a novel seam tracking technique with a four-step method: scanning, filtering, feature point extracting, and path planning. First, the profile information is obtained by scanning the groove with a laser sensor; then, the data are filtered to smooth the noise; next, the feature points are extracted by combining the second derivative algorithm with linear fitting; finally, the feature point data are converted into the welding seam path of the robot, guiding the welding torch so as to realize real-time tracking of the welding seam. The flowchart of the proposed four-step method is shown in Figure 2.

3.1. Scanning and Filtering

The purpose of scanning is to obtain the original data of the weldment groove profile, which is the basis for realizing seam tracking [30]. The laser sensor obtains the distance information of the measured object based on the principle of triangulation and then processes the scan data to obtain the profile feature of the measured object. While scanning, the sensor is fixed at the end-effector of the robot and parallel to the welding torch to ensure that the line laser is perpendicular to the measured object [31], covering the groove to the greatest extent, and at the same time, the welding robot is constantly moved to obtain the overall shape of the welding seam.
The combination of limiting filter and Gaussian filter is used to process the groove profile data obtained by scanning. The former is used to remove the pulse interference caused by accidental factors. The latter is used to smooth the data [32]. The data are processed using limiting filtering by comparing the absolute value of the difference between two adjacent sample values and the size of the threshold. Its principle can be expressed as [33]:
$$ y = \begin{cases} y_n, & |y_n - y_{n-1}| \le \Delta T \\ y_{n-1}, & |y_n - y_{n-1}| > \Delta T \end{cases} $$
where $y_n$ and $y_{n-1}$ are the current and previous sampled signal values, respectively, and $\Delta T$ represents the specified threshold.
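As an illustrative sketch (not the authors' implementation), the limiting filter above can be written in a few lines; the function name `limit_filter` and the list-based interface are assumptions:

```python
def limit_filter(samples, delta_t):
    """Limiting filter: reject a sample if it jumps more than
    delta_t from the previously accepted value."""
    if not samples:
        return []
    out = [samples[0]]
    for y in samples[1:]:
        # Keep y_n if |y_n - y_{n-1}| <= delta_t, else reuse y_{n-1}
        out.append(y if abs(y - out[-1]) <= delta_t else out[-1])
    return out
```

With `delta_t = 1.0`, an isolated spike such as `5.0` in `[0.0, 0.1, 5.0, 0.2]` is replaced by the previously accepted value, which removes pulse interference without disturbing the surrounding profile.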
Gaussian filtering is a type of linear smoothing filtering method that selects weights according to the shape of the Gaussian function. It is very effective in suppressing the noise that obeys the normal distribution [34], and the Gaussian function has good properties of symmetry, differentiability, and integrability. The function can accurately identify the discontinuous points of the signal, which is very beneficial for the subsequent feature points extracting. The expression of the one-dimensional Gaussian function can be described as [35]:
$$ f(x) = \frac{1}{\sigma \sqrt{2\pi}} \, e^{-\frac{(x-\mu)^2}{2\sigma^2}}, $$
where $\mu$ is the mean value, which determines the position of the function, and $\sigma$ is the standard deviation, which determines the width of the distribution.
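A minimal Gaussian smoothing pass consistent with the one-dimensional kernel above might be sketched as follows; the kernel truncation radius and the edge-clamping policy are implementation choices not specified in the paper:

```python
import math

def gaussian_kernel(sigma, radius):
    """Discrete 1-D Gaussian weights, normalized to sum to 1."""
    w = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    s = sum(w)
    return [v / s for v in w]

def gaussian_smooth(data, sigma=1.0, radius=2):
    """Convolve the profile with the kernel, clamping indices at the edges."""
    k = gaussian_kernel(sigma, radius)
    n = len(data)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(k, start=-radius):
            idx = min(max(i + j, 0), n - 1)  # clamp at the profile edges
            acc += w * data[idx]
        out.append(acc)
    return out
```

Because the weights sum to 1, a flat profile passes through unchanged, while normally distributed noise is attenuated in proportion to the chosen `sigma`.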

3.2. Feature Point Extracting

The feature points of the weldment are generally the corner points of the groove section, and their positions reflect the overall shape of the groove profile [36], so feature point extracting is required. This is done according to the cross-sectional characteristics of the weldment groove, combined with the related properties of the function discontinuities listed in Table 1. The groove feature points can be classified as follows: A, B, E, and F are the first type of feature points, and C and D are the second type, as shown in Figure 3.
Based on the above analysis, the feature points can be located by determining the types of feature points contained in the groove section, and then deriving them to find the extreme points.

3.2.1. Initial Positioning of Feature Points

The preliminary positioning method of the groove feature points is as follows: first, the original data are filtered; then, the first derivative is obtained by the forward difference method, and its extreme points are found to determine the first type of feature points, as shown in Figure 4. The abscissa and ordinate represent the X and Z axes of the sensor coordinate system, respectively.
It can be seen from the figures that the maxima of the first derivative fall between line segments BC and DE and fail to correspond exactly to B and E. This is because the groove of an actual weldment must be machined, so its blunt edge (root face) is not an ideal vertical line but a diagonal one. The second type of feature points are therefore transformed into the first type: the first derivative is differenced again to obtain the second derivative, and the points with the largest values locate all the feature points, as shown in Figure 4. At this stage, the six feature points of the trapezoidal groove have been preliminarily determined, and their locations are listed in Table 2.
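The initial positioning step (forward differences, then a search for extreme points) can be sketched as below. `strongest_corner` is a hypothetical helper that returns only the single strongest corner for brevity, whereas the paper locates all six feature points:

```python
def forward_diff(z, dx=1.0):
    """First-order forward difference of a sampled profile."""
    return [(z[i + 1] - z[i]) / dx for i in range(len(z) - 1)]

def strongest_corner(z):
    """Initial feature-point positioning: difference twice and return
    the index with the largest |second derivative| (a groove corner)."""
    d2 = forward_diff(forward_diff(z))
    return max(range(len(d2)), key=lambda i: abs(d2[i]))
```

For a V-shaped profile, the largest second-difference magnitude lands at the groove bottom; note that an index in second-derivative space maps back to the profile with a small offset introduced by the two forward differences.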

3.2.2. Precise Positioning of Feature Points

Due to the defects on the surface of the weldment, as given in Figure 5, the feature points obtained through preliminary positioning are b and c, while the true feature point should be a, which is clearly a deviation. Therefore, on the basis of preliminary positioning, linear fitting is performed on each segment of the groove to accurately locate the feature points.
Suppose any straight-line equation to be fitted is y = ax + b, and the calculation of equation parameters can be written as [37]:
$$ \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} \sum_{i=1}^{n} x_i^2 & \sum_{i=1}^{n} x_i \\ \sum_{i=1}^{n} x_i & n \end{bmatrix}^{-1} \begin{bmatrix} \sum_{i=1}^{n} x_i y_i \\ \sum_{i=1}^{n} y_i \end{bmatrix}, $$
where a is the slope, b is the intercept, (xi, yi) is the point passing through the straight line, and n is the number of points.
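A direct implementation of the normal equations above, with the 2 × 2 inverse expanded symbolically, might look like this (the function name is illustrative, not from the paper):

```python
def fit_line(points):
    """Least-squares fit of y = a*x + b via the 2x2 normal equations."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    det = n * sxx - sx * sx  # nonzero whenever the x values are not all equal
    a = (n * sxy - sx * sy) / det
    b = (sxx * sy - sx * sxy) / det
    return a, b
```

Fitting each groove segment this way and intersecting adjacent lines gives corner estimates that are insensitive to isolated surface defects, which is the point of the precise-positioning stage.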
The fitting results are shown in Figure 6, and the relevant parameters of the straight line are illustrated in Table 3.
Here, SSE is the sum of squared errors between the fitted line and the corresponding points of the original data; the smaller its value, the better the fit. R-squared is the coefficient of determination, which characterizes the quality of the fit [38]; the closer its value is to 1, the better the fit. The parameters in Table 3 indicate that each straight line is fitted well. The results of precise positioning of the feature points are listed in Table 4. This completes the feature point extraction for the trapezoidal groove profile.
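The two goodness-of-fit measures can be computed as follows (a sketch; `goodness_of_fit` is a name introduced here, not the paper's):

```python
def goodness_of_fit(points, a, b):
    """SSE and R-squared for the fitted line y = a*x + b."""
    ys = [y for _, y in points]
    mean_y = sum(ys) / len(ys)
    # SSE: squared residuals against the fitted line
    sse = sum((y - (a * x + b)) ** 2 for x, y in points)
    # SST: total variation of y about its mean
    sst = sum((y - mean_y) ** 2 for y in ys)
    return sse, 1.0 - sse / sst
```

For points lying exactly on the line, SSE is 0 and R-squared is 1, matching the interpretation given above.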

3.3. Path Planning

Because the data measured by the laser sensor are based on their own coordinate system, it is necessary to convert the feature points to the base coordinate system of the welding robot through pose calibration [39].
The relationship between the two coordinate systems of the robot is depicted in Figure 7. Sensor calibration determines the transformation matrix ${}^{E}_{S}T$ of {S} relative to {E}.
This paper uses the multipoint method for calibration [40]. The main steps are as follows:
  • Select a point P on the weldment, move the tip of the welding torch to this point, and record the position of P in the {B} coordinate system, ${}^{B}P = (x_B, y_B, z_B, 1)^T$, as shown in Figure 8a.
  • Move the robot so that the laser line of the sensor passes through this point, and record the position of P in the {S} coordinate system, ${}^{S}P = (x_S, 0, z_S, 1)^T$, as shown in Figure 8b.
  • Switch the current tool coordinate system of the robot to {E} and record the pose data of the robot at this time; from the Euler rotation equation, ${}^{B}_{E}R$ can be expressed as [41]:
    $$ {}^{B}_{E}R = \begin{bmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\gamma & -\sin\gamma \\ 0 & \sin\gamma & \cos\gamma \end{bmatrix} = \begin{bmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{bmatrix}, $$
    where α, β, γ are the rotation angles of the X, Y, and Z axes of the tool coordinate system {E}, respectively.
Then, ${}^{B}_{E}T$ can be simplified to
$$ {}^{B}_{E}T = \begin{bmatrix} {}^{B}_{E}R & {}^{E}P \\ 0 \;\; 0 \;\; 0 & 1 \end{bmatrix}, $$
where ${}^{E}P = (x_E, y_E, z_E)^T$, that is, the position of point P in the tool coordinate system {E} after the coordinate system is switched.
According to the transformation relationship of point P in space:
$$ {}^{B}P = {}^{B}_{E}T \; {}^{E}_{S}T \; {}^{S}P, $$
where the definition of each parameter in the formula is consistent with the above.
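Under the homogeneous-transform convention above, lifting a sensor-frame point to the robot base frame is two 4 × 4 matrix applications. A sketch using plain nested lists (the representation and function names are assumptions):

```python
def mat_vec(T, p):
    """Apply a 4x4 homogeneous transform T to a point p = (x, y, z, 1)."""
    return [sum(T[r][c] * p[c] for c in range(4)) for r in range(4)]

def sensor_to_base(T_be, T_es, p_s):
    """B_P = (B_E)T * (E_S)T * S_P: sensor-frame point -> base frame."""
    return mat_vec(T_be, mat_vec(T_es, p_s))
```

In practice, `T_es` would hold the calibrated sensor-to-end-effector matrix, while `T_be` is rebuilt every sampling period from the robot's reported pose, so each scanned groove point lands in the base frame of the path planner.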
Since ${}^{E}_{S}T$ contains 12 unknowns, at least 3 different fixed points need to be selected to solve it. The calibration result in this paper is as follows:
$$ {}^{E}_{S}T = \begin{bmatrix} 0.998 & 0.423 & 0.590 & 75.098 \\ 0.014 & 0.278 & 0.026 & 6.693 \\ 0.002 & 0.865 & 0.814 & 303.131 \\ 0 & 0 & 0 & 1 \end{bmatrix}, $$
At this point, the pose calibration of the sensor is completed. For any known point ${}^{S}Q$ in the sensor coordinate system, the formula to transform it into the robot base coordinate system can be written as
$$ {}^{B}Q = {}^{B}_{E}T \; {}^{E}_{S}T \; {}^{S}Q, $$
where ${}^{B}Q$ and ${}^{S}Q$ are the positions of point Q in the coordinate systems {B} and {S}, respectively; ${}^{E}_{S}T$ is the calibration result given above; and the definition and calculation of ${}^{B}_{E}T$ follow step 3.

4. Experimental Procedures

An experimental demonstration was carried out using the proposed four-step seam tracking method to guide the movement of the welding torch under actual testing conditions. Figure 9 shows the prototype of the whole experimental system, which mainly includes an ABB IRB 1410 welding robot, an IRC5 controller, an LS-100CN laser sensor, an Ehave CM350 welding power supply, an RS-485 communication module, and an industrial computer.
In this paper, two typical weldments with materials of A304 stainless steel are selected as the welding objects, the physical prototypes of two typical welding grooves are illustrated in Figure 10, and the groove parameters of the weldment with straight line and curve are listed in Table 5.
When scanning the welding groove, the laser sensor is set to the trigger mode, and the welding robot is constantly moved to obtain the overall shape characteristics of the welding seam. The process of scanning two typical welding grooves by the laser sensor is represented in Figure 11.
Before the experiment, we mark the starting and ending points of the welding path on the weldment, and then a section of motion trajectory is taught for the straight and curved grooves in "teach" mode, as shown in Figure 10. The red point is the teaching point, i.e., the position of the tip of the robotic welding torch. Multiple teaching points are connected to form a welding trajectory, and the pose data of the teaching trajectory in the welding torch coordinate system are recorded simultaneously and used as a reference to calculate the experimental deviation.
During the experiment, if the straight groove is taken as an example, let us first move the end-effector of the robot, i.e., the welding torch, along the teaching trajectory. When it reaches reference point L1, as shown in Figure 10a, the laser sensor will be turned on to scan the welding groove and collect data. At the same time, the current tool coordinate system of the welding robot will be switched to the end coordinate system, the position and posture data of the end coordinate system are obtained in real time through the API interface of the welding robot, and the sampling period is consistent with that of the laser sensor.
The welding robot continues to move. When the end of the welding torch moves to reference point L2, as shown in Figure 10a, the laser sensor will be turned off, the data transmission of the API interface is stopped, the data collection is completed. According to the feature points of the groove, the center point of the welding torch is calculated; according to the position and posture data of the end coordinate system obtained by API interface, the trajectory reference point is calculated. Through the calibration matrix of laser sensor (Formula (7)), the position data of the welding torch center point is transformed into the welding robot end coordinate system, and then through the calibration matrix of welding torch, it is transformed into the welding torch coordinate system.
After the above process, the groove data collected by the laser sensor are transformed into the center point data of the robotic welding torch, and the end coordinate system data collected by the API interface are transformed into the trajectory reference point data. The experimental results of two different welding grooves of straight and curved lines with both initial positioning and precise positioning using the proposed seam tracking method are compared in Figure 12.
The accuracy of the feature points positioning method is evaluated by comparing the deviation between the calculated welding center point and the actual welding torch end point. Among them, the average deviation d (mm) represents the average value of the difference between each welding center point and the end point of the welding torch; the deviation degree p (%) indicates the deviation degree of the deviation in this direction relative to the entire groove. The average deviation d (mm) and deviation degree p (%) can be written as:
$$ d_x = \frac{1}{n} \sum_{i=1}^{n} \left( x_{tcp}(i) - x_t(i) \right), \qquad d_z = \frac{1}{n} \sum_{i=1}^{n} \left( z_{tcp}(i) - z_t(i) \right), $$
where dx and dz are the average deviation in the X and Z directions, respectively. xtcp(i) and ztcp(i) are the coordinates of the welding center point, xt(i) and zt(i) are the coordinates of the trajectory reference point, respectively. n is the number of points.
$$ p_x = \frac{d_x}{l}, \qquad p_z = \frac{d_z}{h}, $$
where px and pz are the deviation degrees in the X and Z directions, respectively; l is the total length of the groove, and h is the depth of the groove.
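A sketch of these deviation metrics, using absolute differences so that positive and negative offsets do not cancel (a detail the formulas leave implicit); the function name and the point-pair interface are assumptions:

```python
def tracking_deviation(tcp_pts, ref_pts, groove_len, groove_depth):
    """Average deviations d_x, d_z (mm) and deviation degrees p_x, p_z
    between welding-center points and trajectory reference points.
    Points are (x, z) pairs in the torch coordinate system."""
    n = len(tcp_pts)
    dx = sum(abs(xc - xr) for (xc, _), (xr, _) in zip(tcp_pts, ref_pts)) / n
    dz = sum(abs(zc - zr) for (_, zc), (_, zr) in zip(tcp_pts, ref_pts)) / n
    return dx, dz, dx / groove_len, dz / groove_depth
```

Dividing by the groove length and depth normalizes the raw millimeter deviations into the dimensionless deviation degrees used for comparison in Table 6.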
The comparative results of different positioning methods for feature points are depicted in Table 6. As can be seen from the figures and table, the average deviations dx (mm) of the two different welding seams of both straight line and curve in the X direction are relatively large when only initial positioning is carried out. After precise positioning, the average deviations are reduced to 0.387 mm and 0.429 mm, respectively. Experimental procedures show promising results, in that the average deviations display a significant decrease by 38.38% and 41.71%, respectively.
It is worth noting that the average deviations in both the X and Z directions of the two welding seams (straight line and curve) after precise positioning are no more than 0.5 mm; this threshold is defined by Kovacevic et al. [42] and fulfills the minimum accuracy requirements of robotic welding. Therefore, the proposed four-step seam tracking method is feasible and effective, and provides a reference for future seam tracking research.

5. Conclusions

A novel seam tracking technique and experimental investigation of robotic welding oriented to complex welding seam are proposed in this study. Conclusions are as follows:
  • A set of seam tracking systems based on laser sensing and visual information extraction is designed, and the method involving scanning, filtering, feature points extracting, and path planning is proposed to realize high-precision seam tracking;
  • The groove information is collected through the laser sensor and the data are filtered, and the corresponding three-dimensional coordinate value in the sensor coordinate system is calculated using the two-dimensional coordinates of the image feature points;
  • The accuracy problem of feature point positioning when the weldment surface has defects is solved. Experimental results show that the average deviations of the welding feature points for both the straight line and the curve after precise positioning are less than 0.5 mm;
  • The experimental errors are mainly caused by the calibration error of the sensor coordinate system and the calculation error of the feature points extracting algorithm. In addition, increasing the resolution of the sensor could further improve the measurement accuracy.

Author Contributions

Conceptualization, G.Z. and Z.H.; methodology, G.Z. and S.T.; software, S.T., Y.Z., and Y.W.; validation, S.T. and Y.Z.; formal analysis, S.T., Z.X.; investigation, G.Z. and Z.H.; resources, S.T. and W.Y.; data curation, Y.Z., S.T., Y.W., and W.Y.; writing—original draft, S.T. and Y.Z.; writing—review and editing, G.Z., Z.H. and K.S.; visualization, S.T.; supervision, G.Z. and Z.H.; project administration, G.Z. and H.Y.; funding acquisition, G.Z. and Z.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded in part by the National Key Research and Development Project of China, grant number 2018YFA0902903, the National Natural Science Foundation of China, grant number 62073092, the Natural Science Foundation of Guangdong Province, grant number 2021A1515012638, the Basic Research Program of Guangzhou City of China, grant number 202002030320.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are openly available in [A Novel Seam Tracking Technique with A Four-Step Method and Experimental Investigation of Robotic Welding Oriented to Complex Welding Seam—research data] at [https://cloud.huawei.com/home#/collection/v2/all] (accessed on 15 April 2021).

Acknowledgments

The authors would like to express their thanks to the Guangzhou Institute of Advanced Technology, Chinese Academy of Sciences, for helping them with the experimental characterization.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rout, A.; Deepak, B.B.V.L.; Biswal, B.B. Advances in weld seam tracking techniques for robotic welding: A review. Robot. Comput. Integr. Manuf. 2019, 56, 12–37. [Google Scholar] [CrossRef]
  2. Pires, A.J.N.; Loureiro, T.; Godinho, P.; Ferreira, B.; Fernando, J.M. Welding robots. IEEE Robot. Autom. Mag. 2003, 10, 45–55. [Google Scholar] [CrossRef]
  3. Shao, W.J.; Huang, Y.; Zhang, Y. A novel weld seam detection method for space weld seam of narrow butt joint in laser welding. Opt. Laser Technol. 2018, 99, 39–51. [Google Scholar] [CrossRef]
  4. Pires, A.J.N.; Loureiro, G.B. Welding Robots: Technology, System Issues and Application; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  5. Lei, T.; Rong, Y.M.; Wang, H.; Huang, Y.; Li, M. A review of vision-aided robotic welding. Comput. Ind. 2020, 123, 103326–103355. [Google Scholar] [CrossRef]
  6. Hong, L.; Xiaoqi, C. Laser visual sensing for seam tracking in robotic arc welding of titanium alloys. Int. J. Adv. Manuf. Technol. 2005, 26, 1012–1017. [Google Scholar]
  7. Peiquan, X.; Guoxiang, X.; Xinhua, T.; Shun, Y. A visual seam tracking system for robotic arc welding. Int. J. Adv. Manuf. Technol. 2008, 37, 70–75. [Google Scholar]
  8. Shi, F.; Tao, L.; Chen, S. Efficient weld seam detection for robotic welding based on local image processing. Ind. Robot. Int. J. 2009, 56, 277–283. [Google Scholar] [CrossRef]
  9. Mikael, F.; Gunnar, B. Design and validation of a universal 6d seam tracking system in robotic welding based on laser scanning. Ind. Robot. Int. J. 2003, 30, 437–448. [Google Scholar]
  10. Wu, Q.-Q.; Lee, J.-P.; Park, M.-H.; Park, C.-K.; Kim, I.-S. A study on development of optimal noise filter algorithm for laser vision system in GMA welding. Procedia Eng. 2014, 97, 819–827. [Google Scholar] [CrossRef] [Green Version]
  11. Jeong, S.-K.; Lee, G.-Y.; Lee, W.-K.; Kim, S.-B. Development of high speed rotating arc sensor and seam tracking controller for welding robots. In Proceedings of the 2001 IEEE International Symposium on Industrial Electronics, (Cat. No.01TH8570), Pusan, Korea, 12–16 June 2001; pp. 845–850. [Google Scholar]
  12. Ushio, M.; Mao, W. Modelling of an arc sensor for dc mig/mag welding in open arc mode: Study of improvement of sensitivity and reliability of arc sensors in GMA welding. Weld. Int. 1996, 10, 622–631. [Google Scholar] [CrossRef]
  13. You, B.-H.; Kim, J.-W. A study on an automatic seam tracking system by using an electromagnetic sensor for sheet metal arc welding of butt joints. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 2002, 216, 911–920. [Google Scholar] [CrossRef]
  14. Kang-Yul, B.; Jin-Hyun, P. A study on development of inductive sensor for automatic weld seam tracking. J. Mater. Process. Technol. 2006, 176, 111–116. [Google Scholar]
  15. Freire, B.T.; Miguel, M.J.; Leopoldo, C.; Ramdn, C. Weld seams detection and recognition for robotic arc-welding through ultrasonic sensors. In Proceedings of the 1994 IEEE International Symposium on Industrial Electronics (ISIE’94), Santiago, Chile, 25–27 May 1994; pp. 310–315. [Google Scholar]
  16. Maqueira, B.; Umeagukwu, C.I.; Jarzynski, J. Application of ultrasonic sensors to robotic seam tracking. IEEE Trans. Robot. Autom. 1989, 5, 337–344. [Google Scholar] [CrossRef]
  17. He, Y.; Chen, Y.; Xu, Y.; Huang, Y.; Chen, S. Autonomous detection of weld seam profiles via a model of saliency-based visual attention for robotic arc welding. J. Intell. Robot. Syst. 2016, 81, 395–402. [Google Scholar] [CrossRef]
  18. Guo, J.C.; Zhu, Z.; Yu, Y.; Sun, B. Research and Application of Visual Sensing Technology Based on Laser Structured Light in Welding Industry. Chin. J. Lasers 2017, 44, 7–16. [Google Scholar]
  19. Hou, Z.; Xu, Y.L.; Xiao, R.Q.; Chen, S.B. A teaching-free welding method based on laser visual sensing system in robotic GMAW. Int. J. Adv. Manuf. Technol. 2020, 109, 1755–1774. [Google Scholar] [CrossRef]
20. Zou, Y.B.; Wang, Y.B.; Zhou, W.L. Research on Line Laser Seam Tracking Method based on Gaussian Kernelized Correlation Filters. Appl. Laser 2016, 36, 578–584. [Google Scholar]
  21. Chen, X.H.; Dharmawan, A.G.; Foong, S.H.; Soh, G.S. Seam tracking of large pipe structures for an agile robotic welding system mounted on scaffold structures. Robot. Comput. Integr. Manuf. 2018, 50, 242–255. [Google Scholar] [CrossRef]
  22. Chang, D.Y.; Son, D.H.; Lee, J.W.; Kim, T.W.; Lee, K.Y.; Kim, J.W. A new seam-tracking algorithm through characteristic-point detection for a portable welding robot. Robot. Comput. Integr. Manuf. 2012, 28, 1–13. [Google Scholar] [CrossRef]
  23. Wang, N.F.; Zhong, K.F.; Shi, X.D.; Zhang, X.M. A robust weld seam recognition method under heavy noise based on structured-light vision. Robot. Comput. Integr. Manuf. 2020, 61, 1–9. [Google Scholar] [CrossRef]
24. Matsui, S.; Goktug, G. Slit laser sensor guided real-time seam tracking arc welding robot system for non-uniform joint gaps. Proc. IEEE Int. Conf. Ind. Technol. 2002, 1, 159–162. [Google Scholar]
  25. Olaf, C.; Jakub, J.; Suszynski, M. Programming of Industrial Robots Using the Recognition of Geometric Signs in Flexible Welding Process. Symmetry 2020, 12, 1429. [Google Scholar]
  26. Shah, H.N.M.; Sulaiman, M.; Shukor, A.Z.; Kamis, Z.; Rahman, A.A. Butt welding joints recognition and location identification by using local thresholding. Robot. Comput. Integr. Manuf. 2018, 51, 181–188. [Google Scholar] [CrossRef]
  27. Xinde, L.; Pei, L.; Omar, K.M.; Xiangheng, H.; Sam, G.S. A welding seam identification method based on cross-modal perception. Ind. Robot. Int. J. Robot. Res. Appl. 2019, 46, 453–459. [Google Scholar]
  28. Jakub, W.; Marcin, S. Optical scanner assisted robotic assembly. Assem. Autom. 2017, 37, 434–441. [Google Scholar]
  29. Marcin, S.; Jakub, W.; Jan, Z. No Clamp Robotic Assembly with Use of Point Cloud Data from Low-Cost Triangulation Scanner. Teh. Vjesn. Tech. Gaz. 2018, 25, 904–909. [Google Scholar]
  30. Zhou, G.H.; Xu, G.C.; Gu, X.P.; Liu, J.; Tian, Y.K.; Zhou, L. Simulation and experimental study on the quality evaluation of laser welds based on ultrasonic test. Int. J. Adv. Manuf. Technol. 2017, 93, 3897–3906. [Google Scholar] [CrossRef]
  31. Yang, G.W.; Yan, S.M.; Wang, Y.Z. V-Shaped Seam Tracking Based on Particle Filter with Histogram of Oriented Gradient. Chin. J. Lasers 2020, 47, 330–338. [Google Scholar]
  32. He, Y.S.; Yu, Z.H.; Li, J.; Yu, L.S.; Ma, G.H. Discerning Weld Seam Profiles from Strong Arc Background for the Robotic Automated Welding Process via Visual Attention Features. Chin. J. Mech. Eng. 2020, 33, 799–816. [Google Scholar] [CrossRef] [Green Version]
  33. Zou, Y.B.; Wang, Y.B.; Zhou, W.L.; Chen, X.Z. Real-time seam tracking control system based on line laser visions. Opt. Laser Technol. 2018, 103, 182–192. [Google Scholar] [CrossRef]
  34. Shao, W.J.; Liu, X.F.; Wu, Z.J. A robust weld seam detection method based on particle filter for laser welding by using a passive vision sensor. Int. J. Adv. Manuf. Technol. 2019, 104, 2971–2980. [Google Scholar] [CrossRef]
  35. Chen, W.J. Research on Seam Track Measuring System Based on Stripe Type Laser Sensor [Dissertation]; South China University of Technology: Guangzhou, China, 2018. [Google Scholar]
  36. Wang, X.Y.; Zhu, Z.M.; Zhou, F.Q.; Zhang, F.M. Complete calibration of a structured light stripe vision sensor through a single cylindrical target. Opt. Lasers Eng. 2020, 131, 106096. [Google Scholar] [CrossRef]
  37. Li, L.; Lin, B.Q.; Zou, Y.B. Study on Seam Tracking System Based on Stripe Type Laser Sensor and Welding Robot. Chin. J. Lasers 2015, 42, 34–41. [Google Scholar]
  38. Kidong, L.; Insung, H.; Young-Min, K.; Huijun, L.; Munjin, K.; Jiyoung, Y. Real-Time Weld Quality Prediction Using a Laser Vision Sensor in a Lap Fillet Joint during Gas Metal Arc Welding. Sensors 2020, 20, 1625–1641. [Google Scholar]
  39. Jawad, M.; Halis, A.; Essam, A. Welding seam profiling techniques based on active vision sensing for intelligent robotic welding. Int. J. Adv. Manuf. Technol. 2017, 88, 127–145. [Google Scholar]
  40. Qiao, G.F.; Sun, D.L.; Song, G.M. A Rapid Coordinate Transformation Method for Serial Robot Calibration System. J. Mech. Eng. 2020, 56, 1–8. [Google Scholar]
  41. Lei, T.; Huang, Y.; Wang, H.; Rong, Y.M. Automatic weld seam tracking of tube-to-tube sheet TIG welding robot with multiple sensors. J. Manuf. Process. 2020, 3, 47–52. [Google Scholar]
  42. Kovacevic, R.; Zhang, S.B.; Zhang, M.Y. Noncontact Ultrasonic Sensing for Seam Tracking in Arc Welding Processes. J. Manuf. Sci. Eng. 1998, 120, 600–608. [Google Scholar]
Figure 1. Diagram of seam tracking system.
Figure 2. Flowchart of the four-step method for (a) scanning; (b) filtering; (c) feature points extracting; and (d) path planning.
Figure 3. Classification of groove feature points.
Figure 4. Initial positioning of feature points for (a) the first type of feature points; and (b) all feature points.
Figure 5. Defects on the surface of the weldment.
Figure 6. Fitting results.
Figure 7. Relationship between two coordinate systems.
Figure 8. Laser sensor calibration for (a) base coordinates; and (b) sensor coordinates.
Figure 9. A prototype of the experimental system.
Figure 10. Two typical welding grooves for (a) straight line; and (b) curve.
Figure 11. Two typical welding grooves scanned by laser sensor: (a) straight line; (b) curve.
Figure 12. Experimental results of (a) straight line with initial positioning; (b) straight line with precise positioning; (c) curve with initial positioning; and (d) curve with precise positioning.
Table 1. Properties of discontinuous points of function.
| Discontinuous Point Type | Amplitude | First Derivative | Second Derivative |
|---|---|---|---|
| The first kind | Continuity | Step mutation | Extremum |
| The second kind | Continuity | Non-existent | / |
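The properties in Table 1 suggest a simple way to realize the initial positioning step: first-kind discontinuities (slope jumps at the groove corners) appear as extrema of the numerical second derivative of the scanned profile. The sketch below illustrates this idea on a synthetic V-groove; the function name, the top-k extremum selection, and the synthetic profile are our own illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def initial_feature_points(z, n_points=6):
    """Return candidate feature-point indices of a scanned profile z.

    Illustrative sketch: first-kind discontinuities (slope jumps)
    show up as extrema of the numerical second derivative, so we
    take the n_points samples with the largest |z''|.
    """
    d1 = np.gradient(z)   # first derivative (local slope)
    d2 = np.gradient(d1)  # second derivative (slope change)
    idx = np.argsort(np.abs(d2))[-n_points:]
    return np.sort(idx)

# Synthetic V-groove: flat - down-slope - up-slope - flat,
# with corners at samples 30, 50 and 70
x = np.arange(100, dtype=float)
z = np.piecewise(
    x,
    [x < 30, (x >= 30) & (x < 50), (x >= 50) & (x < 70), x >= 70],
    [0.0,
     lambda x: -(x - 30) * 0.5,
     lambda x: -10 + (x - 50) * 0.5,
     0.0],
)
print(initial_feature_points(z))  # indices cluster around the corners
```

In practice a peak-detection pass (rather than a plain top-k selection) would be used so that each corner contributes exactly one feature point.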
Table 2. Results of initial positioning.
| Feature Points | A | B | C | D | E | F |
|---|---|---|---|---|---|---|
| X/mm | −5.67 | −3.37 | −3.02 | 0.72 | 1.11 | 3.59 |
| Z/mm | −1.35 | 2.89 | 6.03 | 6.01 | 3.15 | −1.02 |
Table 3. Parameters of fitting straight line.
| Fitting Straight Line | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|
| SSE | 0.08 | 0.44 | 0.39 | 0.15 | 0.50 | 0.15 | 0.21 |
| R-squared | 0.85 | 0.99 | 0.95 | 0.87 | 0.97 | 0.99 | 0.81 |
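The SSE and R-squared figures of Table 3 quantify how well each least-squares line fits the groove side-wall samples during precise positioning. A minimal sketch of computing both metrics (the function name and the synthetic data are our own assumptions):

```python
import numpy as np

def fit_line_quality(x, z):
    """Least-squares line fit z ≈ a*x + b; return (a, b, SSE, R²)."""
    a, b = np.polyfit(x, z, 1)
    residuals = z - (a * x + b)
    sse = float(np.sum(residuals ** 2))       # sum of squared errors
    sst = float(np.sum((z - z.mean()) ** 2))  # total sum of squares
    r2 = 1.0 - sse / sst                      # coefficient of determination
    return a, b, sse, r2

# Noisy samples along one hypothetical groove side wall
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 50)
z = 1.2 * x + 0.3 + rng.normal(0.0, 0.05, x.size)
a, b, sse, r2 = fit_line_quality(x, z)
print(f"slope={a:.3f} intercept={b:.3f} SSE={sse:.3f} R2={r2:.3f}")
```

The intersections of adjacent fitted lines then give the precisely positioned feature points.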
Table 4. Results of precise positioning.
| Feature Points | A | B | C | D | E | F |
|---|---|---|---|---|---|---|
| X/mm | −5.73 | −3.31 | −3.04 | 0.78 | 1.10 | 3.76 |
| Z/mm | −1.39 | 3.07 | 5.98 | 5.99 | 3.22 | −1.18 |
Table 5. Groove parameters of weldment.
| Welding Type | Dimension/mm | Thickness/mm | Slope Angle/° | Blunt Edge/mm |
|---|---|---|---|---|
| Straight line | 100 × 60 | 8 | 45 | 2.5 |
| Curve | 130 × 70 | 10 | 60 | 3 |
Table 6. Error analysis results.
| Welding Type | Initial dx/mm | Initial dz/mm | Initial px/% | Initial pz/% | Precise dx/mm | Precise dz/mm | Precise px/% | Precise pz/% |
|---|---|---|---|---|---|---|---|---|
| Straight line | 0.628 | 0.214 | 6.688 | 2.665 | 0.387 | 0.230 | 4.121 | 2.864 |
| Curve | 0.736 | 0.185 | 7.838 | 2.304 | 0.429 | 0.251 | 4.569 | 3.126 |
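The abstract's reported error reductions of 38.38% and 41.71% follow directly from the X-direction deviations in Table 6, as the relative reduction from initial to precise positioning:

```python
def improvement(initial_dev, precise_dev):
    """Relative reduction of the mean tracking deviation, in percent."""
    return (initial_dev - precise_dev) / initial_dev * 100.0

# dx values from Table 6 (initial vs. precise positioning)
straight = improvement(0.628, 0.387)
curve = improvement(0.736, 0.429)
print(f"straight: {straight:.2f}%  curve: {curve:.2f}%")
# → straight: 38.38%  curve: 41.71%
```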

Share and Cite

MDPI and ACS Style

Zhang, G.; Zhang, Y.; Tuo, S.; Hou, Z.; Yang, W.; Xu, Z.; Wu, Y.; Yuan, H.; Shin, K. A Novel Seam Tracking Technique with a Four-Step Method and Experimental Investigation of Robotic Welding Oriented to Complex Welding Seam. Sensors 2021, 21, 3067. https://doi.org/10.3390/s21093067


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.

