Search results
1 – 10 of over 1,000 results

Yanling Xu, Huanwei Yu, Jiyong Zhong, Tao Lin and Shanben Chen
Abstract
Purpose
The purpose of this paper is to analyze the technology of capturing and processing weld images in real time, which is very important to seam tracking and weld quality control during the robotic gas tungsten arc welding (GTAW) process.
Design/methodology/approach
By analyzing the main parameters affecting image capture, a passive vision sensor for a welding robot was designed to capture clear and steady welding images. Based on an analysis of the characteristics of the welding images, a new improved Canny algorithm was proposed to detect the edges of the seam and pool and to extract the seam and pool characteristic parameters. Finally, the image processing precision was verified by random welding experiments.
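The improved Canny variant itself is not detailed in the abstract. As a rough illustration of the edge-detection stage it builds on, the sketch below thresholds Sobel gradient magnitudes on a toy grey-level grid; all names and the threshold value are assumptions, and a full Canny pipeline additionally smooths, thins (non-maximum suppression) and applies hysteresis.

```python
import math

def sobel_edges(img, thresh=10.0):
    """Threshold the Sobel gradient magnitude of a 2-D grey-level grid.
    A rough stand-in sketch for the edge-detection stage only; the paper's
    improved Canny adds smoothing, non-maximum suppression and hysteresis,
    none of which is reproduced here."""
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal Sobel kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical Sobel kernel
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            edges[y][x] = 1 if math.hypot(gx, gy) >= thresh else 0
    return edges

# A vertical step edge (dark left half, bright right half): the two interior
# columns straddling the intensity jump are marked as edge pixels.
img = [[0, 0, 0, 9, 9, 9] for _ in range(5)]
edges = sobel_edges(img)
```

On the step-edge test image, only the columns adjacent to the jump produce a nonzero gradient, which is the raw signal a seam- or pool-edge extractor works from.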
Findings
It was found that seam and pool images can be clearly acquired by the passive vision system, and the welding image characteristic parameters were accurately extracted through processing. The experimental results show that the image processing precision can be controlled to within ±0.169 mm, which fully meets the requirement of real-time seam tracking for a welding robot.
Research limitations/implications
This system will be applied to industrial welding robot production during the GTAW process.
Originality/value
For teach-and-playback robots with passive vision, it is very important that real-time images of the seam and pool are acquired clearly and processed accurately during the robotic welding process, as this determines the subsequent seam track and the control of welding quality.
Chengdong Yang, Zhen Ye, Yuxi Chen, Jiyong Zhong and Shanben Chen
Abstract
Purpose
This paper aims to solve the problem that changes in groove size and assembly gap affect the precision of multi-pass path planning and the welding quality, and to realize the automatic welding of thick plate.
Design/methodology/approach
First, a double-sided double arc welding (DSAW) system with a self-designed passive vision sensor was established; then the image of the groove was captured, and the characteristic parameters of the groove were extracted by image processing. According to the welding parameters and the extracted geometry, multi-pass path planning was executed by the DSAW system.
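The abstract does not give the planning rule itself. As a hedged sketch of how a pass count could follow from the extracted groove geometry, the snippet below divides an idealized V-groove cross-section area by a nominal bead area; the formula, parameter names and bead area are illustrative assumptions, not the authors' planning method.

```python
import math

def plan_passes(groove_depth_mm, groove_angle_deg, gap_mm, bead_area_mm2=12.0):
    """Rough multi-pass planning sketch: approximate the V-groove
    cross-section (rectangular gap plus triangular flanks) and divide by a
    nominal deposited bead area to get the number of filler passes.
    All values are made-up assumptions."""
    half = math.radians(groove_angle_deg / 2.0)
    area = groove_depth_mm * gap_mm + groove_depth_mm ** 2 * math.tan(half)
    return max(1, math.ceil(area / bead_area_mm2))
```

For a 10 mm deep, 60° groove with a 2 mm gap this yields 7 passes; a shallow groove collapses to the single root pass.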
Findings
A DSAW system with a self-designed passive vision sensor was established, which can realize thick-plate welding with double-sided double arcs carried by two robots. A clear welding image of the groove was acquired, and an image processing algorithm was proposed to accurately extract the characteristic parameters of the groove. According to the welding parameters and the extracted geometry, multi-pass path planning can be executed by the DSAW system automatically.
Originality/value
Gas metal arc welding is used for the root and filler passes in DSAW. Multi-pass path planning for thick plate by DSAW based on a vision sensor was proposed.
Wang Zhang, Lizhe Fan, Yanbin Guo, Weihua Liu and Chao Ding
Abstract
Purpose
The purpose of this study is to establish a method for accurately extracting torch and seam features, which will improve the quality of narrow gap welding. An adaptive deflection correction system based on passive light vision sensors was designed using the HALCON software from MVTec (Germany) as a platform.
Design/methodology/approach
This paper proposes an adaptive correction system for welding guns and seams, divided into image calibration and feature extraction. In the image calibration step, field-of-view distortion caused by the camera's position is resolved using image calibration techniques. In the feature extraction step, clear features of the weld gun and weld seam are accurately extracted after processing with algorithms such as impact filtering, subpixel contours (XLD), Laplacian of Gaussian and sense region. The gun and weld seam centers are accurately fitted using least squares. After the deviation values are calculated, the error values are monitored and error correction is achieved through programmable logic controller (PLC) control. Finally, experimental verification and analysis of the tracking errors are carried out.
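The least-squares centre fitting can be written out by hand. Below, `fit_line` and `deviation` are hypothetical helpers (not the paper's implementation): one fits a centre line to extracted points by ordinary least squares, the other reports the lateral error between torch and seam lines that a PLC loop would drive to zero.

```python
def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b to (x, y) points; the same
    idea as fitting the torch and seam centre lines, in closed form."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def deviation(torch_pts, seam_pts, x):
    """Lateral deviation between the fitted torch and seam centre lines at
    position x; this is the error value the correction loop monitors."""
    a_t, b_t = fit_line(torch_pts)
    a_s, b_s = fit_line(seam_pts)
    return (a_t * x + b_t) - (a_s * x + b_s)
```

For exactly collinear points the fit recovers the line parameters; with noisy extracted features it returns the least-squares compromise.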
Findings
The results show that the system performs well in dealing with camera aberrations. Weld gun features can be effectively and accurately identified, and the difference between a scratch and a weld is effectively distinguished. The system accurately detects the center features of the torch and weld and controls the correction error to within 0.3 mm.
Originality/value
An adaptive correction system based on a passive light vision sensor is designed which corrects the field-of-view distortion caused by the camera’s position deviation. Differences in features between scratches and welds are distinguished, and image features are effectively extracted. The final system weld error is controlled to 0.3 mm.
Shanchun Wei, Meng Kong, Tao Lin and Shanben Chen
Abstract
Purpose
This paper aims to develop a method to achieve automatic robotic welding and seam tracking, so that a three-dimensional weld seam can be tracked without teaching and good weld formation can be accomplished.
Design/methodology/approach
An adaptive image processing method was used for various types of weld seam, and the relationship between welding height and the arc signal was calibrated. Through decomposition and synthesis, a three-dimensional space-type weld seam could be extracted and tracked well. A workpiece without teaching was finally tracked precisely and in a timely manner using the fuzzy controller.
Findings
Composite sensing technology including arc and visual sensing had obvious advantages. The image processing method could be used to track a planar weld seam efficiently, while arc sensing could characterize the welding height. Through the coupled control algorithm, arc sensing and visual sensing could be fused effectively.
Research limitations/implications
How to couple information more accurately and quickly was still one of the most important problems in composite sensing technology.
Practical implications
Composite sensing technology could reduce costs by achieving weld seam tracking without expensive devices such as laser sensors. Simulated parts of the scalloped segment of a rocket bottom board were tracked in the project. Once more adaptive algorithms are developed, more complicated practical workpieces can be dealt with in robotic welding, which promotes the application of industrial robots.
Originality/value
A useful method for three‐dimensional space type weld seam tracking without teaching was developed. The whole procedure of adaptive image processing method was simple but efficient and robust. The coupled controlling strategy addressed could accomplish seam tracking by composite sensing technology.
Li Juan Yang, Pei Huang Lou and Xiao Ming Qian
Abstract
Purpose
The main purpose of this paper is to develop a method to automatically recognize the initial welding position for large-diameter pipelines, and to introduce the image processing based on a pulse-coupled neural network (PCNN) adopted by the proposed method.
Design/methodology/approach
In this paper, a passive vision sensor is designed to capture weld seam images in real time. The proposed method contains two steps. The first step is to detect the rough position of the weld seam, and the second step is to recognize one of the solder joints from the local image and extract its centroid, which is regarded as the initial welding position. In each step, image segmentation and removal of small false regions based on PCNN are adopted to obtain the object regions; then, the traditional image processing theory is used for the subsequent processing.
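The paper's PCNN details are not given in the abstract. The sketch below is a heavily simplified pulse-coupled model in the same spirit: each pixel neuron fires when its stimulus, boosted by firing neighbours, exceeds a decaying threshold, which tends to make similar-intensity regions fire together. All parameter values are illustrative assumptions.

```python
import math

def pcnn_segment(img, beta=0.2, v_theta=20.0, alpha_theta=0.2, iters=6):
    """Greatly simplified pulse-coupled neural network on a 2-D grey grid.
    Each neuron's internal activity is its stimulus scaled by linking input
    from fired 4-neighbours; firing raises its threshold, idling lets the
    threshold decay. Returns the final binary firing map."""
    h, w = len(img), len(img[0])
    theta = [[255.0] * w for _ in range(h)]      # per-pixel firing thresholds
    fired = [[0] * w for _ in range(h)]
    for _ in range(iters):
        link = [[0.0] * w for _ in range(h)]     # linking input from neighbours
        for y in range(h):
            for x in range(w):
                link[y][x] = sum(fired[j][i]
                                 for j, i in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                                 if 0 <= j < h and 0 <= i < w)
        new_fired = [[0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                u = img[y][x] * (1.0 + beta * link[y][x])   # internal activity
                if u > theta[y][x]:
                    new_fired[y][x] = 1
                    theta[y][x] += v_theta                   # refractory boost
                else:
                    theta[y][x] *= math.exp(-alpha_theta)    # threshold decay
        fired = new_fired
    return fired
```

On a toy image with a bright block next to a dark column, the bright region synchronizes and fires while the dark pixels stay below threshold, giving a binary object mask like the one the first detection step needs.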
Findings
The experimental results show the feasibility and real-time performance of the proposed method. Based on vision sensing technology and PCNN, the system is able to achieve autonomous recognition of the initial welding position in large-diameter pipeline welding.
Practical implications
The proposed method can greatly shorten the time of positioning the initial welding position and satisfy the automatic welding for large-diameter pipeline.
Originality/value
In the proposed method, the image pre-processing is based on PCNN, which is more robust and flexible in the complex welding environment. After that, traditional image processing theory is adopted for the subsequent processing, of which the processing speed is faster.
Chetan Jalendra, B.K. Rout and Amol Marathe
Abstract
Purpose
Industrial robots are extensively deployed to perform repetitive and simple tasks at high speed to reduce production time and improve productivity. In most cases, a compliant gripper is used for assembly tasks such as peg-in-hole assembly. A compliant mechanism in the gripper introduces flexibility that may cause oscillation in the grasped object. Such a flexible gripper–object system can be considered as an under-actuated object held by the gripper and the oscillations can be attributed to transient disturbance of the robot itself. The commercially available robots do not have a control mechanism to reduce such induced vibration. Thus, this paper aims to propose a contactless vision-based approach for vibration suppression which uses a predictive vibrational amplitude error-based second-stage controller.
Design/methodology/approach
The proposed predictive vibrational amplitude error-based second-stage controller is a real-time vibration control strategy that uses predicted error to estimate the second-stage controller output. Based on controller output, input trajectories were estimated for the internal controller of the robot. The control strategy efficiently handles the system delay to execute the control input trajectories when the oscillating object is at an extreme position.
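The controller itself is not specified in the abstract. The toy helper below illustrates only the delay-handling idea: model the grasped object's oscillation as x(t) = A·cos(2πft + φ), find the next time it sits at an extreme (where the correction should act), and issue the control input early by the known system delay. The function name, signature and cosine model are assumptions.

```python
import math

def schedule_correction(freq_hz, phase_rad, now, delay_s):
    """Toy timing helper in the spirit of the paper's delay handling: find
    the next extreme of x(t) = A*cos(2*pi*f*t + phi) after time `now`, and
    return the instant at which the command must be issued so that, after
    the system delay, it takes effect exactly at that extreme."""
    w = 2.0 * math.pi * freq_hz
    # extremes of the cosine occur where w*t + phi is an integer multiple of pi
    k = math.floor((w * now + phase_rad) / math.pi) + 1
    t_extreme = (k * math.pi - phase_rad) / w
    return t_extreme - delay_s            # time to issue the control input
```

With a 2 Hz oscillation and a 50 ms total delay, a correction aimed at the extreme at t = 0.25 s must be issued at t = 0.20 s.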
Findings
The present controller works alongside the internal controller of the robot, without any interruption, to suppress the residual vibration of the object. To demonstrate the robustness of the proposed controller, an experimental implementation on an Asea Brown Boveri (ABB) IRB 1410 industrial robot with a low-frame-rate camera has been carried out. In this experiment, two objects have been considered that have a low (<2.38 Hz) and a high (>2.38 Hz) natural frequency. The proposed controller can suppress 95% of the vibration amplitude in less than 3 s and reduce the stabilization time by 90% for a peg-in-hole assembly task.
Originality/value
The present vibration control strategy uses a camera with a low frame rate (25 fps) and the delays are handled intelligently to favour suppression of high-frequency vibration. The mathematical model and the second-stage controller implemented suppress vibration without modifying the robot dynamical model and the internal controller.
Chen Chen, Tingyang Chen, Zhenhua Cai, Chunnian Zeng and Xiaoyue Jin
Abstract
Purpose
The traditional vision system cannot automatically adjust the feature point extraction method according to the type of welding seam. In addition, the robot cannot self-correct the laying position error or machining error. To solve this problem, this paper aims to propose a hierarchical visual model to achieve automatic arc welding guidance.
Design/methodology/approach
The hierarchical visual model proposed in this paper is divided into two layers: a welding seam classification layer and a feature point extraction layer. In the welding seam classification layer, the SegNet network model is trained to identify the welding seam type, and the prediction mask is obtained to segment the corresponding point clouds. In the feature point extraction layer, the scanning path is determined by the point cloud obtained from the upper layer to correct laying position error. The feature point extraction method is automatically selected based on the type of welding seam to correct machining error, and a corresponding specific method to extract the feature points for each type of welding seam is proposed. The proposed visual model is experimentally validated, and the feature point extraction results as well as the seam tracking error are finally analyzed.
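The two-layer structure (classify the seam type, then dispatch a seam-specific extractor) can be sketched with a simple dispatch table. Both extractors and the (y, z) profile representation below are hypothetical stand-ins, not the paper's point-cloud methods.

```python
def extract_butt(profile):
    """Hypothetical extractor: feature point of a butt seam taken as the
    deepest point of a scanned profile given as (y, z) pairs."""
    return min(profile, key=lambda p: p[1])

def extract_lap(profile):
    """Hypothetical extractor: feature point of a lap seam taken at the
    largest height step between neighbouring profile points."""
    i = max(range(1, len(profile)),
            key=lambda k: abs(profile[k][1] - profile[k - 1][1]))
    return profile[i]

# The classification layer's predicted label selects the extraction method,
# mirroring the hierarchical structure described above.
EXTRACTORS = {"butt": extract_butt, "lap": extract_lap}

def extract_feature(seam_type, profile):
    return EXTRACTORS[seam_type](profile)
```

Adding a new seam type then only requires registering one more extractor, which is the practical benefit of the hierarchical design.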
Findings
The experimental results show that the algorithm accomplishes welding seam classification, feature point extraction and seam tracking with high precision. The prediction mask accuracy is above 90% for the three types of welding seam, and the proposed feature point extraction method achieves sub-pixel feature extraction for each type. Across the three types of welding seam, the maximum seam tracking error is 0.33–0.41 mm, and the average seam tracking error is 0.11–0.22 mm.
Originality/value
The main innovation of this paper is that a hierarchical visual model for robotic arc welding is proposed, which is suitable for various types of welding seam. The proposed visual model well achieves welding seam classification, feature point extraction and error correction, which improves the automation level of robot welding.
Abstract
Purpose
The control of weld penetration in gas tungsten arc welding (GTAW) is required for a “teach and playback” robot to overcome the gap variation in the welding process. This paper aims to investigate this subject.
Design/methodology/approach
This paper presents a robotic system based on the real‐time vision measurement. The primary objective has been to demonstrate the feasibility of using vision‐based image processing to measure the seam gap in real‐time and adjust welding current and wire‐feed rate to realize the penetration control during the robot‐welding process.
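The paper's actual rule comes from a knowledge base; as a hedged illustration of the idea of adapting welding current and wire-feed rate to the measured seam gap, here is a toy linear law. Gains and baseline values are made-up assumptions.

```python
def adjust_parameters(gap_mm, base_current_a=180.0, base_feed_mm_s=20.0,
                      k_i=15.0, k_f=4.0):
    """Illustrative linear penetration-control law: as the vision-measured
    seam gap grows, reduce the welding current (to avoid burn-through) and
    increase the wire-feed rate (to fill the wider gap). Not the paper's
    knowledge-base rule; all constants are placeholders."""
    current = base_current_a - k_i * gap_mm
    feed = base_feed_mm_s + k_f * gap_mm
    return current, feed
```

At zero gap the baselines are returned unchanged; a 2 mm gap trades 30 A of current for 8 mm/s of extra wire feed under these placeholder gains.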
Findings
The paper finds that vision‐based measurement of the seam gap can be used in the welding robot, in real‐time, to control weld penetration. It helps the “teach and playback” robot to adjust the welding procedures according to the gap variation.
Research limitations/implications
The system requires that the seam edges can be accurately identified using a correlation method.
Practical implications
The system is applicable to the welding of rocket storage tanks.
Originality/value
The control algorithm based on the knowledge base has been set up for continuous GTAW. A novel visual image analysis method has been developed in the study for a welding robot.
Abstract
Purpose
This paper aims to propose a hand–eye calibration method of arc welding robot and laser vision sensor by using semidefinite programming (SDP).
Design/methodology/approach
The conversion relationship between the pixel coordinate system and the laser plane coordinate system is established on the basis of the mathematical model of three-dimensional measurement of the laser vision sensor. In addition, the conversion relationship between the arc welding robot coordinate system and the laser vision sensor measurement coordinate system is established on the basis of the hand–eye calibration model. Ordinary least squares (OLS) is used to calculate the rotation matrix, and SDP is used to identify the direction vectors of the rotation matrix to ensure their orthogonality.
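The SDP orthogonality step needs a dedicated solver, so as an illustrative stand-in for the same constraint, the snippet below projects a noisy OLS estimate onto the nearest proper rotation using the closed-form polar decomposition in 2-D. This shows the underlying idea (an unconstrained least-squares estimate is generally not orthogonal and must be projected back), not the authors' SDP method.

```python
import math

def nearest_rotation_2d(m):
    """Project an estimated 2x2 matrix onto the nearest proper rotation.
    For R(t) = [[cos t, -sin t], [sin t, cos t]], trace(R^T M) =
    (a+d)*cos t + (c-b)*sin t, maximized at t = atan2(c-b, a+d).
    A 2-D illustrative stand-in for the paper's SDP orthogonality step."""
    (a, b), (c, d) = m
    t = math.atan2(c - b, a + d)
    return [[math.cos(t), -math.sin(t)],
            [math.sin(t),  math.cos(t)]]

# An exact rotation is a fixed point; a noisy estimate is snapped back to
# a perfectly orthogonal matrix.
r = nearest_rotation_2d([[math.cos(0.3), -math.sin(0.3)],
                         [math.sin(0.3),  math.cos(0.3)]])
r_noisy = nearest_rotation_2d([[1.0, -0.1], [0.1, 1.0]])
```

In 3-D the same projection is usually done via SVD; the SDP formulation in the paper additionally handles the feasibility constraints during identification rather than as a post-hoc projection.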
Findings
The feasibility identification can reduce the calibration error, and ensure the orthogonality of the calibration results. More accurate calibration results can be obtained by combining OLS + SDP.
Originality/value
A set of advanced calibration methods is systematically established, including parameter calibration of the laser vision sensor and hand–eye calibration of the robot and sensor. For the hand–eye calibration, the physical feasibility problem of the rotation matrix is creatively put forward and solved through an SDP algorithm. High-precision calibration results provide a good foundation for future research on seam tracking.
Jian Le, Hua Zhang and Jin-wen Li
Abstract
Purpose
This study aims to improve welding quality and efficiency by designing an algorithm to track space-curved fillet weld joints.
Design/methodology/approach
Fillet weld joint tracking based on the coordinated movement of the two wheels and the horizontal slider has been studied. Pattern recognition is used to identify the height deviation, and the accuracy of recognizing height deviations has been analyzed. A proportional control algorithm is used to control the movement of the vertical and horizontal sliders, so fillet weld joint tracking in the height direction is achieved. Based on the coordinated movement of the wheels and the vertical and horizontal sliders, an algorithm for tracking space-curved fillet weld joints has been researched.
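The proportional law can be sketched in a few lines: each slider moves by an amount proportional to the recognised deviation, with opposite sign so the torch is driven back onto the joint. Gains, signs and units are illustrative placeholders, not the paper's tuned values.

```python
def slider_step(height_dev_mm, lateral_dev_mm, kp_v=0.5, kp_h=0.5):
    """One proportional-control step for the vertical and horizontal
    sliders: command a motion proportional and opposite to the recognised
    height and lateral deviations. Gains are placeholder assumptions."""
    return -kp_v * height_dev_mm, -kp_h * lateral_dev_mm
```

A torch 2 mm too high and 1 mm left of the joint would be commanded 1 mm down and 0.5 mm right under these gains; the wheels supply the forward motion that turns these steps into space-curve tracking.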
Findings
Some experiments have been done, and experimental results show that the welding robot can track space-curved fillet weld joints with high accuracy and good reliability.
Research limitations/implications
The welding robot can improve the welding quality and efficiency.
Practical implications
The welding robot can track fillet weld joints in ship panels, and it was shown that the welding robot could track space-curved fillet weld joints with high accuracy and good reliability.
Social implications
The welding robot has many industrial and social applications.
Originality/value
There are various forms of fillet weld joints in industry, and the fillet weld is curved in space. Experimental results show that the welding robot can track space-curved fillet weld joints with good stability and high precision.