Search results

1 – 10 of 79
Article
Publication date: 8 October 2018

Yanbiao Zou and Xiangzhi Chen

Abstract

Purpose

This paper aims to propose a hand–eye calibration method for an arc welding robot and a laser vision sensor using semidefinite programming (SDP).

Design/methodology/approach

The conversion relationship between the pixel coordinate system and the laser plane coordinate system is established on the basis of the mathematical model of three-dimensional measurement of the laser vision sensor. In addition, the conversion relationship between the arc welding robot coordinate system and the laser vision sensor measurement coordinate system is established on the basis of the hand–eye calibration model. Ordinary least squares (OLS) is used to calculate the rotation matrix, and SDP is used to identify the direction vectors of the rotation matrix and ensure their orthogonality.
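
The feasibility step can be illustrated with a small sketch (not the authors' code): an unconstrained least-squares estimate of the rotation is generally not orthogonal, so it is projected back onto the set of proper rotations. The paper performs this identification with SDP; the nearest-orthogonal-matrix projection via SVD below is a simpler stand-in, and all data in the example are made up.

```python
# Sketch (not the authors' code): OLS rotation estimate followed by a
# feasibility step. The paper enforces orthogonality with SDP; here the
# nearest proper rotation is recovered via SVD as a simpler stand-in.
import numpy as np

def ols_rotation(P, Q):
    """Unconstrained least-squares fit of R with Q ≈ R @ P (P, Q are 3xN)."""
    return Q @ P.T @ np.linalg.inv(P @ P.T)

def nearest_rotation(M):
    """Closest proper rotation matrix to M in the Frobenius norm."""
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

rng = np.random.default_rng(0)
P = rng.normal(size=(3, 50))                      # made-up calibration points
angle = np.pi / 6
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0.0, 0.0, 1.0]])
Q = R_true @ P + 0.01 * rng.normal(size=(3, 50))  # noisy observations
R_ols = ols_rotation(P, Q)                        # generally not orthogonal
R_hat = nearest_rotation(R_ols)                   # feasible: orthogonal, det = +1
print(np.allclose(R_hat.T @ R_hat, np.eye(3)))
```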

Findings

The feasibility identification can reduce the calibration error, and ensure the orthogonality of the calibration results. More accurate calibration results can be obtained by combining OLS + SDP.

Originality/value

A set of advanced calibration methods is systematically established, covering parameter calibration of the laser vision sensor and hand–eye calibration between the robot and the sensor. For the hand–eye calibration, the physical feasibility problem of the rotation matrix is creatively put forward and solved through the SDP algorithm. The high-precision calibration results provide a good foundation for future research on seam tracking.

Details

Industrial Robot: An International Journal, vol. 45 no. 5
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 19 January 2015

Haixia Wang, Xiao Lu, Zhanyi Hu and Yuxia Li

Abstract

Purpose

The purpose of this paper is to present a fully automatic calibration method for a hand-eye serial robot system. The so-called "fully automatic" calibration means calibrating the robot body, the hand-eye relation and the measuring binocular system used, all at the same time.

Design/methodology/approach

The calibration is done by controlling the joints to rotate several times one by one in reverse order (i.e. from the last joint to the first), while the stereo camera system attached to the end-effector simultaneously takes pictures of the checkerboard patterns; the whole robot system can then be calibrated automatically from these captured images. In addition, a nonlinear optimization step is used to further refine the calibration results.
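
As a rough illustration of screw axis identification (an assumed setup, not the authors' algorithm), suppose the same checkerboard corners are reconstructed in 3D before and after rotating a single joint; fitting the rigid motion between the two point sets and extracting its rotation axis gives that joint's axis direction:

```python
# Sketch (assumed setup, not the authors' algorithm): identify one joint's
# screw axis from 3-D checkerboard points observed before (P) and after (Q)
# a pure rotation of that joint.
import numpy as np

def fit_rigid(P, Q):
    """Kabsch fit of R, t with Q ≈ R @ P + t (P and Q are 3xN point sets)."""
    p0, q0 = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((Q - q0) @ (P - p0).T)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    return R, (q0 - R @ p0).ravel()

def rotation_axis(R):
    """Unit rotation axis of R (valid away from 0 and 180 degree rotations)."""
    a = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return a / np.linalg.norm(a)

# usage: R, t = fit_rigid(P_before, P_after); axis = rotation_axis(R)
```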

Findings

The proposed method is essentially based on an improved screw axis identification method, and it needs only a mirror and some paper checkerboard patterns without resorting to any additional costly measuring instrument.

Originality/value

Simulations and real experiments on the MOTOMAN-UP6 robot system demonstrate the feasibility and effectiveness of the proposed method.

Details

Industrial Robot: An International Journal, vol. 42 no. 1
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 13 December 2017

Cengiz Deniz and Mustafa Cakir

Abstract

Purpose

This paper aims to introduce a simple hand-eye calibration method that can be easily applied with different objective functions.

Design/methodology/approach

The hand-eye calibration is solved by using the closed-form absolute orientation equations. Instead of processing all samples together, the proposed method goes through all minimal solution sets. The final result is chosen after evaluating the solution sets against arbitrary objectives. At this stage, outliers can optionally be excluded if more accuracy is desired.
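
A minimal sketch of this search over minimal solution sets, under assumed data structures (it is not the authors' implementation): each minimal subset is solved in closed form, every candidate is scored with a user-chosen objective over all samples, and the best one is kept; outliers could be trimmed at this stage.

```python
# Sketch of the minimal-solution-set idea (illustrative, not the authors' code):
# solve closed-form absolute orientation on every minimal subset of the
# samples, score each candidate with an arbitrary objective over all samples,
# and keep the best one.
from itertools import combinations
import numpy as np

def absolute_orientation(P, Q):
    """Closed-form (Kabsch/Horn-style) fit of R, t with Q ≈ R @ P + t."""
    p0, q0 = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((Q - q0) @ (P - p0).T)
    R = U @ np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))]) @ Vt
    return R, (q0 - R @ p0).ravel()

def best_minimal_solution(P, Q, objective, k=3):
    """P, Q: 3xN corresponding points; objective(R, t, P, Q) -> scalar cost."""
    best, best_cost = None, np.inf
    for idx in combinations(range(P.shape[1]), k):
        R, t = absolute_orientation(P[:, idx], Q[:, idx])  # minimal-set solve
        cost = objective(R, t, P, Q)                       # arbitrary objective
        if cost < best_cost:
            best, best_cost = (R, t), cost
    return best, best_cost

# one possible objective: mean residual over all samples
mean_residual = lambda R, t, P, Q: np.mean(
    np.linalg.norm(Q - (R @ P + t[:, None]), axis=0))
```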

Findings

The proposed method is very flexible and gives more accurate and convenient results than existing solutions. The mathematical error expression defined by the calibration equations may not be valid in practice, especially where systematic distortions are present. Simulations show that the solution yielding the least mathematical error may nevertheless produce incorrect, incompatible results once practical demands are taken into account.

Research limitations/implications

The performance of the calibration performed with the proposed method is compared with reference methods from the literature. When the back-projection error, which corresponds to point repeatability, is benchmarked, the proposed approach is the most successful of all the methods considered. Owing to its robustness, it was decided to perform tooling-sensor calibrations with the recommended method at the robotic non-destructive testing station in the Ford-OTOSAN Kocaeli Plant Body Shop Department.

Originality/value

Arranging the well-known AX = XB calibration equation in quaternion representation as Q_A = Q_X × Q_B × Q_X* (where Q_X* is the conjugate of Q_X) reveals another common spatial rotation equation. In this way, the absolute orientation solution satisfies the hand-eye calibration equations. The proposed solution has not been presented in the literature as a standalone hand-eye calibration method, although some researchers hint at related formulations.
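
The quaternion form can be checked numerically. The snippet below (an illustration with made-up rotations, using SciPy) verifies that the rotation part of A = X B X⁻¹ is reproduced by the quaternion sandwich product Q_A = Q_X ⊗ Q_B ⊗ Q_X*:

```python
# Numerical check (assumed example rotations, not from the paper) of the
# rotation part of AX = XB in quaternion form: R_A R_X = R_X R_B implies
# q_A = q_X ⊗ q_B ⊗ q_X*, i.e. q_B "sandwiched" by q_X.
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def qmul(p, q):
    """Hamilton product of quaternions given as [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def qconj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def as_wxyz(r):
    x, y, z, w = r.as_quat()          # SciPy stores quaternions as [x, y, z, w]
    return np.array([w, x, y, z])

R_X = Rot.from_euler("xyz", [10, 20, 30], degrees=True)   # hypothetical hand-eye rotation
R_B = Rot.from_euler("zyx", [5, -15, 40], degrees=True)   # hypothetical camera motion
R_A = R_X * R_B * R_X.inv()                               # corresponding robot motion

q_A = qmul(qmul(as_wxyz(R_X), as_wxyz(R_B)), qconj(as_wxyz(R_X)))
# q and -q represent the same rotation, so accept either sign
print(np.allclose(q_A, as_wxyz(R_A)) or np.allclose(q_A, -as_wxyz(R_A)))
```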

Details

Industrial Robot: An International Journal, vol. 45 no. 1
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 31 May 2023

Xu Jingbo, Li Qiaowei and White Bai

Abstract

Purpose

The purpose of this study is to solve the hand–eye calibration issue for a line structured light vision sensor. Only after hand–eye calibration can the sensor measurement data be applied to the robot system.

Design/methodology/approach

In this paper, hand–eye calibration methods are studied for both eye-in-hand and eye-to-hand configurations. First, the coordinates of the target point in the robot system are obtained via the tool centre point (TCP); then the robot is controlled so that the sensor measures the target point in multiple poses, and the measurement data and pose data are recorded; finally, the sum of squared calibration errors is minimized by the least squares method. Furthermore, the vector missing in the process of solving the transformation matrix is obtained by vector operations, yielding the complete matrix.
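
For the eye-in-hand case, the least-squares step might look like the following sketch (assumed variable names and parameterisation, not the authors' code): a fixed target point, located beforehand via the TCP, is measured by the sensor at several robot poses, and the hand-eye transform is chosen to minimise the squared mapping errors.

```python
# Sketch (assumed names and parameterisation, not the authors' code):
# eye-in-hand least squares for a line-structured-light sensor. p_base is the
# fixed target point in the robot base frame (found via the TCP), T_list are
# 4x4 base<-flange poses and m_list the sensor measurements of that point.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as Rot

def residuals(x, T_list, m_list, p_base):
    R, t = Rot.from_rotvec(x[:3]).as_matrix(), x[3:]   # hand-eye (flange<-sensor)
    res = []
    for T, m in zip(T_list, m_list):
        p_flange = R @ m + t                           # sensor point in flange frame
        p_robot = T[:3, :3] @ p_flange + T[:3, 3]      # mapped into the base frame
        res.append(p_robot - p_base)                   # calibration error
    return np.concatenate(res)

def calibrate_hand_eye(T_list, m_list, p_base):
    sol = least_squares(residuals, np.zeros(6), args=(T_list, m_list, p_base))
    return Rot.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```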

Findings

On this basis, the sensor measurement data can be easily and accurately converted to the robot coordinate system by matrix operation.

Originality/value

This method places no special requirements on robot pose control, and its calibration process is fast, efficient and highly precise, making it of practical value for wider adoption.

Details

Sensor Review, vol. 43 no. 4
Type: Research Article
ISSN: 0260-2288

Article
Publication date: 9 April 2021

Jinlei Zhuang, Ruifeng Li, Chuqing Cao, Yunfeng Gao, Ke Wang and Feiyang Wang

Abstract

Purpose

This paper aims to propose a measurement principle and a calibration method for a measurement system integrating a serial robot and a 3D camera, so that its parameters can be identified conveniently and high measurement accuracy achieved.

Design/methodology/approach

A stiffness and kinematic measurement principle of the integrated system is proposed, which considers the influence of robot weight and load weight on measurement accuracy. An error model is then derived based on the principle that the coordinates of the sphere center are invariant, which can simultaneously identify the joint stiffness, kinematic and hand-eye parameters. Further, considering the errors of the parameters to be calibrated and the measurement error of the 3D camera, a method for generating calibration observation data is proposed to validate both the calibration accuracy and the parameter identification accuracy of the calibration method.
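
The sphere-centre-invariance constraint alone can be written as a compact residual (a simplified stand-in that identifies only the hand-eye transform; the paper additionally identifies joint stiffness and kinematic parameters):

```python
# Simplified stand-in (not the authors' model): residuals expressing that the
# sphere centre, measured by the 3-D camera at several robot poses, must map
# to one and the same point in the robot base frame.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as Rot

def sphere_residuals(x, T_list, c_cam_list):
    R, t = Rot.from_rotvec(x[:3]).as_matrix(), x[3:]   # hand-eye transform only
    centres = np.array([T[:3, :3] @ (R @ c + t) + T[:3, 3]
                        for T, c in zip(T_list, c_cam_list)])
    return (centres - centres.mean(axis=0)).ravel()    # spread around the mean

def identify_hand_eye(T_list, c_cam_list):
    sol = least_squares(sphere_residuals, np.zeros(6), args=(T_list, c_cam_list))
    return Rot.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```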

Findings

Comparative simulations and experiments are conducted for the conventional kinematic calibration method and for the stiffness and kinematic calibration method proposed in this paper. The simulation results show that the proposed method is more accurate, and the identified values of the angle parameters in the modified Denavit and Hartenberg model are closer to their real values. Compared with the conventional calibration method in experiments, the proposed method decreases the maximum and mean errors by 19.9% and 13.4%, respectively.

Originality/value

A new measurement principle and a novel calibration method are proposed. The proposed method can simultaneously identify joint stiffness, kinematic and hand-eye parameters and obtain not only higher measurement accuracy but also higher parameter identification accuracy, which is suitable for on-site calibration.

Details

Industrial Robot: the international journal of robotics research and application, vol. 48 no. 4
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 20 October 2014

Hui Pan, Na Li Wang and Yin Shi Qin

Abstract

Purpose

The purpose of this paper is to propose a method that calibrates the hand-eye relationship for the eye-to-hand configuration, followed by a rectification step to improve the accuracy of the general calibration.

Design/methodology/approach

The hand-eye calibration of the eye-to-hand configuration is summarized as an equation AX = XB, the same form as in eye-in-hand calibration. A closed-form solution is derived. To abate the impact of noise, a rectification is conducted after the general calibration.
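
A common closed-form treatment of AX = XB, shown here as an assumed illustration rather than the paper's specific derivation, first aligns the rotation axes of the paired relative motions to obtain R_X and then solves a stacked linear system for t_X:

```python
# Illustrative closed-form sketch of AX = XB (a common axis-alignment approach,
# not necessarily the closed form derived in this paper).
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def solve_ax_xb(A_list, B_list):
    """A_list, B_list: matching 4x4 relative motions; returns 4x4 X."""
    # rotation axes of the relative motions satisfy alpha_i = R_X @ beta_i
    alphas = np.array([Rot.from_matrix(A[:3, :3]).as_rotvec() for A in A_list])
    betas = np.array([Rot.from_matrix(B[:3, :3]).as_rotvec() for B in B_list])
    U, _, Vt = np.linalg.svd(betas.T @ alphas)
    R_X = (U @ Vt).T
    if np.linalg.det(R_X) < 0:                 # enforce a proper rotation
        R_X = (U @ np.diag([1.0, 1.0, -1.0]) @ Vt).T
    # translation: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked over all pairs
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.hstack([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t_X = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X
```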

Findings

Simulation and actual experiments confirm that the accuracy of calibration is obviously improved.

Originality/value

Only a calibration plane is required for the hand-eye calibration. Taking the impact of noise into account, a rectification is carried out after the general calibration and, as a result, the accuracy is obviously improved. The method can be applied in many actual applications.

Details

Industrial Robot: An International Journal, vol. 41 no. 6
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 12 May 2020

Jing Bai, Yuchang Zhang, Xiansheng Qin, Zhanxi Wang and Chen Zheng

Abstract

Purpose

The purpose of this paper is to present a visual detection approach to predict the poses of target objects placed in arbitrary positions before completing the corresponding tasks in mobile robotic manufacturing systems.

Design/methodology/approach

A hybrid visual detection approach that combines monocular vision and laser ranging is proposed based on an eye-in-hand vision system. The laser displacement sensor is adopted to achieve normal alignment for an arbitrary plane and obtain depth information. The monocular camera measures the two-dimensional image information. In addition, a robot hand-eye relationship calibration method is presented in this paper.

Findings

First, a hybrid visual detection approach for mobile robotic manufacturing systems is proposed. This detection approach is based on an eye-in-hand vision system consisting of one monocular camera and three laser displacement sensors and it can achieve normal alignment for an arbitrary plane and spatial positioning of the workpiece. Second, based on this vision system, a robot hand-eye relationship calibration method is presented and it was successfully applied to a mobile robotic manufacturing system designed by the authors’ team. As a result, the relationship between the workpiece coordinate system and the end-effector coordinate system could be established accurately.
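
The normal-alignment idea can be sketched in a few lines (assumed geometry and values, not the authors' code): the three laser displacement sensors yield three 3D contact points on the workpiece plane, whose cross product gives the plane normal used to align the end-effector:

```python
# Minimal sketch (assumed geometry, not the authors' code): with three laser
# displacement sensors hitting an arbitrary plane, the three measured contact
# points define the plane, and its unit normal drives the normal alignment.
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three non-collinear 3-D points."""
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

# example: contact points measured (in the tool frame) by the three sensors
n = plane_normal(np.array([0.0, 0.0, 102.1]),
                 np.array([50.0, 0.0, 101.4]),
                 np.array([0.0, 50.0, 103.0]))
print(n)   # tilt of the workpiece plane relative to the tool axis
```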

Practical implications

This approach can quickly and accurately establish the relationship between the coordinate system of the workpiece and that of the end-effector. The normal alignment accuracy of the hand-eye vision system was less than 0.5° and the spatial positioning accuracy could reach 0.5 mm.

Originality/value

This approach can achieve normal alignment for arbitrary planes and spatial positioning of the workpiece and it can quickly establish the pose relationship between the workpiece and end-effector coordinate systems. Moreover, the proposed approach can significantly improve the work efficiency, flexibility and intelligence of mobile robotic manufacturing systems.

Details

Industrial Robot: the international journal of robotics research and application, vol. 47 no. 4
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 2 March 2012

Jwu‐Sheng Hu and Yung‐Jung Chang

Abstract

Purpose

The purpose of this paper is to propose a calibration method that can calibrate the relationships among the robot manipulator, the camera and the workspace.

Design/methodology/approach

The method uses a laser pointer rigidly mounted on the manipulator and projects the laser beam on the work plane. Nonlinear constraints governing the relationships of the geometrical parameters and measurement data are derived. The uniqueness of the solution is guaranteed when the camera is calibrated in advance. As a result, a decoupled multi‐stage closed‐form solution can be derived based on parallel line constraints, line/plane intersection and projective geometry. The closed‐form solution can be further refined by nonlinear optimization which considers all parameters simultaneously in the nonlinear model.
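
The line/plane-intersection building block mentioned above is straightforward; the sketch below (illustrative values only, not the authors' code) computes where the projected laser beam, modelled as a ray, meets the work plane:

```python
# Sketch of line/plane intersection (illustrative values only): the laser beam
# is a ray from point o along direction d, the work plane is n . x = c, and
# their intersection gives the 3-D position of the projected laser spot.
import numpy as np

def line_plane_intersection(o, d, n, c):
    """Intersection of the line o + s*d with the plane n . x = c."""
    s = (c - n @ o) / (n @ d)      # assumes the line is not parallel to the plane
    return o + s * d

spot = line_plane_intersection(o=np.array([0.0, 0.0, 0.5]),
                               d=np.array([0.1, 0.0, -1.0]),
                               n=np.array([0.0, 0.0, 1.0]), c=0.0)
print(spot)   # where the beam meets the work plane z = 0
```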

Findings

Computer simulations and experimental tests using actual data confirm the effectiveness of the proposed calibration method and illustrate its ability to work even when the eye cannot see the hand.

Originality/value

Only a laser pointer is required for this calibration method and this method can work without any manual measurement. In addition, this method can also be applied when the robot is not within the camera field of view.

Details

Industrial Robot: An International Journal, vol. 39 no. 2
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 20 December 2021

Ruolong Qi and Wenfeng Liang

Abstract

Purpose

Nuclear waste tanks need to be cut into pieces before they can be safely disposed of, but the cutting process produces a large amount of radioactive aerosols, which are very harmful to the operator's health. The purpose of this paper is to establish an intelligent strategy for an integrated robot designed for measurement and cutting, which can accurately identify and cut unknown nuclear waste tanks and realize autonomous precise processing.

Design/methodology/approach

A robot system integrating point cloud measurement and plasma cutting is designed in this paper. First, accurate calibration methods for the robot, tool and hand-eye system are established. Second, to eliminate the extremely scattered points caused by refraction at the metal surface, an omnidirectional octree data structure with 26 vectors is proposed to extract the point cloud model more accurately. Then, a minimum bounding box is calculated to limit the local area to be cut; the local three-dimensional shape of the nuclear tank is fitted within this bounding box, and the cutting trajectories and normal vectors are planned accurately within it.
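
As an illustration of the bounding-box step only (the 26-direction octree filter is omitted, and this is not the authors' code), a PCA-oriented bounding box of the filtered point cloud can delimit the local region in which the cutting trajectory is planned:

```python
# Sketch (assumed illustration, not the authors' code): PCA-oriented bounding
# box of a point cloud, used to limit the local region to be cut.
import numpy as np

def oriented_bounding_box(points):
    """points: Nx3 array. Returns (axes, centre, extents) of a PCA box."""
    mean = points.mean(axis=0)
    centred = points - mean
    _, _, Vt = np.linalg.svd(centred, full_matrices=False)  # rows = principal axes
    local = centred @ Vt.T                                   # coords along the axes
    lo, hi = local.min(axis=0), local.max(axis=0)
    centre = mean + ((lo + hi) / 2) @ Vt                     # box centre in world frame
    return Vt, centre, hi - lo
```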

Findings

The cutting precision is verified by replacing the tool with a dial indicator in both the simulation and the experiment. The octree data structure with omnidirectional pointing vectors can effectively improve the filtering accuracy of the scattered point cloud. The point cloud filtering algorithm, combined with the structure calibration methods for the integrated measurement and processing system, can ensure the final machining accuracy of the robot.

Originality/value

To address the problems of strong measurement noise, complex transformations between coordinate systems and the difficulty of guaranteeing accuracy, this paper proposes structure calibration, point cloud filtering and point cloud-based planning algorithms, which can greatly improve the reliability and accuracy of the system. Simulation and experiment verify the final cutting accuracy of the whole system.

Details

Industrial Robot: the international journal of robotics research and application, vol. 49 no. 4
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 26 January 2010

Shanchun Wei, Hongbo Ma, Tao Lin and Shanben Chen

Abstract

Purpose

Recognition of and guidance to the initial welding position (IWP) is one of the most important steps in the automatic welding process and a key technology for autonomous welding. The purpose of this paper is to advance an improved Harris algorithm and a grey-scale scanning method (GSCM) to raise the precision of image processing.

Design/methodology/approach

Through the configuration of "single camera and double positions," a new set of image processing algorithms is adopted to extract feature points using a rough-location-then-fine-extraction pattern, so as to reconstruct three-dimensional information and guide the robot to the IWP in a practical welding environment.
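
For the rough-location stage, a standard OpenCV Harris corner detector can serve as a stand-in for the paper's improved Harris algorithm and GSCM (the synthetic image and threshold below are assumptions for illustration):

```python
# Sketch of the corner-extraction step (standard Harris detector as a stand-in
# for the paper's improved Harris algorithm; image and threshold are assumed).
import cv2
import numpy as np

# synthetic test image: a small checkerboard pattern
tile = np.kron(np.indices((8, 8)).sum(axis=0) % 2,
               np.ones((32, 32))).astype(np.float32)

response = cv2.cornerHarris(tile, blockSize=2, ksize=3, k=0.04)
corners = np.argwhere(response > 0.01 * response.max())   # rough locations
print(len(corners), "candidate corner pixels")
# in the paper these rough locations are then refined ("subtle extraction")
# before reconstructing the 3-D coordinates of the initial welding position
```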

Findings

Experiments showed that the mean square errors (MSEs) in the X, Y and Z directions are 0.4491, 0.8178 and 1.4797 mm for the flat butt joint and 0.5398, 0.4861 and 1.1071 mm for the flat flange, respectively.

Research limitations/implications

The method has the limitation of providing guidance in only one step, and it would be more accurate if fractional steps were adopted.

Practical implications

Guidance experiments for IWPs on simulated parts of an oxidant tank are carried out; the success rate is up to 95 per cent and the MSEs are 0.7407, 0.7971 and 1.3429 mm, which meets the demands of a continuous, automatic welding process.

Originality/value

An improved Harris algorithm and GSCM are advanced to raise the precision of image processing, which influences guidance precision the most.

Details

Sensor Review, vol. 30 no. 1
Type: Research Article
ISSN: 0260-2288
