Search results
1 – 10 of over 8,000 results

Xiaojun Wu, Bo Liu, Peng Li and Yunhui Liu
Abstract
Purpose
Existing calibration methods mainly focus on camera laser-plane calibration for a single laser-line length, which is inconvenient and cannot guarantee consistent results when several three-dimensional (3D) scanners are involved. Thus, this study aims to provide a unified procedure for calibrating laser profile measurement (LPM) systems with different laser-line lengths.
Design/methodology/approach
3D LPM is the process of converting physical objects into 3D digital models, wherein camera laser-plane calibration is critical for ensuring system precision. Conventional calibration methods for 3D LPM typically use a calibration target to calibrate the system for a single laser-line length, which requires multiple calibration patterns and complicates the procedure. In this paper, a unified calibration method is proposed to automatically calibrate the camera laser-plane parameters of LPM systems with different laser-line lengths. The authors designed an elaborate planar calibration target with different-sized rings, mounted on a motorized linear platform, to calculate the laser-plane parameters of the LPM systems. The camera coordinates of the control points are then obtained from the intersection line between the laser line and the planar target. A newly proposed error correction model corrects the errors caused by hardware assembly. To validate the proposed method, three LPM devices with different laser-line lengths were tested. Experimental results show that the proposed method can conveniently calibrate LPM systems with different laser-line lengths using standard steps.
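The triangulation that such a calibration ultimately enables can be sketched as follows: once the laser plane is known in camera coordinates, each pixel on the laser line back-projects to a ray that is intersected with the plane. This is a minimal illustration, not the authors' implementation; the intrinsic matrix and plane parameters below are made up.

```python
import numpy as np

def triangulate_on_laser_plane(pixel, K, n, d):
    """Intersect the camera ray through `pixel` with the laser plane n.X = d.

    K : 3x3 camera intrinsic matrix
    n : unit normal of the laser plane in camera coordinates
    d : plane offset, so points X on the plane satisfy n @ X = d
    """
    # Back-project the pixel to a ray direction in camera coordinates.
    ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    # The ray is X = s * ray; substitute into n.X = d and solve for s.
    s = d / (n @ ray)
    return s * ray

# Example: a simple pinhole camera and a tilted laser plane (made-up values).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
n = np.array([0.0, np.sin(0.3), np.cos(0.3)])  # laser-plane normal
d = 0.5                                        # plane offset (metres)

X = triangulate_on_laser_plane((400.0, 260.0), K, n, d)
# The recovered 3D point lies on the laser plane by construction.
assert abs(n @ X - d) < 1e-9
```

Calibration, in this picture, is exactly the problem of estimating `n` and `d` (plus the camera intrinsics) accurately for each laser-line length.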
Findings
The repeatability and accuracy of the proposed calibration prototypes were evaluated with high-precision workpieces. The experiments have shown that the proposed method is highly adaptive and can automatically calibrate the LPM system with different laser-line lengths with high accuracy.
Research limitations/implications
In the repeatability experiments, there were errors in the measured heights of the test workpieces; this is because each laser emitter has an optimal working distance and laser-line length.
Practical implications
By using the proposed method and device, the calibration of a 3D scanning laser device can be performed automatically.
Social implications
The calibration efficiency of a laser camera device is increased.
Originality/value
The authors proposed a unified calibration method for LPM systems with different laser-line lengths that consists of a motorized linear joint and a calibration target with elaborately designed ring patterns, realizing automatic parameter calibration.
Abstract
Purpose
This paper aims to propose a hand–eye calibration method of arc welding robot and laser vision sensor by using semidefinite programming (SDP).
Design/methodology/approach
The conversion relationship between the pixel coordinate system and the laser-plane coordinate system is established on the basis of the mathematical model of three-dimensional measurement with a laser vision sensor. The conversion relationship between the arc welding robot coordinate system and the laser vision sensor measurement coordinate system is likewise established on the basis of the hand–eye calibration model. Ordinary least squares (OLS) is used to calculate the rotation matrix, and SDP is used to identify the direction vectors of the rotation matrix to ensure their orthogonality.
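The SDP step itself needs a semidefinite solver, but its effect, pulling a noisy least-squares rotation estimate back onto a valid rotation matrix, can be illustrated with the SVD-based orthogonal Procrustes projection. This is a common alternative to the paper's SDP formulation, not the authors' method; the noisy input below is synthetic.

```python
import numpy as np

def nearest_rotation(M):
    """Project a noisy 3x3 matrix onto the closest proper rotation (SO(3)).

    SVD-based orthogonal Procrustes projection; the paper enforces the
    same orthogonality constraint via SDP instead.
    """
    U, _, Vt = np.linalg.svd(M)
    R = U @ Vt
    # Flip the last singular direction if needed to keep det(R) = +1.
    if np.linalg.det(R) < 0:
        R = U @ np.diag([1.0, 1.0, -1.0]) @ Vt
    return R

# A ground-truth rotation corrupted by noise, as an OLS estimate might be.
rng = np.random.default_rng(0)
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
R_noisy = R_true + 0.01 * rng.standard_normal((3, 3))

R = nearest_rotation(R_noisy)
assert np.allclose(R @ R.T, np.eye(3), atol=1e-12)   # orthogonal
assert np.isclose(np.linalg.det(R), 1.0)             # proper rotation
```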
Findings
The feasibility identification reduces the calibration error and ensures the orthogonality of the calibration results. More accurate calibration results are obtained by combining OLS and SDP.
Originality/value
A set of advanced calibration methods is systematically established, covering parameter calibration of the laser vision sensor and hand–eye calibration of robots and sensors. For the hand–eye calibration, the physical feasibility problem of the rotation matrix is newly formulated and solved through the SDP algorithm. High-precision calibration results provide a good foundation for future research on seam tracking.
Abstract
Examines the problem of robot and fixture calibration from the perspective of simulation and off-line programming. Looks at the two basic methods of measuring robot position, optical systems and cable-driven systems, and describes examples of both. The Workspace PC-based robot simulation system and the RoboTrak three-cable measuring system for calibration are used as examples and compared with other commercial systems, and a calibration case study is presented. Concludes that if the accuracy required by a robot application is of the order of 1 mm and the robot program is to be generated by an off-line software package, then it is necessary to calibrate the robot first.
Yanwu Zhai, Haibo Feng and Yili Fu
Abstract
Purpose
This paper aims to present a pipeline that progressively deals with the online external parameter calibration and estimator initialization of a stereo-inertial measurement unit (IMU) system, which does not require any prior information and is suitable for system initialization in a variety of environments.
Design/methodology/approach
Before calibration and initialization, a modified stereo tracking method is adopted to obtain motion poses, which provide the prerequisites for the next three steps. First, the authors align the poses with the IMU measurements and linearly calculate rough external parameters and the gravity vector to provide initial values for the subsequent optimization. Second, the authors fix the poses obtained by vision and recover the external and inertial parameters of the system by optimizing the IMU pre-integration. Third, the result of the previous step is used in a visual-inertial joint optimization to further refine the external and inertial parameters.
Findings
Results on public data sets and in real-world experiments show that this method achieves better accuracy and robustness than the state of the art.
Originality/value
This method improves the accuracy of external parameter calibration and initialization and prevents the system from falling into a local minimum. Unlike traditional methods that solve the inertial navigation parameters separately, this paper solves all inertial navigation parameters at once, using the result of each step as the seed for the next optimization and refining the parameters from coarse to fine. This avoids local minima, reduces the number of iterations during optimization and improves the efficiency of the system.
Dan Zhang, Junji Yuan, Haibin Meng, Wei Wang, Rui He and Sen Li
Abstract
Purpose
In the context of fire incidents within buildings, efficient scene perception by firefighting robots is particularly crucial. Although individual sensors can provide specific types of data, achieving deep data correlation among multiple sensors poses challenges. To address this issue, this study aims to explore a fusion approach integrating thermal imaging cameras and LiDAR sensors to enhance the perception capabilities of firefighting robots in fire environments.
Design/methodology/approach
Prior to sensor fusion, accurate calibration of the sensors is essential. This paper proposes an extrinsic calibration method based on rigid body transformation. The collected data are optimized using the Ceres optimization algorithm to obtain precise calibration parameters. Building upon this calibration, a sensor fusion method based on coordinate projection transformation is proposed, enabling real-time mapping between images and point clouds. In addition, the data collection of the proposed fusion device is validated in experimental smoke-filled fire environments.
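The coordinate projection transformation described, mapping LiDAR points into the thermal image through the calibrated extrinsics, can be sketched as follows. This is a minimal pinhole-model illustration with made-up intrinsic and extrinsic values, not the authors' implementation.

```python
import numpy as np

def project_lidar_to_image(points, R, t, K):
    """Map LiDAR points into pixel coordinates of the thermal camera.

    points : (N, 3) LiDAR points
    R, t   : extrinsic rotation/translation, LiDAR frame -> camera frame
    K      : 3x3 thermal-camera intrinsic matrix
    """
    cam = points @ R.T + t          # rigid-body transform into camera frame
    uv = cam @ K.T                  # apply camera intrinsics
    return uv[:, :2] / uv[:, 2:3]   # perspective divide -> pixel coordinates

# Illustrative (made-up) calibration values.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.05, 0.0, 0.0])     # camera 5 cm to the side of the LiDAR

pts = np.array([[1.0, 0.2, 4.0]])  # one LiDAR point, 4 m ahead
uv = project_lidar_to_image(pts, R, t, K)
# -> [[477.5, 270.0]]
```

With this mapping, each projected point can be coloured by the thermal pixel it lands on, which is the essence of the image/point-cloud fusion described above.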
Findings
The average reprojection error obtained by the extrinsic calibration method based on rigid body transformation is 1.02 pixels, indicating good accuracy. The fused data combines the advantages of thermal imaging cameras and LiDAR, overcoming the limitations of individual sensors.
Originality/value
This paper introduces an extrinsic calibration method based on rigid body transformation, along with a sensor fusion approach based on coordinate projection transformation. The effectiveness of this fusion strategy is validated in simulated fire environments.
Hui Pan, Na Li Wang and Yin Shi Qin
Abstract
Purpose
The purpose of this paper is to propose a method that calibrates the hand-eye relationship for the eye-to-hand configuration, followed by a rectification step that improves the accuracy of the general calibration.
Design/methodology/approach
The hand-eye calibration of the eye-to-hand configuration is formulated as the equation AX = XB, the same form as in eye-in-hand calibration. A closed-form solution is derived. To reduce the impact of noise, a rectification is conducted after the general calibration.
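As a rough illustration of solving AX = XB (not the paper's particular closed-form derivation), one common approach recovers the rotation by aligning the rotation axes of the A and B motions via Procrustes, then solves a stacked linear system for the translation. The transforms below are synthetic.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K

def rot_log(R):
    """Axis-angle vector of a rotation matrix (inverse of Rodrigues)."""
    angle = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return angle / (2 * np.sin(angle)) * w

def solve_ax_xb(As, Bs):
    """Closed-form AX = XB: rotation by Procrustes on the motion axes,
    then translation from (R_A - I) t_X = R_X t_B - t_A by least squares."""
    M = sum(np.outer(rot_log(A[:3, :3]), rot_log(B[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(M)
    Rx = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    dvec = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(C, dvec, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X

# Synthetic check: build B_i = X^-1 A_i X from a known hand-eye transform X.
X_true = np.eye(4)
X_true[:3, :3] = rodrigues([1, 2, 3], 0.7)
X_true[:3, 3] = [0.1, -0.2, 0.3]

rng = np.random.default_rng(1)
As, Bs = [], []
for _ in range(4):
    A = np.eye(4)
    A[:3, :3] = rodrigues(rng.standard_normal(3), 0.5 + rng.random())
    A[:3, 3] = rng.standard_normal(3)
    As.append(A)
    Bs.append(np.linalg.inv(X_true) @ A @ X_true)

X = solve_ax_xb(As, Bs)
assert np.allclose(X, X_true, atol=1e-6)
```

At least two motions with non-parallel rotation axes are needed for the rotation to be uniquely determined, which is why several poses are collected in practice.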
Findings
Simulation and actual experiments confirm that the accuracy of calibration is obviously improved.
Originality/value
Only a calibration plane is required for the hand-eye calibration. Taking the impact of noise into account, a rectification is carried out after the general calibration, and as a result the accuracy is markedly improved. The method can be applied in many practical applications.
Mustafa Cakir and Cengiz Deniz
Abstract
Purpose
The purpose of this study is to present a novel method for industrial robot TCP (tool center point) calibration. The proposed method offers fully automated robot TCP calibration within a defined cycle time. The method is applicable for large-scale installations due to its zero cost for each robot.
Design/methodology/approach
Precise and expensive measuring equipment or specially designed reference devices are usually required for robot calibration. In this method, calibration can be performed using only one plane plate, and the calibration procedure is defined step by step: the robot moves to the target plane position, the TCP touches the plane and the actual robot configuration is recorded; then the robot moves back into position and the same step is repeated for a new sample. Alternatively, the robot can remain stationary and the plane can be moved towards the robot TCP. The TCP is calculated by processing the differences between the contact points recorded at different positions. The process is fully automated, no special equipment is used, the calculations are very simple and the method can easily be realized on the robot controller.
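The plane-equation idea can be sketched as follows: with the TCP touching a plane n·x = d at several flange orientations, each contact gives one linear equation in the unknown TCP offset and plane offset. This is an illustrative reconstruction under simplifying assumptions (a single plane with known normal, synthetic poses), not the authors' exact formulation.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K

def tcp_from_plane_contacts(flange_poses, n):
    """Estimate the TCP offset t (flange frame) from touches on one plane.

    Each contact point R_i @ t + p_i lies on the plane n.x = d, which is
    linear in the unknowns (t, d):  (n^T R_i) t - d = -n^T p_i.
    """
    A = np.array([np.append(n @ R, -1.0) for R, p in flange_poses])
    b = np.array([-(n @ p) for R, p in flange_poses])
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3], sol[3]          # TCP offset t, plane offset d

# Synthetic check with a known TCP and a horizontal plane z = 0.2.
t_true = np.array([0.01, -0.02, 0.15])
n, d = np.array([0.0, 0.0, 1.0]), 0.2

rng = np.random.default_rng(2)
poses = []
for _ in range(6):
    R = rodrigues(rng.standard_normal(3), rng.uniform(0.2, 1.0))
    p = rng.standard_normal(3)
    p[2] = d - (R @ t_true)[2]      # place the TCP exactly on the plane
    poses.append((R, p))

t_est, d_est = tcp_from_plane_contacts(poses, n)
assert np.allclose(t_est, t_true, atol=1e-8)
```

Varying the flange orientation between touches is what makes the system full rank; touches with identical orientation add no new information about the TCP offset.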
Findings
The conventional manual robot TCP calibration process takes about 15 min, and longer when high accuracy is required. The proposed method reduces this time to less than 3 min without operator support. Practical tests have shown that TCP calibration can be performed with 0.1-0.6 mm accuracy. The solution is an automated process, requires no special installation and has approximately zero cost. For this reason, this study recommends using the proposed solution widely in facilities housing anywhere from one to hundreds of robots.
Research limitations/implications
In this study, the data were taken directly from the robot controller without using any special measuring equipment. The industrial robot used in the tests has no absolute calibration. The classical “four-point method” was used for the reference TCP data. It is accepted from the outset that this process, even when conducted with extreme care using a needle-tipped tool, will not produce exact values. It was observed that the deviation of the TCP from a fixed point in reorientation motions was not more than 0.5 mm. The method has been validated for different tool bits. Pilot work for different robot applications at the Ford Otosan Gölcük Plant has been completed and dissemination has started.
Originality/value
Although the approach used is clear and simple, it is surprising that the calculation of the TCP using plane equations has so far not been mentioned in the literature. The disadvantage of using either a fixed point or a sphere as a reference is that the TCP cannot be automatically guided to the target. This problem was overcome by using a larger target plane plate, and the process was fully automated. The proposed method can be widely used in practical applications.
Xin Ye, Jun Gao, Zhijing Zhang, Chao Shao and Guangyuan Shao
Abstract
Purpose
The purpose of this paper is to propose a sub-pixel calibration method for a microassembly system with coaxial alignment function (MSCA), because traditional sub-pixel calibration approaches cannot be used in this system.
Design/methodology/approach
The in-house microassembly system comprises a six-degrees-of-freedom (6-DOF) large-motion serial robot with microgrippers, a hexapod 6-DOF precision alignment worktable and a vision system whose microscope optical axis is parallel to the horizontal plane. A prism with a special coating is fixed in front of the objective lens; thus, images of two parts, namely the target part and the base part, can be acquired simultaneously. The relative discrepancy between the two parts can then be calculated from image-plane coordinates instead of computing a space transformation matrix, so the traditional calibration method cannot be applied to this microassembly system. An improved calibration method, including checkerboard corner detection, solves the distortion coefficients inversely and detects corners with sub-pixel accuracy. Experiments prove that the assembly accuracy of the coaxial microassembly system calibrated by the new method can reach the micrometer level.
Findings
The calibration results indicate that solving the distortion inversely can improve the assembly accuracy of the MSCA.
Originality/value
The paper provides calibration guidelines for two-dimensional or 2.5-dimensional devices, such as microelectromechanical systems (MEMS) devices, using the MSCA.
Nicolas Andreff, Pierre Renaud, Philippe Martinet and François Pierrot
Abstract
Presents the kinematic calibration of an H4 parallel prototype robot using a vision-based measuring device. Calibration is performed according to the inverse kinematic model method, using first the design model and then a model developed for calibration purposes. To do so, the end-effector pose (i.e. position and orientation) has to be measured with the utmost accuracy. First, the practical accuracy of the low-cost vision-based measuring system is evaluated, yielding a precision on the order of 10 μm and 10⁻³° for a 1,024 × 768 pixel CCD camera. Second, the prototype is calibrated using the easy-to-install vision system, reducing the final positioning accuracy of the end-effector from more than 1 cm to less than 0.5 mm. A discussion on the use of such a method on commercial systems is also provided.
Mingjun Zhang, Weimin Tao, William Fisher and Tzyh‐Jong Tarn
Abstract
Purpose
For semiconductor and gene‐chip microarray fabrication, robots are widely used to handle workpieces. It is critical that robots can calibrate themselves regularly and estimate workpiece pose automatically. This paper proposes an industrial method for automatic robot calibration and workpiece pose estimation.
Design/methodology/approach
The methods have been implemented using an air‐pressure sensor and a laser sensor.
Findings
Experimental results obtained in an industrial manufacturing environment show the efficiency of the methods.
Originality/value
The contribution of this paper consists of an industrial solution to automatic robot calibration and workpiece pose estimation for automatic semiconductor and gene‐chip microarray fabrication.