Search results

1 – 10 of over 4000
Article
Publication date: 1 December 1995

Don Braggins

Discusses the background of robot vision systems and examines why vision‐guided motion for robots hasn't lived up to the early promise. Outlines the different types of robot vision…

Abstract

Discusses the background of robot vision systems and examines why vision‐guided motion for robots hasn't lived up to the early promise. Outlines the different types of robot vision available and considers the limitations of “computer vision” in most commercial applications. Looks at the difficulties of making effective use of information from a two‐dimensional vision system to guide a robot working in a three‐dimensional environment, and at some of the possible solutions. Discusses future developments and concludes that, in the short term, it is probably the opening up of programming to a larger group of potential users, with the facility of a graphical user interface, that will have the greatest impact on the uptake of vision for robots.

Details

Industrial Robot: An International Journal, vol. 22 no. 6
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 1 January 1991

D.F.H. Wolfe, S.W. Wijesoma and R.J. Richards

Tasks in automated manufacturing and assembly increasingly involve robot operations guided by vision systems. The traditional “look‐and‐move” approach to linking machine vision…

Abstract

Tasks in automated manufacturing and assembly increasingly involve robot operations guided by vision systems. The traditional “look‐and‐move” approach to linking machine vision systems and robot manipulators which is generally used in these operations relies heavily on accurate camera to real‐world calibration processes and on highly accurate robot arms with well‐known kinematics. As a consequence, the cost of robot automation has not been justifiable in many applications. This article describes a novel real‐time vision control strategy giving “eye‐to‐hand co‐ordination” which offers good performance even in the presence of significant vision system miscalibrations and kinematic model parametric errors. This strategy offers the potential for low cost vision‐guided robots.
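
The tolerance to miscalibration that the authors describe is the hallmark of image-based visual servoing schemes in which the image Jacobian is estimated online rather than derived from a calibrated camera-robot model. The following is a minimal sketch of that general idea, assuming a Broyden-style Jacobian update; it is illustrative only, not the authors' actual control law, and the gains and interface functions are hypothetical.

```python
import numpy as np

def broyden_update(J, dq, de, alpha=0.5):
    """Refine the estimated image Jacobian from an observed joint step dq
    and the resulting change de in the image-feature error."""
    return J + alpha * np.outer(de - J @ dq, dq) / (dq @ dq + 1e-9)

def servo_step(J, e, gain=0.2):
    """Least-squares joint increment that drives the feature error e toward zero."""
    dq, *_ = np.linalg.lstsq(J, -gain * e, rcond=None)
    return dq

# Illustrative loop (robot/camera interfaces are hypothetical):
# J = 0.1 * np.eye(2, 3)              # deliberately rough initial guess
# e = measure_feature_error()          # pixel error between gripper and target
# while np.linalg.norm(e) > 1.0:
#     dq = servo_step(J, e)
#     move_joints(dq)
#     e_new = measure_feature_error()
#     J = broyden_update(J, dq, e_new - e)
#     e = e_new
```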

Details

Assembly Automation, vol. 11 no. 1
Type: Research Article
ISSN: 0144-5154

Article
Publication date: 23 November 2022

Chetan Jalendra, B.K. Rout and Amol Marathe

Industrial robots are extensively used in the robotic assembly of rigid objects, whereas the assembly of flexible objects using the same robot becomes cumbersome and challenging…

Abstract

Purpose

Industrial robots are extensively used in the robotic assembly of rigid objects, whereas the assembly of flexible objects using the same robot becomes cumbersome and challenging due to transient disturbance. The transient disturbance causes vibration in the flexible object during robotic manipulation and assembly. This is an important problem as the quick suppression of undesired vibrations reduces the cycle time and increases the efficiency of the assembly process. Thus, this study aims to propose a contactless robot vision-based real-time active vibration suppression approach to handle such a scenario.

Design/methodology/approach

A robot-assisted camera calibration method is developed to determine the extrinsic camera parameters with respect to the robot position. Thereafter, an innovative robot vision method is proposed to identify a flexible beam grasped by the robot gripper using a virtual marker and to obtain its dimensions, tip deflection and velocity. To model the dynamic behaviour of the flexible beam, the finite element method (FEM) is used. The measured dimensions, tip deflection and velocity of the flexible beam are fed to the FEM model to predict the maximum deflection. The difference between the maximum deflection and the static deflection of the beam is used to compute the maximum error. Subsequently, the maximum error is used in the proposed predictive maximum error-based second-stage controller to send the control signal for vibration suppression. The control signal, in the form of a trajectory, is communicated to the industrial robot controller, accommodating the various types of delays present in the system.
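
To make the error-driven step concrete, the sketch below computes the maximum error from a predicted peak deflection and turns it into a bounded corrective command. The paper's FEM prediction is replaced here by a single-mode oscillator approximation purely for illustration; the function names, gain and limit are assumptions, not the authors' implementation.

```python
import numpy as np

def predicted_peak_deflection(tip_deflection_m, tip_velocity_ms, omega1_rad_s):
    """Stand-in for the paper's FEM prediction: the peak amplitude of an undamped
    single-mode oscillator with initial displacement d and velocity v is
    sqrt(d**2 + (v / omega1)**2)."""
    return np.hypot(tip_deflection_m, tip_velocity_ms / omega1_rad_s)

def maximum_error(peak_deflection_m, static_deflection_m):
    """Maximum error = predicted maximum deflection minus static deflection."""
    return peak_deflection_m - static_deflection_m

def corrective_command(max_error_m, gain=0.8, limit_m=0.01):
    """Hypothetical second-stage command: a bounded trajectory offset
    proportional to the maximum error, to be sent to the robot controller."""
    return float(np.clip(-gain * max_error_m, -limit_m, limit_m))

# Example: 5 mm tip deflection, 40 mm/s tip velocity, 12 Hz first mode,
# 2 mm static deflection.
peak = predicted_peak_deflection(0.005, 0.040, 2 * np.pi * 12)
offset = corrective_command(maximum_error(peak, 0.002))
```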

Findings

The effectiveness and robustness of the proposed controller have been validated through simulation and experimental implementation on an Asea Brown Boveri (ABB) IRB 1410 industrial robot with a standard low frame rate camera sensor. In this experiment, two metallic flexible beams of different dimensions with the same material properties have been considered. The robot vision method measures the dimensions within an acceptable error limit, i.e. ±3%. The controller can suppress the vibration amplitude by up to approximately 97% in an average time of 4.2 s and reduces the stabilisation time by up to approximately 93% compared with the suppression time without control. The vibration suppression performance is also compared with the results of a classical control method and with some recent results available in the literature.

Originality/value

The important contributions of the current work are the following: an innovative robot-assisted camera calibration method is proposed to determine the extrinsic camera parameters, eliminating the need for any reference object such as a checkerboard; an approach for the robot vision method is developed to identify the object grasped by the robot gripper using a virtual marker and to measure its dimensions while accommodating the perspective view; the developed robot vision-based controller works along with the FEM model of the flexible beam to predict the tip position and helps in handling different dimensions and material types; an approach has been proposed to handle the different types of delays that are part of the implementation for effective suppression of vibration; the proposed method uses a low frame rate, low-cost camera for the second-stage controller, and the controller does not interfere with the internal controller of the industrial robot.

Details

Industrial Robot: the international journal of robotics research and application, vol. 50 no. 3
Type: Research Article
ISSN: 0143-991X

Keywords

Robotic assembly; Vibration suppression; Second-stage controller; Camera calibration; Flexible beam; Robot vision

Article
Publication date: 28 June 2018

Haibo Feng, Yanwu Zhai and Yili Fu

Surgical robot systems have been used in single-port laparoscopy (SPL) surgery to improve patient outcomes. This study aims to develop a vision robot system for SPL surgery to…

Abstract

Purpose

Surgical robot systems have been used in single-port laparoscopy (SPL) surgery to improve patient outcomes. This study aims to develop a vision robot system for SPL surgery to effectively improve the visualization of surgical robot systems for relatively complex surgical procedures.

Design/methodology/approach

In this paper, a new master-slave magnetic anchoring vision robotic system for SPL surgery was proposed. A lighting distribution analysis for the imaging unit of the vision robot was carried out to guarantee illumination uniformity in the workspace during SPL surgery. Moreover, cleaning force for the lens of the camera was measured to assess safety for an abdominal wall, and performance assessment of the system was performed.

Findings

Extensive experimental results for illumination, control, cleaning force and functionality tests have indicated that the proposed system delivers excellent performance in providing visual feedback.

Originality/value

The main contribution of this paper lies in the development of a magnetic anchoring vision robot system that successfully improves the ability to clean the lens and to avoid blind areas in the field of view.

Details

Industrial Robot: An International Journal, vol. 45 no. 4
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 1 December 1995

Justin Testa

Looks at the move towards integrating robots with high‐performance, fully programmable vision systems. Outlines the problems of traditional vision‐aided robotics and the advantage of…

Abstract

Looks at the move towards integrating robots with high‐performance, fully programmable vision systems. Outlines the problems of traditional vision‐aided robotics and the advantages of modern machine vision technology. The latest generation of machine vision systems combines the capabilities of “C” programming with graphical “point‐and‐click” application development environments based on Microsoft Windows: the Checkpoint system. Describes how the Checkpoint vision system works and the applications of the new vision‐guided robots. Concludes that the new systems now make it possible for users and system integrators to bring the advantages of vision‐guided robotics to general manufacturing.

Details

Industrial Robot: An International Journal, vol. 22 no. 6
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 1 April 1994

John Pretlove

Describes research into the use of external sensors for robot systems to allow them to react intelligently to unforeseen events in the production process and irregularities in…

Abstract

Describes research into the use of external sensors for robot systems to allow them to react intelligently to unforeseen events in the production process and irregularities in products. Examines the use of active vision systems with robot controllers and the integration of the two systems. Concludes that this enhances the ability of an industrial robot system to cope with variations and unforeseen circumstances in the workcell or the workpiece.

Details

Industrial Robot: An International Journal, vol. 21 no. 2
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 15 June 2012

Xi‐Zhang Chen, Yu‐Ming Huang and Shan‐ben Chen

Stereo vision technique simulates the function of the human eyes to observe the world, which can be used to compute the spatial information of weld seam in the robot welding…

Abstract

Purpose

The stereo vision technique simulates the function of the human eyes in observing the world and can be used to compute the spatial information of a weld seam in robot welding. A typical application is to fix two cameras on the end effector of the robot when stereo vision is used in intelligent robot welding. In order to analyse the effect of the vision system configuration on vision computing, an accuracy analysis model of vision computing is constructed, which provides a good guide for the construction and application of stereo vision systems in the welding robot field.

Design/methodology/approach

A typical stereo vision system fixed on a welding robot is designed and constructed to compute the position of the spatial seam. A simplified error analysis model for two arbitrarily placed cameras is built to analyse the effect of the sensors' structural parameters on vision computing accuracy. The methodology of model analysis and experimental verification is used in the research, and experiments on image extraction and robot movement accuracy are also designed to analyse the effect of equipment accuracy and the related processing procedures in the vision technology.
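
For context on the computation such a two-camera sensor performs, the sketch below shows standard linear (DLT) triangulation of a single seam point from two calibrated views. It is a generic illustration rather than the authors' error model, and it assumes the 3x4 projection matrices come from a prior calibration.

```python
import numpy as np

def triangulate_seam_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one seam point.

    P1, P2 : 3x4 projection matrices of the two cameras (intrinsics x extrinsics).
    x1, x2 : (u, v) pixel coordinates of the same seam point in each image.
    Returns the 3D point in the common reference frame (e.g. the robot base).
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenise
```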

Findings

The effects of the repeatability positioning accuracy and the tool centre point (TCP) calibration error of the welding robot on visual computing are also analysed and tested. The results show that the effect of repeatability on computing accuracy is no bigger than 0.3 mm. However, the TCP affects the computing accuracy greatly: when the calibrated error of the TCP is bigger than 0.5, re-calibration is necessary. The accuracy analysis and experimental technique in this paper can guide research on three-dimensional information computing by stereo vision and improve computing accuracy.

Originality/value

The accuracy of the seam position information is affected by many interacting factors. Systematic experiments and a simplified error analysis model are designed and established; the main factors, such as the sensor's configuration parameters, the accuracy of the arc welding robot and the accuracy of image recognition, are included in the model and the experiments. The model and experimental method are significant for the design of visual sensors and the improvement of computing accuracy.

Details

Industrial Robot: An International Journal, vol. 39 no. 4
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 1 December 2005

Christine Connolly

Examines a recently launched integration of smart cameras into industrial robots to make them responsive to a changing environment.

Abstract

Purpose

Examines a recently launched integration of smart cameras into industrial robots to make them responsive to a changing environment.

Design/methodology/approach

Reviews the capabilities of the vision‐enabled robot, citing installations in Sweden and the UK, then describes the robot and vision programming procedure.

Findings

Vision integration opens up a range of new possibilities such as simultaneous product handling and inspection, as well as providing real‐time robot guidance. Standardisation plays an extremely valuable role in building integrated systems from disparate technological elements. Here ActiveX web standards, ethernet connectivity, a standard interchangeable family of cameras and a common controller for a whole range of robots are the keys to the synthesis of a powerful new combination of robot and machine vision.

Originality/value

Draws to the attention of industrial engineers the availability of a family of robots with integrated machine vision.

Details

Industrial Robot: An International Journal, vol. 32 no. 6
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 11 January 2008

Richard Bloss

This paper aims to review the 2007 Bi‐Annual International Robotics and Vision Show in Chicago.

Abstract

Purpose

This paper aims to review the 2007 Bi‐Annual International Robotics and Vision Show in Chicago.

Design/methodology/approach

The paper presents in‐depth interviews with exhibitors of robots and robotic vision systems.

Findings

The marriage of vision and robotics is changing the nature of robotics. In the past, a robot was a “dumb” mechanism which followed pre‐programmed directions. With the addition of vision and added computing power, the robot is starting to “find its way” in the application. Innovations in grippers, robotic design and simulation also shared the stage.

Practical implications

Applications, which previously did not lend themselves to a robotic solution because the requirement could not be completely pre‐programmed, may now be possible. With vision, the robot can adjust for uncertainties in real‐time and perform other tasks such as inspection.

Originality/value

This paper reviews the 2007 Bi‐Annual International Robotics and Vision Show, Chicago.

Details

Industrial Robot: An International Journal, vol. 35 no. 1
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 28 March 2023

Cengiz Deniz

The aim of this study is to create a robust and simple collision avoidance approach based on quaternion algebra for vision-based pick and place applications in manufacturing…

Abstract

Purpose

The aim of this study is to create a robust and simple collision avoidance approach based on quaternion algebra for vision-based pick and place applications in manufacturing industries, specifically for use with industrial robots and collaborative robots (cobots).

Design/methodology/approach

In this study, an approach based on quaternion algebra is developed to prevent any collision or breakdown during the movements of industrial robots or cobots in vision-based pick and place applications. The algorithm, integrated into the control system, checks for collisions before the robot moves its end effector to the target position during the process flow. In addition, a hand–eye calibration method is presented to easily calibrate the camera and define the geometric relationship between the camera and robot coordinate systems.
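
As a rough illustration of the quaternion bookkeeping involved once a hand–eye calibration is available, the sketch below maps a point detected in the camera frame into the robot base frame using quaternion rotation. It assumes Hamilton-convention quaternions in (w, x, y, z) order and does not reproduce the paper's collision-checking algorithm.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def rotate(q, v):
    """Rotate a 3-vector v by the unit quaternion q (q * v * q_conj)."""
    qv = np.concatenate(([0.0], v))
    return quat_mul(quat_mul(q, qv), quat_conj(q))[1:]

def camera_point_to_base(q_base_cam, t_base_cam, p_cam):
    """Express a camera-frame point in the robot base frame using the
    hand-eye calibration result (rotation q_base_cam, translation t_base_cam)."""
    return rotate(q_base_cam, np.asarray(p_cam)) + np.asarray(t_base_cam)
```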

Findings

This approach, specifically designed for vision-based robot/cobot applications, can be used by developers and robot integrator companies to significantly reduce the application cost and project timeline of a pick and place robotic system installation. Furthermore, the approach ensures a safe, robust and highly efficient implementation for robotic vision applications across industries.

Originality/value

The algorithm for this approach, which can run on a robot controller or a programmable logic controller, has been tested in real time in vision-based robotics applications. It can be applied to both existing and new vision-based pick and place projects with industrial robots or collaborative robots with minimal effort, making it a cost-effective and efficient solution for various industries.

Details

Industrial Robot: the international journal of robotics research and application, vol. 50 no. 5
Type: Research Article
ISSN: 0143-991X

1 – 10 of over 4000