Search results
1 – 10 of 153

Wenzhen Yang, Johan K. Crone, Claus R. Lønkjær, Macarena Mendez Ribo, Shuo Shan, Flavia Dalia Frumosu, Dimitrios Papageorgiou, Yu Liu, Lazaros Nalpantidis and Yang Zhang
Abstract
Purpose
This study aims to present a vision-guided robotic system design for application in vat photopolymerization additive manufacturing (AM), enabling the hybridization of vat photopolymerization AM with the injection molding process.
Design/methodology/approach
In the system, a robot equipped with a camera and a custom-made gripper, and driven by a visual servoing (VS) controller, is expected to perceive objects, handle variation, connect the multiple process steps of the soft tooling process and automate vat photopolymerization AM. Meanwhile, the vat photopolymerization AM printer is customized in both hardware and software to interact with the robotic system.
Findings
With the ArUco marker-based vision-guided robotic system, the printing platform can be manipulated quickly and robustly from an arbitrary initial position, which constitutes the first step in exploring the automation of vat photopolymerization AM hybridized with the soft tooling process.
Originality/value
The vision-guided robotic system monitors and controls the vat photopolymerization AM process, which has potential for hybridizing vat photopolymerization AM with other mass production methods, for instance injection molding.
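The marker-based pose recovery that such a system relies on can be illustrated with a small sketch. Assuming the marker corners have already been detected in the image (for instance with OpenCV's ArUco module), a planar pose (rotation and translation) can be recovered from the corner correspondences via a Procrustes/Kabsch fit. All numeric values below are illustrative, not from the paper:

```python
import numpy as np

def planar_pose_from_corners(model_pts, observed_pts):
    """Recover the 2D rotation R and translation t that map the marker's
    model corners onto their observed positions (Kabsch/Procrustes fit)."""
    mc = model_pts.mean(axis=0)
    oc = observed_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

# Square marker corners in the marker's own frame (unit side length).
model = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])

# The same corners as seen by the camera: rotated 90 degrees and shifted.
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
observed = model @ R_true.T + np.array([5.0, 2.0])

R, t = planar_pose_from_corners(model, observed)
angle = np.degrees(np.arctan2(R[1, 0], R[0, 0]))  # recovered yaw of the platform
```

In a real vat photopolymerization cell, `observed` would come from the marker detector and the recovered pose would drive the visual servoing loop toward the printing platform.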
Abstract
There are many methods and solutions for improving a process and keeping it within established control limits. Statistically analyzed data, received from inspection sensors and other devices, are one of them. Automated inspection sensors and systems are effective tools for controlling variation and obtaining process-related knowledge. The automated inspection methods discussed in this paper represent an important, though not the only, route to process improvement, the ultimate goal of total quality management and control (TQM/TQC). Inspection, as part of the feedback control loop of the overall TQM/TQC process, involves the continual satisfaction of customer requirements at lowest cost by harnessing the efforts of everybody in the company. The key question in any inspection system is as follows: Are the measured values within tolerance, or not, and if they are outside the tolerance limits, why did we produce those parts in the first place?
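The in-tolerance question at the end of the abstract is, in its simplest statistical form, a control-limit check. A minimal sketch using Shewhart-style mean ± 3σ limits follows; the baseline measurements are hypothetical:

```python
import statistics

def control_limits(samples, k=3.0):
    """Shewhart-style limits: sample mean plus/minus k standard deviations."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mu - k * sigma, mu + k * sigma

def out_of_control(samples, new_value, k=3.0):
    """True if a new measurement falls outside the control limits."""
    lo, hi = control_limits(samples, k)
    return not (lo <= new_value <= hi)

# Baseline measurements from an in-control process (hypothetical values).
baseline = [10.02, 9.98, 10.01, 9.99, 10.00, 10.03, 9.97, 10.00]
```

An automated inspection sensor would feed each new measurement through a check like `out_of_control(baseline, reading)` and trigger the TQM/TQC feedback loop when a reading is flagged.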
Abstract
Purpose
This paper aims to provide a background to the use of robots by the automotive industry and describe a number of applications which illustrate the capabilities and importance of robotic machine vision technology.
Design/methodology/approach
Following an historical background to the use of robots by the automotive industry, this paper discusses a selection of applications which involve the use of robotic machine vision. Brief conclusions are drawn.
Findings
This shows that robotic vision technology is playing an important and growing role within the automotive industry and can yield improved product quality and greater productivity.
Originality/value
This paper illustrates how robots equipped with machine vision are contributing to the automotive industry's needs for greater productivity and improved quality.
Abstract
Purpose
This paper aims to review the use of imaging technologies in robotics, with an emphasis on inspection applications and the control of autonomous robots.
Design/methodology/approach
Following a brief introduction, this paper first considers vision‐based robotic inspection systems and highlights a selection of recent applications. Second, it considers the use of vision in autonomous robot navigation and discusses some of the challenges and recent developments.
Findings
This shows that developments in machine vision have led to vision systems being used in a diversity of component‐level and in‐service robotic inspection tasks. It also illustrates that vision systems have a key role to play in the emerging generation of autonomous, mobile robots.
Originality/value
This paper provides a review of recent developments in vision‐based robotic inspection and autonomous, mobile robot navigation.
Erhan Ada, Halil Kemal Ilter, Muhittin Sagnak and Yigit Kazancoglu
Abstract
Purpose
The main aim of this study is to understand the role of smart technologies and show the rankings of various smart technologies in collection and classification of electronic waste (e-waste).
Design/methodology/approach
This study presents a framework integrating the concepts of collection and classification mechanisms and smart technologies. The criteria set comprises three main criteria (economic, social and environmental), with a total of 15 subcriteria. The smart technologies identified in this study were robotics, multiagent systems, autonomous tools, smart vehicles, data-driven technologies, the Internet of Things (IoT), cloud computing and big data analytics. The weights of all criteria were found using the fuzzy analytic network process (ANP), and the scores of the smart technologies useful for the collection and classification of e-waste were calculated using fuzzy VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR).
Findings
The most important criterion was found as collection cost, followed by pollution prevention and control, storage/holding cost and greenhouse gas emissions in collection and classification of e-waste. Autonomous tools were found as the best smart technology for collection and classification of e-waste, followed by robotics and smart vehicles.
Originality/value
The originality of the study is to propose a framework, which integrates the collection and classification of e-waste and smart technologies.
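The VIKOR ranking step can be sketched compactly. The sketch below uses crisp VIKOR rather than the paper's fuzzy variant (a fuzzy version would replace each score with a triangular fuzzy number and defuzzify before ranking), and the decision matrix, weights and alternative labels are hypothetical:

```python
import numpy as np

def vikor(F, w, v=0.5):
    """Rank alternatives with (crisp) VIKOR.
    F: alternatives x criteria matrix, higher = better; w: criterion weights;
    v: weight of group utility versus individual regret."""
    f_best = F.max(axis=0)
    f_worst = F.min(axis=0)
    # Weighted, normalized distance from the ideal on each criterion.
    D = w * (f_best - F) / (f_best - f_worst)
    S = D.sum(axis=1)   # group utility
    R = D.max(axis=1)   # individual regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return Q  # lower Q = better compromise solution

# Hypothetical scores for three technologies on three criteria.
F = np.array([[0.9, 0.8, 0.8],   # autonomous tools
              [0.8, 0.8, 0.6],   # robotics
              [0.6, 0.9, 0.5]])  # smart vehicles
w = np.array([0.5, 0.3, 0.2])    # hypothetical fuzzy-ANP weights
Q = vikor(F, w)
ranking = np.argsort(Q)  # indices of alternatives, best first
```

With these illustrative inputs the ordering (autonomous tools, then robotics, then smart vehicles) happens to match the paper's reported ranking, but the numbers themselves are invented.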
Nikolaos Papanikolopoulos and Christopher E. Smith
Abstract
Many research efforts have turned to sensing, and in particular computer vision, to create more flexible robotic systems. Computer vision is often required to provide data for the grasping of a target. Using a vision system for grasping of static or moving objects presents several issues with respect to sensing, control, and system configuration. This paper presents some of these issues in concept with the options available to the researcher and the trade‐offs to be expected when integrating a vision system with a robotic system for the purpose of grasping objects. The paper includes a description of our experimental system and contains experimental results from a particular configuration that characterize the type and frequency of errors encountered while performing various vision‐guided grasping tasks. These error classes and their frequency of occurrence lend insight into the problems encountered during visual grasping and into the possible solution of these problems.
Abstract
Looks at the move towards integrating robots with high-performance, fully programmable vision systems. Outlines the problems of traditional vision‐aided robotics and the advantages of modern machine vision technology. The latest generation of machine vision systems combines the capabilities of "C" programming with graphical "point‐and‐click" application development environments based on Microsoft Windows: the Checkpoint system. Describes how the Checkpoint vision system works and the applications of the new vision‐guided robots. Concludes that the new systems now make it possible for users and system integrators to bring the advantages of vision‐guided robotics to general manufacturing.
Abstract
Describes an application using machine vision as a sensing device to allow robots to adapt according to their use. Describes the application in focus, lists the requirements of the cell and provides the solution. Provides details of the system used, emphasizing in particular its adaptability.
Abstract
Purpose
The capability to perform dexterous operations in an autonomous manner would greatly enhance the productivity of robotic operations. In this paper, we present a new methodology for vision‐based grasping of objects or parts using a three‐finger hand as a gripper of a robotic manipulator.
Design/methodology/approach
The hand employed in our work, called SARAH, was designed for robotic operations on the space station, however, the main steps of our procedure can be applied for tasks in a manufacturing environment. Our methodology involves two principal stages: automatic synthesis of grasps for planar and revolute objects with SARAH and vision‐based pose estimation of the object to be grasped. For both stages, we assume that a model of the object is available off‐line.
Findings
In the paper, numerical results are presented for grasp synthesis of several objects with SARAH to demonstrate the feasibility and optimality of the synthesized grasps. Experimental results are also obtained with SARAH as the end‐effector of a seven‐degree‐of‐freedom robotic arm, demonstrating the feasibility of the integrated vision‐based grasping.
Research limitations/implications
The methodology described in the paper, although it represents a substantial step towards automated grasping with a robotic manipulator, still requires some decision-making from the user. Further work can improve the pose identification aspects of the algorithm to make them more robust and free of human intervention. The grasp synthesis procedure can also be expanded to handle more complex and possibly moving objects, and to allow grasp types other than those considered here.
Practical implications
The work demonstrates the feasibility of autonomous grasp execution in an industrial setting by using a three‐finger hand as a robotic gripper.
Originality/value
The results presented in the paper demonstrate the feasibility of synthesising optimised grasps which take into account the kinematics of the gripper. We also demonstrate a real implementation of vision‐based grasping by using a robotic manipulator with a three‐finger hand.
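The two-stage structure described above (offline grasp synthesis, then online vision-based pose estimation) can be sketched as: select the highest-scoring precomputed grasp for the object model, then map its contact points into the robot frame using the pose the vision system estimated. The grasp candidates, scores and pose below are illustrative, not SARAH's actual data:

```python
import numpy as np

# Each candidate grasp: three finger contact points in the object's model
# frame, plus an offline quality score from grasp synthesis (hypothetical).
grasps = [
    {"contacts": np.array([[0.05, 0.00, 0.0],
                           [-0.05, 0.00, 0.0],
                           [0.00, 0.05, 0.0]]), "score": 0.8},
    {"contacts": np.array([[0.04, 0.04, 0.0],
                           [-0.04, 0.04, 0.0],
                           [0.00, -0.05, 0.0]]), "score": 0.6},
]

def best_grasp_in_robot_frame(grasps, R, t):
    """Pick the highest-scoring offline grasp and map its contact points
    into the robot frame using the vision-estimated object pose (R, t)."""
    g = max(grasps, key=lambda g: g["score"])
    return g["contacts"] @ R.T + t

R = np.eye(3)                   # object not rotated in this example
t = np.array([0.3, 0.1, 0.02])  # estimated object position on the workbench
contacts = best_grasp_in_robot_frame(grasps, R, t)  # finger targets for the hand
```

Separating the expensive grasp synthesis (offline, per object model) from the cheap pose transform (online, per observation) is what makes the approach usable in a live manipulation loop.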