Issues and experimental results in vision‐guided robotic grasping of static or moving objects

Nikolaos Papanikolopoulos (Associate Professor, Department of Computer Science and Engineering, University of Minnesota, 200 Union Street SE, 4‐192 EE/CS Building, Minneapolis, MN 55455, USA. Tel: 00 1 612 625 0163; Fax: 00 1 612 625 0572; E‐mail: npapas@cs.umn.edu)
Christopher E. Smith (Assistant Professor, Department of Computer Science and Engineering, University of Colorado, Campus Box 109, PO Box 173364, Denver, CO, USA. Tel: 00 1 303 556 4314; Fax: 00 1 303 556 8369; E‐mail: chsmith@carbon.cudenver.edu)

Industrial Robot

ISSN: 0143-991x

Publication date: 1 April 1998

Abstract

Many research efforts have turned to sensing, and in particular computer vision, to create more flexible robotic systems. Computer vision is often required to provide the data needed for grasping a target. Using a vision system for the grasping of static or moving objects raises several issues with respect to sensing, control, and system configuration. This paper presents some of these issues, in concert with the options available to the researcher and the trade-offs to be expected when integrating a vision system with a robotic system for the purpose of grasping objects. The paper includes a description of our experimental system and contains experimental results from a particular configuration that characterize the type and frequency of errors encountered while performing various vision-guided grasping tasks. These error classes and their frequencies of occurrence lend insight into the problems encountered during visual grasping and into possible solutions to these problems.

Citation

Papanikolopoulos, N. and Smith, C. (1998), "Issues and experimental results in vision‐guided robotic grasping of static or moving objects", Industrial Robot, Vol. 25 No. 2, pp. 134-140. https://doi.org/10.1108/01439919810204748

Publisher: MCB UP Ltd

Copyright © 1998, MCB UP Limited
