Purpose – To come up with a novel methodology for online recognition and classification of pieces in robotic assembly tasks and its application in an intelligent manufacturing cell.
Design/methodology/approach – The performance of industrial robots working in unstructured environments can be improved using visual perception and learning techniques. Object recognition is accomplished using an artificial neural network (ANN) architecture that receives a descriptive vector, called CFD&POSE, as its input. Experiments were carried out within a manufacturing cell using assembly parts.
Findings – This vector represents an innovative methodology for the classification and identification of pieces in robotic tasks, providing fast recognition and pose-estimation information in real time. The vector compresses 3D object data from assembly parts; it is invariant to scale, rotation and orientation, and it also tolerates a wide range of illumination levels.
Research limitations/implications – Provides vision guidance in assembly tasks. Current work addresses the use of ANNs for assembly and object recognition separately; future work is oriented towards using the same neural controller for all sensorial modes.
Practical implications – Intelligent manufacturing cells developed with multimodal sensor capabilities might use this methodology for future industrial applications, including robotic fixtureless assembly. The approach, in combination with the fast learning capability of ART networks, indicates its suitability for industrial robot applications, as demonstrated through experimental results.
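The fast learning capability of ART networks mentioned above refers to the family's ability to commit and stabilize categories in a single pass. The paper's exact network configuration is not given here, so the following is only a minimal sketch of a generic Fuzzy ART classifier (not the authors' implementation): inputs are complement-coded, candidate categories are ranked by a choice function, a vigilance test decides between resonance and committing a new category, and fast learning uses a learning rate of 1.

```python
import numpy as np

def complement_code(x):
    """Standard Fuzzy ART preprocessing: concatenate x with its complement 1 - x."""
    x = np.asarray(x, dtype=float)
    return np.concatenate([x, 1.0 - x])

class FuzzyART:
    """Minimal Fuzzy ART sketch with one-pass ('fast') learning (beta = 1)."""
    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho = rho      # vigilance: higher -> narrower categories
        self.alpha = alpha  # choice parameter
        self.beta = beta    # learning rate (1.0 = fast learning)
        self.w = []         # committed category weight vectors

    def train(self, x):
        i = complement_code(x)
        # Rank committed categories by choice function T_j = |i ^ w_j| / (alpha + |w_j|),
        # where ^ is the fuzzy AND (elementwise minimum).
        order = sorted(range(len(self.w)),
                       key=lambda j: -np.minimum(i, self.w[j]).sum()
                                     / (self.alpha + self.w[j].sum()))
        for j in order:
            match = np.minimum(i, self.w[j]).sum() / i.sum()
            if match >= self.rho:
                # Resonance: update the winning category's weights.
                self.w[j] = (self.beta * np.minimum(i, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        # No category passed vigilance: commit a new one.
        self.w.append(i.copy())
        return len(self.w) - 1
```

With high vigilance, two dissimilar descriptor vectors are committed as separate categories in one presentation each, and a repeated input is routed back to its original category, which is what makes this family attractive for online recognition.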
Originality/value – This paper introduces a novel method that uses collections of 2D images to obtain very fast feature data – the "current frame descriptor vector" – of an object, using image projections and canonical-form geometry grouping for invariant object recognition.
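The exact construction of the paper's descriptor is not reproduced here, so the following is only a hypothetical sketch of a projection-based shape descriptor in the same spirit: it sums a binary silhouette along rows and columns, resamples each profile to a fixed number of bins, and normalizes the result. This illustrates scale invariance via normalization only; the paper's rotation and orientation invariance would require additional steps not shown.

```python
import numpy as np

def projection_descriptor(binary_img, bins=16):
    """Hypothetical projection-based descriptor (not the paper's exact CFD&POSE).

    Sums the silhouette along rows and columns, resamples each profile onto a
    fixed number of bins, and normalizes to unit sum for scale invariance.
    """
    img = np.asarray(binary_img, dtype=float)
    rows, cols = img.sum(axis=1), img.sum(axis=0)

    def resample(profile, n):
        # Linear interpolation of the profile onto n evenly spaced points.
        return np.interp(np.linspace(0.0, 1.0, n),
                         np.linspace(0.0, 1.0, len(profile)), profile)

    d = np.concatenate([resample(rows, bins), resample(cols, bins)])
    total = d.sum()
    return d / total if total > 0 else d
```

The same shape rendered at two different image scales then yields approximately the same fixed-length descriptor, which is the property that lets a single trained classifier handle objects at varying distances from the camera.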
Peña-Cabrera, M., Lopez-Juarez, I., Rios-Cabrera, R. and Corona-Castuera, J. (2005), "Machine vision approach for robotic assembly", Assembly Automation, Vol. 25 No. 3, pp. 204-216. https://doi.org/10.1108/01445150510610926
Copyright © 2005, Emerald Group Publishing Limited