The purpose of this paper is to present an interactive industrial robotic system that can assist a “layperson” in re‐casting a generic pick‐and‐place application. A user programs such an application simply by pointing at objects in the work area and speaking simple, intuitive natural‐language commands.
The system was implemented in C# using the EMGU wrapper classes for OpenCV together with the MS Speech Recognition API. The target language to be recognized was modelled with traditional augmented transition networks, implemented as XML grammars. The authors developed an original finger‐pointing algorithm based on a unique combination of standard morphological and image‐processing techniques. A recognized voice command triggers the vision component to capture what the user is pointing at; if the specified action requires robot movement, the required information is sent to the robot‐control component of the system, which transmits the commands to the robot controller for execution.
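The abstract does not reproduce the authors' actual grammar. As a purely illustrative sketch, a command such as “pick up this part” could be expressed in a W3C SRGS XML grammar, the grammar format consumed by the MS Speech Recognition API; the vocabulary below is hypothetical:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Hypothetical fragment: the paper's real command set is not
     given in the abstract. -->
<grammar xmlns="http://www.w3.org/2001/06/grammar"
         version="1.0" xml:lang="en-US" root="command">
  <rule id="command">
    <one-of>
      <item>pick up <ruleref uri="#object"/></item>
      <item>place it <ruleref uri="#location"/></item>
    </one-of>
  </rule>
  <rule id="object">
    <one-of>
      <item>this part</item>
      <item>that part</item>
    </one-of>
  </rule>
  <rule id="location">
    <one-of>
      <item>here</item>
      <item>over there</item>
    </one-of>
  </rule>
</grammar>
```

Transition-network states map naturally onto SRGS rules, with `ruleref` elements playing the role of network edges into sub-networks.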
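The abstract does not detail how the pointing vector is constructed. One common approach, sketched below as an assumption rather than the authors' actual method, is to take the ray from the hand centroid through the fingertip, where the fingertip is taken to be the silhouette pixel farthest from the centroid; the paper's implementation is in C# with EMGU/OpenCV, but a minimal pure-Python version conveys the idea:

```python
import math

def pointing_target(hand_pixels, object_centroids):
    """Pick the object a segmented hand is pointing at.

    hand_pixels: iterable of (x, y) pixels in the binary hand mask.
    object_centroids: list of (x, y) object centres from the vision stage.
    Returns the index of the object whose centre lies closest to the
    pointing ray, or None if no object lies ahead of the fingertip.
    """
    pts = list(hand_pixels)
    # Hand centroid: mean of all mask pixels.
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    # Fingertip: the mask pixel farthest from the centroid
    # (an extended index finger dominates the silhouette).
    tip = max(pts, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
    # Pointing direction: centroid -> fingertip, normalised.
    dx, dy = tip[0] - cx, tip[1] - cy
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm
    best, best_dist = None, float("inf")
    for i, (ox, oy) in enumerate(object_centroids):
        # Project the object centre onto the ray from the fingertip.
        t = (ox - tip[0]) * dx + (oy - tip[1]) * dy
        if t <= 0:  # behind the fingertip: not pointed at
            continue
        # Perpendicular distance from the object centre to the ray.
        dist = abs((ox - tip[0]) * dy - (oy - tip[1]) * dx)
        if dist < best_dist:
            best, best_dist = i, dist
    return best
```

In a real pipeline the mask would come from morphological cleaning (erosion/dilation) of a segmented hand region before the centroid and fingertip are extracted.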
The voice portion of the system was tested on the factory floor in a “typical” manufacturing environment, right at the maximum allowable average decibel level specified by OSHA. The findings show that a standard, modern MS Speech API voice‐recognition system can achieve 100 per cent accuracy on simple commands, although at the measured average noise level of 89 decibels one out of every six commands had to be repeated. The vision component was tested on 72 subjects who had no prior knowledge of this work. The system accurately recognized what the test subjects were pointing at 95 per cent of the time, within five seconds of hand readjustment.
The vision component suffers from the “typical” machine‐vision problems: very shiny surfaces, very poor contrast between the pointing hand and the background, and occlusions. Currently the system can handle only a limited amount of depth variation, recovered using a spring‐mounted gripper. A second camera (future work) needs to be incorporated in order to handle large depth variations in the work area.
This system could have a major impact on how factory‐floor workers interact with robotic equipment.
The testing of the voice system on a factory floor, although simple, is very important: it proves the viability of this component and debunks arguments that factories are simply too noisy for current voice technology. The unique finger‐pointing algorithm developed by the authors is also an important contribution to the field, in particular the manner in which the pointing vector is constructed. Furthermore, very few papers report results of non‐experts using their pointing algorithms; this paper reports concrete results showing that the system is intuitive and user friendly to “laypersons”.
van Delden, S., Umrysh, M., Rosario, C. and Hess, G. (2012), "Pick‐and‐place application development using voice and visual commands", Industrial Robot, Vol. 39 No. 6, pp. 592-600. https://doi.org/10.1108/01439911211268796
Copyright © 2012, Emerald Group Publishing Limited