Multichannel audio biofeedback for dynamical coupling between prosthetic hands and their users

José González (Graduate School of Engineering, Chiba University, Chiba, Japan)
Wenwei Yu (Graduate School of Engineering, Chiba University, Chiba, Japan)
Alejandro Hernandez Arieta (Department of Informatics, University of Zurich, Zurich, Switzerland)

Industrial Robot

ISSN: 0143-991x

Publication date: 8 March 2010



Purpose – It is widely agreed that amputees must rely on visual input to monitor and control the position of a prosthesis while reaching and grasping, because of the lack of proprioceptive feedback. Visual information has therefore been a prerequisite in prosthetic hand biofeedback studies, and the underlying characteristics of the other artificial feedback methods used to date, such as auditory, electro-tactile, or vibro-tactile feedback, have not been clearly explored. The purpose of this paper is to explore whether audio feedback alone can convey more than one independent variable (multichannel) simultaneously, without relying on vision, to improve the learning of new perceptions: in this case, to learn and understand the artificial proprioception of a prosthetic hand while reaching.


Design/methodology/approach – Experiments were conducted to determine whether audio signals can serve as a multi-variable, dynamical sensory substitution during reaching movements without relying on visual input. Two groups were tested: the first received only audio information and the second only visual information to convey computer-simulated trajectories of two fingers.
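The abstract does not specify how the two finger trajectories were encoded as sound. A minimal sketch of one plausible multichannel scheme, with each simulated trajectory driving the pitch of a sine tone on its own stereo channel, could look like the following (the mapping, frequency range, and parameter values are illustrative assumptions, not the authors' actual method):

```python
import numpy as np

def sonify_trajectories(theta1, theta2, fs=8000, f_lo=220.0, f_hi=880.0):
    """Map two normalized finger trajectories (values in [0, 1], one sample
    per control step) to the instantaneous frequencies of two sine tones,
    one per stereo channel. Returns an (N, 2) array of samples in [-1, 1]."""
    theta1 = np.asarray(theta1, dtype=float)
    theta2 = np.asarray(theta2, dtype=float)
    # Linear pitch mapping: fully open (0) -> f_lo, fully closed (1) -> f_hi.
    f1 = f_lo + (f_hi - f_lo) * theta1
    f2 = f_lo + (f_hi - f_lo) * theta2
    dt = 1.0 / fs
    # Integrate frequency into phase so pitch changes are click-free.
    phase1 = 2 * np.pi * np.cumsum(f1) * dt
    phase2 = 2 * np.pi * np.cumsum(f2) * dt
    return np.column_stack([np.sin(phase1), np.sin(phase2)])

# Example: a one-second simulated reach in which finger 1 closes faster.
t = np.linspace(0, 1, 8000)
stereo = sonify_trajectories(t**2, t)  # shape (8000, 2)
```

A listener hears each finger as an independently rising tone in one ear, which is one way two variables can be presented simultaneously through audio alone.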


Findings – The results show that auditory feedback can convey artificial proprioceptive information in place of vision as a guide, thus assisting users in internalizing new perceptions.


Originality/value – In this way, the strengths and weaknesses of auditory feedback can be observed and used to improve future feedback systems or schemes, which could integrate different feedback methods to provide more information to the user.



González, J., Yu, W. and Hernandez Arieta, A. (2010), "Multichannel audio biofeedback for dynamical coupling between prosthetic hands and their users", Industrial Robot, Vol. 37 No. 2, pp. 148-156.




Emerald Group Publishing Limited

Copyright © 2010, Emerald Group Publishing Limited
