Kybernetes
ISSN: 0368-492X
Article publication date: 1 July 1999

Citation: Rudall, B.H. (1999), "Language interface", Kybernetes, Vol. 28 No. 5. https://doi.org/10.1108/k.1999.06728eaa.001

Publisher: Emerald Group Publishing Limited
Copyright © 1999, MCB UP Limited


Language interface

Keywords Automation, Cybernetics, Research, Technological developments

Abstract Reports and surveys are given of selected current research and development in systems and cybernetics. They include: Language interface, Automated automobile, Innovative space technology, Software reliability and safety, Automatic analysis of handwritten documents, High-tech musical instruments, Biological motors, Interplay between smell and the mind, Cybernetics and automation.

Language interface

The importance of gesticulating when communicating with other humans, or even with machines, has been underestimated by many researchers involved in building interfaces. There is no doubt that gesticulating as you speak conveys a great deal of extra meaning, which in the case of human interaction is unconsciously understood. Nowadays even human-machine interfaces take account of this natural human communicative action.

A recent report by psychologists at the University of Manchester, UK, says that audiences unconsciously absorb gestures. It suggests that:

Waggling, waving, twiddling and rotating gestures may appear to be distracting but even the most meaningless of movements add sufficient extra detail to improve understanding by 10 per cent, tests have shown.

For psychologists this research may explain why so many people make gestures while they speak. One theory suggests that gesturing helps speakers process their ideas into language, whilst another holds that a gesture is of value to the listener rather than to the speaker.

The tests used to back up the Manchester University researchers' findings included filming people as they described a cartoon sequence they had seen. One group of viewers was shown the recorded description without its soundtrack, so that the only source of information was the speaker's accompanying gestures. Another group saw the recording with the soundtrack intact. A third group was played only the soundtrack. Each member of the test audience was then questioned about the cartoon. The results were:

  • the group that heard the description were able to answer half the questions correctly;

  • those who only saw the gestures answered one in five of the questions asked; and

  • those who saw and heard answered more than 60 per cent of the questions asked.

Professor Geoffrey Beattie of Manchester University, who led the research team, says that:

The gestures had conveyed information about size, space, speeds, angles and methods of doing things. Despite the fact that they looked like nothing but a repetitive hand motion they were nevertheless packed with meaning. Even though you think you just waggle your hands around there are all kinds of things being communicated. The gesture is not an attempt to repair something that is going wrong in speech. Part of the meaning is going quite naturally into gesture.

In a paper published in the British Journal of Psychology (February 1999), Professor Beattie describes an experiment showing that gesticulating is of little help to the speaker. The report says that:

People were asked to think of the word that matched a definition, such as "an implement with which you measure angles", and were filmed as they did so. Half the group were told to sit on their hands while answering. It was found that those who were allowed to gesticulate were no quicker in finding the correct word.

The fact is that people seem to gesticulate all the time, even when talking on the telephone or to a blind person.

This research will assist designers of interfaces who are intent on producing the most effective communication systems, whether for humans, machines or human-machine interactions. In the case of human-machine interfaces, great strides have been made in constructing interfaces that are able to recognise and interpret not only chosen communication languages but also facial expressions and gestures. Such interfaces require the designer to be versed in many disciplines, psychology being just one.

Apart from the current desire to produce human-computer interfaces that can mimic the human skills of recognising and understanding communications from computer users, there are many other important applications in automation and robotics. Robots constructed for purposes involving humans, for example, may need to recognise not only the language a human uses but also the way in which it is spoken and presented. In automation systems where human users and operators are participants, a similar understanding of the way we communicate is an essential requirement. The result is that sophisticated systems will have to be developed if humans and machines are to work together. This is evident in many instances: the sentence "Will you never give up", for example, illustrates how tone of voice, appropriate emphasis and even facial expression and accompanying gestures can imply many different meanings. The correct interpretation of such linguistic constructions may one day be performed better by machine than by human.
