This paper presents a one-shot gesture recognition approach that can serve as a highly efficient communication channel in human–robot collaboration systems.
The approach applies dynamic time warping (DTW) to align two gesture sequences in the temporal domain, using a novel frame-wise distance measure that matches local features in the spatial domain. Furthermore, a novel and robust bidirectional attention region extraction method is proposed to retain information in both the movement and hold phases of a gesture.
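The DTW alignment described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the frame-wise distance here is a plain Euclidean placeholder, whereas the paper's measure matches local spatial features.

```python
import numpy as np

def frame_distance(f1, f2):
    # Placeholder frame-wise distance (Euclidean between feature vectors);
    # the paper substitutes a local-feature matching measure here.
    return np.linalg.norm(f1 - f2)

def dtw_cost(seq_a, seq_b, dist=frame_distance):
    """Classic DTW: minimal cumulative alignment cost between two
    sequences of per-frame feature vectors of possibly different lengths."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(seq_a[i - 1], seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a frame in seq_a
                                 cost[i, j - 1],      # skip a frame in seq_b
                                 cost[i - 1, j - 1])  # match the two frames
    return cost[n, m]
```

In a one-shot setting, a query gesture is assigned the label of the single stored template with the smallest `dtw_cost`, so the quality of the frame-wise distance directly determines recognition accuracy.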
The proposed approach provides efficient one-shot gesture recognition without elaborately designed features. Experiments on a social robot (JiaJia) demonstrate that the approach can be flexibly integrated into a human–robot collaboration system.
According to the previous literature, no similar solution achieves efficient gesture recognition with a simple local feature descriptor while combining the advantages of local features with DTW.
Kuang, Y., Cheng, H., Zheng, Y., Cui, F. and Huang, R. (2020), "One-shot gesture recognition with attention-based DTW for human-robot collaboration", Assembly Automation, Vol. 40 No. 1, pp. 40-47. https://doi.org/10.1108/AA-11-2018-0228
Copyright © 2019, Emerald Publishing Limited