TY  - JOUR
AB  - Purpose: This paper aims to present a one-shot gesture recognition approach that can serve as a highly efficient communication channel in human–robot collaboration systems. Design/methodology/approach: This paper applies dynamic time warping (DTW) to align two gesture sequences in the temporal domain using a novel frame-wise distance measure that matches local features in the spatial domain. Furthermore, a novel and robust bidirectional attention region extraction method is proposed to retain information in both the movement and hold phases of a gesture. Findings: The proposed approach provides efficient one-shot gesture recognition without elaborately designed features. Experiments on a social robot (JiaJia) demonstrate that the proposed approach can be used flexibly in a human–robot collaboration system. Originality/value: According to the previous literature, no similar solutions achieve efficient gesture recognition with a simple local feature descriptor while combining the advantages of local features with DTW.
VL  - 40
IS  - 1
SN  - 0144-5154
DO  - 10.1108/AA-11-2018-0228
UR  - https://doi.org/10.1108/AA-11-2018-0228
AU  - Kuang, Yiqun
AU  - Cheng, Hong
AU  - Zheng, Yali
AU  - Cui, Fang
AU  - Huang, Rui
PY  - 2019
Y1  - 2019/01/01
TI  - One-shot gesture recognition with attention-based DTW for human-robot collaboration
T2  - Assembly Automation
PB  - Emerald Publishing Limited
SP  - 40
EP  - 47
Y2  - 2024/03/28
ER  -