
PP-GraspNet: 6-DoF grasp generation in clutter using a new grasp representation method

Enbo Li (Harbin Institute of Technology, Harbin, China)
Haibo Feng (School of Mechatronics Engineering, Harbin Institute of Technology, Harbin, China)
Yili Fu (State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China)

Industrial Robot

ISSN: 0143-991X

Article publication date: 2 January 2023


Abstract

Purpose

Robotic grasping in densely cluttered scenes from a single view has not been fully solved, and grasp success rates remain low. This study aims to propose an end-to-end grasp generation method to address this problem.

Design/methodology/approach

A new grasp representation method is proposed that uses the normal vector of the table surface to derive the grasp baseline vectors and maps grasps to pointed points (PP). As a result, no orthogonality constraints between vectors need to be imposed when a neural network predicts the rotation matrices of grasps.
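To see why such a representation sidesteps orthogonality constraints, consider the following sketch. It is not the authors' implementation (the paper's exact parameterization is not reproduced here); it only illustrates, under assumed conventions, how a rotation matrix built from the known tabletop normal and a single predicted angle is orthonormal by construction, so the network only has to regress a point and a scalar angle rather than nine constrained matrix entries.

```python
import numpy as np

def rotation_from_normal_and_angle(normal, theta):
    """Build an orthonormal grasp rotation matrix from the tabletop
    normal and one predicted in-plane angle theta (radians).

    Hypothetical sketch: the resulting matrix is orthonormal by
    construction, so no orthogonality constraint is needed on the
    network's output.
    """
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)                        # unit approach axis

    # Pick any reference vector not parallel to n to span the table plane.
    ref = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(ref, n)) > 0.9:
        ref = np.array([0.0, 1.0, 0.0])
    u = np.cross(n, ref)
    u /= np.linalg.norm(u)                        # first in-plane axis
    v = np.cross(n, u)                            # second in-plane axis

    # The predicted angle selects the grasp baseline direction in-plane.
    b = np.cos(theta) * u + np.sin(theta) * v     # grasp baseline vector

    # Columns [b, n x b, n] form a right-handed orthonormal frame.
    return np.column_stack([b, np.cross(n, b), n])
```

Because `b` lies in the plane orthogonal to `n` and the third column is their cross product, the output satisfies R Rᵀ = I and det R = 1 for any value of theta, with no penalty term or projection step required during training.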

Findings

Experimental results show that the proposed representation benefits neural network training, and a model trained on a synthetic dataset also achieves high grasp success and completion rates in real-world tasks.

Originality/value

The main contribution of this paper is a new grasp representation method that maps each 6-DoF grasp to a PP and an angle defined relative to the tabletop normal vector, thereby eliminating the need for orthogonality constraints between vectors when predicting grasps directly with neural networks. The proposed method can generate hundreds of grasps covering the whole object surface in about 0.3 s. The experimental results show that the proposed method clearly outperforms comparable methods.

Citation

Li, E., Feng, H. and Fu, Y. (2023), "PP-GraspNet: 6-DoF grasp generation in clutter using a new grasp representation method", Industrial Robot, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/IR-08-2022-0196

Publisher

Emerald Publishing Limited

Copyright © 2022, Emerald Publishing Limited