PP-GraspNet: 6-DoF grasp generation in clutter using a new grasp representation method

Cited by: 0
Authors
Li, Enbo [1 ]
Feng, Haibo [2 ]
Fu, Yili [3 ]
Affiliations
[1] Harbin Inst Technol, Harbin, Peoples R China
[2] Harbin Inst Technol, Sch Mechatron Engn, Harbin, Peoples R China
[3] Harbin Inst Technol, State Key Lab Robot & Syst, Harbin, Peoples R China
Keywords
Deep learning; Robotic grasping; Grasp detection; Grasp generation; Pick and place;
DOI
10.1108/IR-08-2022-0196
Chinese Library Classification
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
Purpose: Robotic grasping in densely cluttered scenes observed from a single view has not been fully solved, and grasping success rates remain low. This study aims to propose an end-to-end grasp generation method to address this problem.
Design/methodology/approach: A new grasp representation is proposed that uses the normal vector of the table surface to derive the grasp baseline vectors and maps each grasp to a pointed point (PP), so that no orthogonality constraints between vectors need to be imposed when a neural network predicts the rotation matrices of grasps.
Findings: Experimental results show that the proposed representation benefits the training of the neural network, and a model trained only on a synthetic data set still achieves high grasping success and completion rates in real-world tasks.
Originality/value: The main contribution of this paper is a new grasp representation that maps a 6-DoF grasp to a PP and an angle related to the tabletop normal vector, thereby eliminating the need for orthogonality constraints between vectors when grasps are predicted directly by neural networks. The proposed method can generate hundreds of grasps covering the whole object surface in about 0.3 s. The experimental results show a clear advantage over other methods.
Pages: 496 - 504
Page count: 9
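The record above contains only the abstract, but its central idea, parameterizing a grasp by a pointed point and an angle tied to the tabletop normal so that the rotation matrix is orthonormal by construction, can be illustrated with a short sketch. The following Python sketch is an assumption-laden illustration rather than the authors' formulation: the function name, the reference-axis choice, the Gram-Schmidt step and the example numbers are all hypothetical, and only the general principle of composing an orthonormal grasp frame from a table normal and a single angle follows the abstract.

import numpy as np

def grasp_rotation(table_normal, theta, approach):
    # Compose a right-handed orthonormal grasp frame from a table normal,
    # an in-plane angle theta (rad) and a rough approach direction.
    # Orthogonality holds by construction, so it never has to be learned.
    n = table_normal / np.linalg.norm(table_normal)
    ref = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(ref, n)) > 0.9:              # avoid degeneracy when ref is near n
        ref = np.array([0.0, 1.0, 0.0])
    b0 = ref - np.dot(ref, n) * n              # project the reference axis onto the table plane
    b0 = b0 / np.linalg.norm(b0)
    b1 = np.cross(n, b0)
    baseline = np.cos(theta) * b0 + np.sin(theta) * b1   # gripper closing (baseline) axis
    a = approach - np.dot(approach, baseline) * baseline # make approach orthogonal to baseline
    a = a / np.linalg.norm(a)
    return np.stack([baseline, np.cross(a, baseline), a], axis=1)

R = grasp_rotation(np.array([0.0, 0.0, 1.0]), np.pi / 6, np.array([0.0, 0.0, -1.0]))
assert np.allclose(R.T @ R, np.eye(3), atol=1e-8)        # orthonormal by construction

# A full 6-DoF grasp pose then pairs this rotation with a pointed point (hypothetical values).
pointed_point = np.array([0.05, -0.02, 0.10])
grasp_pose = np.eye(4)
grasp_pose[:3, :3] = R
grasp_pose[:3, 3] = pointed_point

Because the frame is assembled from projections and cross products of unit vectors, a network in this kind of scheme only has to regress a point, an angle and a rough direction; orthogonality never appears as an explicit constraint or loss term, which is the property the abstract highlights.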
Related papers (50 in total)
  • [1] 6-DOF GraspNet: Variational Grasp Generation for Object Manipulation
    Mousavian, Arsalan
    Eppner, Clemens
    Fox, Dieter
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 2901 - 2910
  • [2] Contact-GraspNet: Efficient 6-DoF Grasp Generation in Cluttered Scenes
    Sundermeyer, Martin
    Mousavian, Arsalan
    Triebel, Rudolph
    Fox, Dieter
PROCEEDINGS - IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, 2021, 2021-May : 3133 - 3139
  • [3] Contact-GraspNet: Efficient 6-DoF Grasp Generation in Cluttered Scenes
    Sundermeyer, Martin
    Mousavian, Arsalan
    Triebel, Rudolph
    Fox, Dieter
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 13438 - 13444
  • [4] Keypoint-GraspNet: Keypoint-based 6-DoF Grasp Generation from the Monocular RGB-D input
    Chen, Yiye
    Lin, Yunzhi
    Xu, Ruinian
    Vela, Patricio A.
    2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2023), 2023, : 7988 - 7995
  • [5] 6-DoF Contrastive Grasp Proposal Network
    Zhu, Xinghao
    Sun, Lingfeng
    Fan, Yongxiang
    Tomizuka, Masayoshi
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 6371 - 6377
  • [6] 6-DOF Grasp Detection for Unknown Objects
    Schaub, Henry
    Schoettl, Alfred
    2020 10TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTER INFORMATION TECHNOLOGIES (ACIT), 2020, : 400 - 403
  • [7] CoGrasp: 6-DoF Grasp Generation for Human-Robot Collaboration
    Keshari, Abhinav K.
    Ren, Hanwen
    Qureshi, Ahmed H.
    2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2023), 2023, : 9829 - 9836
  • [8] Sim-Grasp: Learning 6-DOF Grasp Policies for Cluttered Environments Using a Synthetic Benchmark
    Li, Juncheng
    Cappelleri, David J.
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (09): : 7645 - 7652
  • [9] MTGrasp: Multiscale 6-DoF Robotic Grasp Detection
    Yu, Sheng
    Zhai, Di-Hua
    Xia, Yuanqing
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2024, : 1 - 12
  • [10] 6-DoF grasp pose estimation based on instance reconstruction
    Han, Huiyan
    Wang, Wenjun
    Han, Xie
    Yang, Xiaowen
    INTELLIGENT SERVICE ROBOTICS, 2024, 17 : 251 - 264