Combining Shape Completion and Grasp Prediction for Fast and Versatile Grasping with a Multi-Fingered Hand

Cited: 0
Authors
Humt, Matthias [1 ,2 ]
Winkelbauer, Dominik [1 ,2 ]
Hillenbrand, Ulrich [1 ]
Baeuml, Berthold [1 ,3 ]
Affiliations
[1] DLR Inst Robot & Mech, Wessling, Germany
[2] Techn Univ Munich TUM, Munich, Germany
[3] Deggendorf Inst Technol, Deggendorf, Germany
Keywords
DOI
10.1109/HUMANOIDS57100.2023.10375210
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Grasping objects with limited or no prior knowledge about them is a highly relevant skill in assistive robotics. Still, in this general setting it has remained an open problem, especially under partial observability and for versatile grasping with multi-fingered hands. We present a novel, fast, and high-fidelity deep learning pipeline consisting of a shape completion module that operates on a single depth image, followed by a grasp predictor that operates on the predicted object shape. The shape completion network is based on VQDIF and predicts spatial occupancy values at arbitrary query points. As the grasp predictor, we use our two-stage architecture that first generates hand poses using an auto-regressive model and then regresses finger joint configurations per pose. Critical factors turn out to be sufficient data realism and augmentation, as well as special attention to difficult cases during training. Experiments on a physical robot platform demonstrate successful grasping of a wide range of household objects based on a depth image from a single viewpoint. The whole pipeline is fast, taking only about 1 s in total: 0.7 s for completing the object's shape and 0.3 s for generating 1000 grasps.
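To make the two-stage structure concrete, the following is a minimal Python sketch of the data flow described in the abstract (partial point cloud from one depth image → occupancy-queried shape completion → hand-pose sampling → per-pose finger joint regression). All class names, interfaces, and the dummy occupancy and grasp logic are illustrative placeholders under stated assumptions; they are not the authors' released implementation, the actual VQDIF network, or the real grasp predictor.

```python
"""Illustrative sketch of a shape-completion + grasp-prediction pipeline.
All modules below are hypothetical stand-ins, not the paper's code."""
import numpy as np


class ShapeCompleter:
    """Stand-in for a VQDIF-style shape completion network: it is fit on a
    partial point cloud and can then be queried for occupancy at arbitrary
    3D points. The dummy logic here only remembers a bounding box."""

    def fit(self, partial_points: np.ndarray) -> None:
        # A real network would encode the partial cloud into a latent
        # representation; we just store an axis-aligned bounding box.
        self.lo = partial_points.min(axis=0)
        self.hi = partial_points.max(axis=0)

    def occupancy(self, query_points: np.ndarray) -> np.ndarray:
        # Dummy occupancy: 1.0 inside the bounding box, 0.0 outside.
        inside = np.all((query_points >= self.lo) & (query_points <= self.hi), axis=1)
        return inside.astype(np.float32)


class GraspPredictor:
    """Stand-in for a two-stage grasp predictor: stage 1 samples hand poses,
    stage 2 regresses one finger joint configuration per pose."""

    def sample_poses(self, completed_points: np.ndarray, n: int) -> np.ndarray:
        # Stage 1 (placeholder): random poses near the object centroid,
        # encoded as (x, y, z, roll, pitch, yaw).
        center = completed_points.mean(axis=0)
        positions = np.tile(center, (n, 1)) + 0.05 * np.random.randn(n, 3)
        orientations = np.random.uniform(-np.pi, np.pi, (n, 3))
        return np.hstack([positions, orientations])

    def regress_joints(self, poses: np.ndarray) -> np.ndarray:
        # Stage 2 (placeholder): joint angles per pose, assuming a
        # hypothetical 12-DoF multi-fingered hand.
        return np.zeros((poses.shape[0], 12), dtype=np.float32)


def run_pipeline(depth_points: np.ndarray, n_grasps: int = 1000):
    completer = ShapeCompleter()
    completer.fit(depth_points)

    # Query occupancy on a coarse grid to obtain a completed point set.
    grid = np.stack(np.meshgrid(*[np.linspace(-0.15, 0.15, 24)] * 3), -1).reshape(-1, 3)
    completed = grid[completer.occupancy(grid) > 0.5]

    predictor = GraspPredictor()
    poses = predictor.sample_poses(completed, n_grasps)
    joints = predictor.regress_joints(poses)
    return poses, joints


if __name__ == "__main__":
    cloud = np.random.rand(2048, 3) * 0.1   # fake partial point cloud
    poses, joints = run_pipeline(cloud)
    print(poses.shape, joints.shape)        # (1000, 6) (1000, 12)
```

The sketch only mirrors the interfaces implied by the abstract (occupancy queries at arbitrary points, 1000 grasps per object); the real networks, timings, and hand kinematics are described in the paper itself.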
Pages: 8